
Conducting transparent, accessible research as graduate students: Myths about Open Science

Yolanda Yang, M.A., UNC Chapel Hill

Open science in psychology refers to transparent, collaborative practices that prioritize accessibility and reproducibility throughout the research process. In the past decade, pre-registration has grown from a little-known initiative into thousands of registrations per year (Simmons et al., 2021). Open science has gained popularity in part because researchers who adopt these practices often gain more citations, form more collaborations, and expand their professional networks (Colavizza et al., 2020). This transition brings great benefits but also steep learning curves and unexpected challenges for those who carry out the research, usually early career researchers (ECRs). In this newsletter, we start a discussion from the perspective of ECRs around some “common myths” about open science, such as the risk of data being “scooped,” the possibility of hurting one’s chances of publication, and the ethical concerns of data-sharing.

Myth 1: “A pre-registration will bind me from doing exploratory analysis or changing my study.”

Pre-registration has changed how science is conducted. It effectively discourages hypothesizing after the results are known (HARKing; Kerr, 1998) by requiring authors to declare their hypotheses and analysis plan before data collection. Pre-registration prevents authors from fishing for results and encourages researchers (both authors and readers) to clearly and transparently distinguish between confirmatory hypothesis testing and post hoc exploratory analyses (Humphreys et al., 2017).

Although “pre-registration” can sound like a binding contract, thankfully, it is not. Depending on your stage in the research cycle, when you need to make changes you can either create a new registration or record them as time-stamped amendments. For instance, if you have already started collecting data, changing your research questions in the pre-registration would not be appropriate. Instead, you can adapt your protocol to your needs and clearly report these deviations when you write up the study later.

Myth 2: “It’s unethical to share the data I collected from human subjects.”

A warranted concern about data-sharing is the privacy of the human research subjects who provide the data. As open science practices became more common, many of us wondered whether it is ethical to share data collected from human participants (Van Horn & Gazzaniga, 2013). The first consideration should be whether you stated your intention to share data during the consent process. As long as sharing was indicated in the consent forms participants agreed to, you may certainly share the data. If you did not mention data-sharing in your consent form, however, you should contact your participants to ensure they are comfortable with their data being shared. Researchers have yet to reach a full consensus on regulatory mechanisms for data-sharing in the context of open access (Choudhury et al., 2014). For now, we recommend thinking ahead during the project design phase and including appropriate clauses in the consent forms and ethics protocols that describe both your intent and your plan for ethically sharing the raw data. For instance, you may consider using broad or open-ended consent, which covers an unspecified range of future research uses and informs participants that the nature and intent of those future uses are unknown (Grady et al., 2015).

Myth 3: “What if somebody else scoops my data or steals my ideas?”

Many researchers who publicly share datasets believe such openness can facilitate self-regulation of research integrity. However, worries about data being “scooped” remain the most commonly raised concern about data-sharing (Laine, 2017).

It’s important to note that preprints are posted with a date/time stamp. Because preprints receive unique DOIs and time stamps when posted correctly, they can help prevent scooping: you can establish the precedence of your work with a clear public record and thereby demonstrate your intellectual ownership (Ettinger et al., 2022). Ensuring your preprint is posted with a clear, earliest temporal record is therefore critical.

Some also worry that other researchers might write up a secondary analysis of shared data without acknowledging the hard work of collecting it. Even though we recommend pre-registering all planned analyses from the get-go and indicating these plans in the data documentation, we understand that ideas evolve! We have a few suggestions to mitigate this problem. First, you may delay sharing the full dataset until you have answered all planned questions, sharing only the variables you used for the primary outcomes you have analyzed. Second, it is common to share a codebook, restrict access to the full dataset, and provide a procedure for granting access (e.g., asking others to reach out to you directly); an advantage of this approach is that it encourages interested parties to contact you, promoting transparency and collaboration. Finally, some researchers apply a CC BY license to the dataset, so that anyone using the data must attribute it to the original researcher (e.g., by citing the associated paper or the DOI of the OSF project; Moshontz et al., 2021).

Myth 4: “If my manuscript is already out in the world as a preprint, journals won’t want to publish it.”

The truth is, it depends: different journals have different policies. We recommend looking up the journal’s guidelines, which should be easily accessible; they will usually indicate whether posting a preprint before submission is allowed (it usually is). You may also reach out to the editor if you cannot find this information online.

Similarly, journals may have different guidelines about whether it is acceptable to cite preprints. Keep in mind that preprints should be viewed as works in progress posted to receive open feedback; they have not gone through the formal, accountable peer review process (Añazco et al., 2021). Regardless, you will have to make your own judgment call about quality.

Myth 5: “Pre-registering is so much work, it’s taking me forever!”

Yes, pre-registration is extra work and takes time, but weigh the time spent upfront against the time saved later! A pre-registration is an organized version of your study design. It may take time for your advisor and collaborators to read through it, but it ultimately saves time because everyone gets to discuss the intricacies from the very beginning, making it easier to troubleshoot if issues arise during data collection. In summary, pre-registration requires a time commitment, but it can be a solid return on investment.

Conclusion

In embracing the principles of open science, our exploration of common myths surrounding pre-registration, data-sharing, and preprints reveals the evolving landscape of psychological research. While challenges exist, particularly for early career researchers, the shift towards transparency and collaboration holds great promise. We therefore encourage fellow students to foster a culture that values both innovation and ethical considerations, ensuring the advancement of our collective understanding in the field.

References

Añazco, D., Nicolalde, B., Espinosa, I., Camacho, J., Mushtaq, M., Gimenez, J., & Teran, E. (2021). Publication rate and citation counts for preprints released during the COVID-19 pandemic: The good, the bad and the ugly. PeerJ, 9, e10927. https://doi.org/10.7717/peerj.10927

Colavizza, G., Hrynaszkiewicz, I., Staden, I., Whitaker, K., & McGillivray, B. (2020). The citation advantage of linking publications to research data. PLOS ONE, 15(4), e0230416. https://doi.org/10.1371/journal.pone.0230416 

Ettinger, C. L., Sadanandappa, M. K., Görgülü, K., Coghlan, K. L., Hallenbeck, K. K., & Puebla, I. (2022). A guide to preprinting for early-career researchers. Biology Open, 11(7), bio059310.

Grady, C., Eckstein, L., Berkman, B., Brock, D., Cook-Deegan, R., Fullerton, S. M., Greely, H., Hansson, M. G., Hull, S., Kim, S., Lo, B., Pentz, R., Rodriguez, L., Weil, C., Wilfond, B. S., & Wendler, D. (2015). Broad Consent for Research With Biological Samples: Workshop Conclusions. The American Journal of Bioethics, 15(9), 34–42. https://doi.org/10.1080/15265161.2015.1062162 

Humphreys, M., Sanchez de la Sierra, R., & Van der Windt, P. (2017). Fishing, commitment, and communication: A proposal for comprehensive nonbinding research registration. Political Analysis, 27(1). https://www.cambridge.org/core/journals/political-analysis/article/fishing-commitment-and-communication-a-proposal-for-comprehensive-nonbinding-research-registration/BD935F7843BF07F338774DAB66E74E3C

Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196–217.

Laine, H. (2017). Afraid of Scooping – Case Study on Researcher Strategies against Fear of Scooping in the Context of Open Science. Data Science Journal, 16, 29–29. https://doi.org/10.5334/dsj-2017-029 

Moshontz, H., Binion, G., Walton, H., Brown, B. T., & Syed, M. (2021). A guide to posting and managing preprints. Advances in Methods and Practices in Psychological Science, 4(2), 25152459211019948.

Simmons, J., Nelson, L., & Simonsohn, U. (2021). Pre-registration: Why and how. Journal of Consumer Psychology, 31(1), 151–162. https://doi.org/10.1002/jcpy.1208

Van Horn, J. D., & Gazzaniga, M. S. (2013). Why share data? Lessons learned from the fMRIDC. NeuroImage, 82, 677–682. https://doi.org/10.1016/j.neuroimage.2012.11.010