Dr Reny Baykova is a lecturer at the School of Psychology, University of Sussex, interested in improving open science and reproducibility practices in quantitative research.

Quantitative research findings are relevant only if they are computationally reproducible: rerunning the same analysis on the same dataset should yield the same numerical results, figures, and inferential conclusions. Studies across research fields have found that only about a quarter to a third of published research papers are computationally reproducible (Baker, 2016; Stodden et al., 2018; Crüwell et al., 2023). Such statistics erode confidence in science and call for new processes that safeguard research integrity.
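To make the idea concrete: one common way an analysis fails to reproduce is unseeded randomness, where a bootstrap or simulation prints different numbers on every run. Below is a minimal, hypothetical Python sketch (not drawn from any of the projects discussed here) showing how fixing the random seed and recording package versions keeps a rerun's output identical.

```python
import sys
import numpy as np

# Fixed seed: reruns of this script produce identical draws and identical output.
rng = np.random.default_rng(seed=2023)

# Stand-in for a real shared dataset.
data = rng.normal(loc=100, scale=15, size=50)

# Bootstrap 95% CI for the mean; without the fixed seed above,
# these bounds would differ on every rerun.
boot_means = [rng.choice(data, size=data.size, replace=True).mean()
              for _ in range(10_000)]
lo, hi = np.percentile(boot_means, [2.5, 97.5])

# Recording the environment alongside the results helps others rerun it exactly.
print(f"Python {sys.version_info.major}.{sys.version_info.minor}, "
      f"NumPy {np.__version__}")
print(f"95% bootstrap CI for the mean: [{lo:.2f}, {hi:.2f}]")
```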

Since March 2023, the School of Psychology at the University of Sussex has offered researchers the opportunity to have an independent statistician (me) check and certify the computational reproducibility of their papers before they are submitted for publication. The project was originally funded through the Research England Enhancing Research Culture Fund and supervised by Prof Zoltan Dienes. Following its success, the statistician post is now core funded by the University.

The scheme is open to researchers affiliated with the School of Psychology at the University of Sussex and focuses on quantitative research. Participation is voluntary. My key aim is to help researchers gain the skills to conduct reproducible research, and to build demand for more rigorous reproducibility practices among researchers themselves.

When researchers submit a paper for a reproducibility check, I examine the shared materials, point out any results that do not reproduce, and suggest ways to improve the reproducibility of the study. After the researchers have updated the materials and manuscript, I complete and upload a final reproducibility report online. Researchers can then highlight that their paper has been certified as computationally reproducible and reference the reproducibility report in their manuscript, setting their work apart in terms of transparency and rigour in the eyes of editors, reviewers, and readers; for an example, see Cabbai et al. (2023).
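At its core, a check like this amounts to recomputing each reported value from the shared data and code and comparing it, within rounding error, to the number printed in the manuscript. The Python sketch below is a hypothetical illustration of that comparison step; the variable names, the "reported" values, and the tolerance are invented for the example rather than taken from the actual workflow.

```python
import math

# Hypothetical values as "reported" in a manuscript, keyed by label.
# The sd below is deliberately off, to show what a flagged discrepancy looks like.
reported = {"mean_rt_ms": 512.34, "sd_rt_ms": 58.90}

def recompute(data):
    """Recompute summary statistics directly from the raw data."""
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    return {"mean_rt_ms": mean, "sd_rt_ms": sd}

def check(reported, recomputed, tol=0.005):
    """Flag any value that differs from the manuscript by more than rounding."""
    for key, new in recomputed.items():
        old = reported[key]
        ok = abs(old - new) <= tol * max(1.0, abs(old))
        print(f"{key}: reported={old:.2f} recomputed={new:.2f} -> "
              f"{'OK' if ok else 'DOES NOT REPRODUCE'}")

# Stand-in for loading the shared dataset.
data = [430.0, 512.0, 601.5, 498.2, 520.0]
check(reported, recompute(data))
```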

Psychology researchers at Sussex have shown keen interest in improving the reproducibility of their work. So far, I have worked on nine projects spanning a variety of research topics, including imagery, phenomenological control, implicit learning, and the carbon footprint of imaging pre-processing pipelines. I am not an expert in any of these topics, and in some cases I was unfamiliar with some of the statistical tests or software packages used in the analyses. This doesn’t feel like a limitation to me, because I approach the process as a collaboration: if there is something I can’t wrap my head around, the researchers whose work I am evaluating are there to help me (as much as I am there to help them).

The approach to reproducibility we take at Sussex is just one of many possibilities. For example, a team at the University of York explored offering reproducibility “as a service”: researchers would send their papers to the team, who would convert them into a reproducible format (Baker et al., 2023). Journals themselves could also make reproducibility a criterion for publication, as Meta-Psychology and, very recently, Psychological Science have done, which offers researchers an additional incentive to consider the reproducibility of their work. Ultimately, our long-term aim is that as more and more researchers recognize the benefits of submitting their studies to a reproducibility check, the practice will become a standard part of the research process and help strengthen confidence in the integrity of science.

Author:

Dr Reny Baykova
Lecturer in Psychological Methods (Reproducibility)
University of Sussex

References:
Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533, 452-454.

Baker, D., Berg, M., Hansford, K., Quinn, B., Segala, F. G., & English, E. (2023). ReproduceMe: Lessons from a pilot project on computational reproducibility.

Cabbai, G., Dance, C., Dienes, Z., Simner, J., Forster, S., & Lush, P. (2023). Investigating relationships between trait visual imagery and phenomenological control: The role of context effects.

Crüwell, S., Apthorp, D., Baker, B. J., Colling, L., Elson, M., Geiger, S. J., … & Brown, N. J. (2023). What’s in a badge? A computational reproducibility investigation of the open data badge policy in one issue of Psychological Science. Psychological Science, 34(4), 512-522.

Stodden, V., Seiler, J., & Ma, Z. (2018). An empirical analysis of journal policy effectiveness for computational reproducibility. Proceedings of the National Academy of Sciences, 115(11), 2584-2589.

Image by Gerd Altmann from Pixabay