Authors: Leonie Dudda, Eva Kormann, Magdalena Kozula, Nicholas J. DeVito, Thomas Klebe, Ayu P.M. Dewi, René Spijker, Inge Stegeman, Veerle Van den Eynden, Tony Ross-Hellauer, Mariska M.G. Leeflang
Abstract (obtained from CrossRef):
Various interventions – especially those related to open science – have been proposed to improve the reproducibility and replicability of scientific research. To assess whether and which interventions have been formally tested for their effectiveness in improving reproducibility and replicability, we conducted a scoping review of the literature on interventions to improve reproducibility. We systematically searched MEDLINE, Embase, Web of Science, PsycINFO, Scopus, and ERIC on August 18, 2023. Grey literature was requested from experts in the fields of reproducibility and open science. Any study empirically evaluating the effectiveness of interventions aimed at improving the reproducibility or replicability of scientific methods and findings was included. An intervention could be any action taken by either individual researchers or scientific institutions (e.g., research institutes, publishers, and funders). We summarized the retrieved evidence narratively and in an evidence gap map. Of the 104 distinct studies we included, 15 directly measured the effect of an intervention on reproducibility or replicability, while the others addressed a proxy outcome that might be expected to increase reproducibility or replicability, such as data sharing, methods transparency, or preregistration. Thirty research questions within the included studies were non-comparative and 27 were comparative but cross-sectional, precluding any causal inference. A possible limitation of our review is the search and selection strategy, which was carried out by a large team of researchers from different disciplines and with different levels of expertise. Although the included studies investigated a range of interventions and addressed various outcomes, our findings indicate that the evidence base for interventions to improve the reproducibility of research remains remarkably limited in many respects.
Certificate identifier: 2024-004
Codechecker name: Sam Langton
Time of codecheck: 2024-08-01 10:00:00
Repository: https://github.com/codecheckers/scope
Codecheck report: https://doi.org/10.5281/zenodo.13364677
Summary:
Codecheck performed on two .qmd (Quarto) files containing R code, obtained from the public GitHub repository accompanying a scoping review preprint.
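For readers who want to re-run the checked computations themselves, the sketch below shows one possible way to do so in R; it is a minimal sketch, not the codechecker's exact procedure, and it assumes git, R, and the Quarto CLI are installed. The file name analysis.qmd is a hypothetical placeholder, since the certificate does not name the two .qmd files.

    # A minimal sketch, assuming git, R, and the Quarto CLI are available.
    # "analysis.qmd" is a hypothetical placeholder file name.
    system("git clone https://github.com/codecheckers/scope.git")
    setwd("scope")
    install.packages("quarto")              # R wrapper around the Quarto CLI
    quarto::quarto_render("analysis.qmd")   # re-executes the embedded R code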
Published under CC BY-SA 4.0
CODECHECK is a process for independent execution of computations underlying scholarly research articles.