CODECHECK tackles one of the main challenges of computational research by supporting codecheckers with a workflow, guidelines and tools to evaluate the computer programs underlying scientific papers. The independent, time-stamped runs conducted by codecheckers result in a “certificate of executable computation” and increase the availability, discovery and reproducibility of crucial artefacts for the computational sciences. See the CODECHECK paper for a full description of problems, solutions, and goals, and take a look at the GitHub organisation for examples of codechecks and the CODECHECK infrastructure and tools.
CODECHECK is based on five principles which are described in detail in the project description and the paper.
- Codecheckers record but don’t investigate or fix.
- Communication between humans is key.
- Credit is given to codecheckers.
- Workflows must be auditable.
- Open by default and transitional by disposition.
If you want to get involved as a codechecker in the community, or if you want to apply the CODECHECK principles in your journal or conference, please take a look at the Get Involved page.
To stay in touch with the project, follow us on social media at https://fediscience.org/@codecheck.
News
2024-04 | CODECHECK wins Team Credibility Prize 🏆
The British Neuroscience Association (BNA) have awarded the 2024 Team Credibility Prize to CODECHECK for our work on reproducibility. Further details.
2024-03 | Dutch Research Council (NWO) supports CODECHECK 🇳🇱
We are happy to announce a one-year grant from the Dutch Research Council (NWO) to support Codecheck activities in the Netherlands. The project is led by Frank Ostermann (University of Twente) with colleagues from Delft and Groningen. Further details.
2023-09 | CODECHECK and TU Delft Hackathon 💻
TU Delft and CODECHECK ran a hackathon on 18th September 2023.
👉 Read the report in the TU Delft Open Publishing Blog: https://openpublishing.tudl.tudelft.nl/tu-delft-codecheck-hackathon-some-perspectives/ 👈
📓 The shared notes are available on the following pad: https://hackmd.io/77AIvx0qRRWGvo1D2k_t8A
The hybrid event was jointly organised by TU Delft Open Science, TU Delft OPEN Publishing, the CODECHECK team, and friends. The workshop featured live codechecking of workflows by researchers from TU Delft and was suitable for hands-on participation, observing, and discussing. The goal was to explore building a local CODECHECK community whose members may check each other’s code, e.g., before a preprint is published or a manuscript is submitted.
2022-11 | Introduction to CODECHECK Video 📺
Follow us on YouTube: https://www.youtube.com/@cdchck
2022-11 | Panel participation in “How to build, grow, and sustain reproducibility or open science initiatives”
CODECHECK team member Daniel Nüst had the honour to participate in a panel discussion on November 23rd 2022. The German Reproducibility Network (GRN) organised the two-day event “How to build, grow, and sustain reproducibility or open science initiatives: A virtual brainstorming event”. Learn more about the event and this asynchronous, unconference-style meeting format on the website. The event was accompanied by a live and interactive panel discussion on the same topic. The panelists were representatives of the German Reproducibility Network (GRN) and actively involved in initiatives that focus on open science, open code, guidelines and research practices, as well as quality management, among other things.
Daniel thanks the other panelists for the interesting conversation: Carsten Kettner, Céline Heinl, Clarissa F. D. Carneiro, and Maximilian Frank. We also thank the organization team from GRN steering group (Antonia Schrader, Tina Lonsdorf, Gordon Feld) and moderator Tracey Weissgerber from BIH QUEST Center @ Charité Berlin.
2022-09 | CODECHECK Hackathon @ OpenGeoHub Summer School 🏫
Markus Konkol (https://github.com/MarkusKonk, https://twitter.com/MarkusKonkol), research software engineer at 52°North and codechecker, organised a CODECHECK hackathon as part of the OpenGeoHub summer school. He reports on his experiences in a blog post on the 52°North blog at https://blog.52north.org/2022/09/16/opengeohub-summer-school-facilitating-reproducibility-using-codecheck/. It’s great to see that codechecking is a suitable evening pastime and that participants took some nice learnings away from the experience of codechecking. Check out the quotes in the blog post!
Thanks, Markus, for spreading the word about CODECHECK and for introducing more developers and software-developing researchers to the need for their expertise during peer review.
2022-06 | AGILE Reproducibility Review 2022
The collaboration between CODECHECK and the AGILE conference series continues! In 2022, the AGILE conference’s reproducibility committee conducted 16 reproductions of conference full papers. Take a look at the slides presented on the final conference day here. The reproducibility review took place after the scientific review. The reproducibility reports are published in the AGILE conference’s OSF project at https://osf.io/r5w79/ and listed in the CODECHECK register.
Learn more about the Reproducible AGILE initiative at https://reproducible-agile.github.io/.
2022-04 | CODECHECK talks 💬
The CODECHECK team is grateful for the continued interest from the research community in the topic of evaluating code and workflows as part of scholarly communication and peer review.
Stephen gave a talk at the 2022 Toronto Workshop on Reproducibility organised by Rohan Alexander. You can find the slides online and also watch the recording on YouTube - well worth a look because of the great Q&A at the end!
Stephen presented CODECHECK: An Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility (slides) in May 2021 at the Reproducibility Tea Southampton.
Daniel gave the keynote at the Collaborations Workshop 2022 (CW22) on April 4, 2022, organised by the Software Sustainability Institute (SSI), UK entitled Code execution during peer review (slides, PDF, video) and presented CODECHECK as well as the partnering initiative Reproducible AGILE.
2021-07 | F1000Research paper on CODECHECK published after reviews 📃
The F1000Research preprint presented below has passed peer review and is now published as version 2. We are grateful to the two reviewers, Nicolas P. Rougier and Sarah Gibson, who gave helpful feedback and asked good questions that helped improve the paper.
Nüst D and Eglen SJ. CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility [version 2; peer review: 2 approved]. F1000Research 2021, 10:253 (https://doi.org/10.12688/f1000research.51738.2)
The F1000 blog also features the article with a little Q&A: https://blog.f1000.com/2021/09/27/codecheck. Thanks Jessica for making that happen!
2021-04 | CODECHECK @ ITC
CODECHECK supporter Markus Konkol has built a CODECHECK process for all researchers at the University of Twente’s Faculty of Geo-Information Science and Earth Observation (ITC). He offers his expertise to codecheck manuscripts and their underlying source code and data before submission or preprint publication, even if the material is not yet publicly shared. His reports then go public on Zenodo when the paper comes out, just like a regular CODECHECK, and can support the article’s claims. If timed right, authors can even link to the certificate before submission. This is a great service for ITC researchers and their reviewers and readers!
Learn more at https://www.itc.nl/research/open-science/codecheck/ and see an example at https://doi.org/10.5281/zenodo.5106408.
2021-03 | F1000Research preprint
A preprint about CODECHECK was published at F1000Research and is now subject to open peer review. It presents the codechecking workflow, describes the roles and stakeholders involved, presents the 25 codechecks conducted to date, and details the experiences and tools that underpin the CODECHECK initiative. We welcome your comments!
Nüst D and Eglen SJ. CODECHECK: an Open Science initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility [version 1; peer review: awaiting peer review]. F1000Research 2021, 10:253 (https://doi.org/10.12688/f1000research.51738.1)
2020-06 | Nature News article
A Nature News article by Dalmeet Singh Chawla discussed the recent CODECHECK #2020-010 of a simulation study, including some quotes by CODECHECK Co-PI Stephen J. Eglen and fellow Open Science and Open Software experts Neil Chue Hong (Software Sustainability Institute, UK) and Konrad Hinsen (CNRS, France).
Singh Chawla, D. (2020). Critiqued coronavirus simulation gets thumbs up from code-checking efforts. Nature. https://doi.org/10.1038/d41586-020-01685-y
2019-11 | MUNIN conference presentation
Stephen Eglen presented CODECHECK at The 14th Munin Conference on Scholarly Publishing 2019 with the submission “CODECHECK: An open-science initiative to facilitate sharing of computer programs and results presented in scientific publications”, see https://doi.org/10.7557/5.4910.
Take a look at the poster and the slides.
Citation and sharing
To cite CODECHECK in scientific publications, please use the following citation/reference:
Eglen, S., & Nüst, D. (2019). CODECHECK: An open-science initiative to facilitate the sharing of computer programs and results presented in scientific publications. Septentrio Conference Series, (1). https://doi.org/10.7557/5.4910
To get or give a quick overview of the project, feel free to use or extend the existing slide decks.