Johan Jansson (email@example.com), KTH
Ridgway Scott, University of Chicago
Rebecca Durst, University of Pittsburgh
Problem: Publications are not reproducible
KI President Ottersen describes the problem concisely:
We are in the midst of what some have called a research reproducibility crisis. While scientific discovery and complexity are developing at an unprecedented speed, less than 50% of scientific research studies can be reliably replicated. Left unchecked, this troubling fact may threaten our ability to generate sound, evidence-based knowledge that meets society’s needs. It is time to look beyond the traditional measures of quality and re-examine the very concept of quality itself.
NASA Transform to Open Science (TOPS):
"Lowering barriers to entry for historically excluded communities"
"Increasing opportunities for collaboration while promoting scientific innovation, transparency, and reproducibility."
Illustration: in school one may not simply give the answer to a problem; the teacher insists: "show your work!"
In research it is often not possible to see or reproduce how the answer was derived or constructed. Why is this so?
"Reproducibility of scientific results in the EU", Directorate-General for Research and Innovation (European Commission), 2020
Excerpts from the EU report:
Second, there is a perceived deliberateness, or at least carelessness, in scientific production due to competitive pressures. A growing proportion of scientists are perceived as – willingly or unwittingly – bending some of the basic premises of the scientific method to produce ‘fast science’ or even ‘make believe science’ – facts and theories that are declared true but are dubious or even false. This rests more on the structure of incentives of science-making, embedded in culture and practice, than on deliberate attempts to ‘cheat’. The need for results to be reproducible, and the tangible steps needed to make them so, may help results be trustworthy and keep scientists honest.
Sharing of data, protocols, materials, software, codes, and other tools underlying publications; Transparency of analysis and modelling;
17. Fund the testing and R&I development of automatic systems of compliance for reproducibility before publication;
24. Ensure that Horizon Europe provisions encourage and support reproducibility (see list of possible actions, above);
25. Employ and police guidelines early in the grant application phase to anchor journal practices;
Reproducibility in the digital age
Lorena Barba, a professor at George Washington University in Washington, D.C., says in Physics World:
What we are calling for is changing those norms to give importance to the full set of digital objects that are part of a scientific study and acknowledging that the scientific paper is insufficient today in its methods section to include all of the information needed for another researcher to confirm the results or build from those results.
The technology to achieve this exists; technical solutions have been available since the 1980s and 1990s.
In the US, the National Academies of Sciences, Engineering, and Medicine have issued guidelines requiring publication of these "digital objects" (Open Source). Professor Barba has been a leader in these developments.
Zenodo (https://en.wikipedia.org/wiki/Zenodo) has become a standard resource in science for publishing “data sets”. For each submission, a persistent digital object identifier (DOI) is minted, which makes the stored items easily citeable. Zenodo is based on the Open Source project Invenio.
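As a hedged sketch of how a DOI-minting deposition is prepared, the following builds the metadata body accepted by Zenodo's REST deposition API (field names follow Zenodo's published API documentation; the title, creator name, and token are placeholders, not real submissions):

```python
import json

def make_deposition_metadata(title, description, creators, upload_type="software"):
    """Build the JSON metadata body for a new Zenodo deposition.

    upload_type may be e.g. "software" for source code or "dataset" for data.
    """
    return {
        "metadata": {
            "title": title,
            "description": description,
            "upload_type": upload_type,
            "creators": [{"name": name} for name in creators],
        }
    }

# Placeholder example content, not a real record.
payload = make_deposition_metadata(
    title="Digital Math model: source code and generated data",
    description="Source code and outputs for a reproducible simulation.",
    creators=["Jansson, Johan"],
)

# The actual upload (requiring a personal access token) would then be sent as:
#   requests.post("https://zenodo.org/api/deposit/depositions",
#                 params={"access_token": TOKEN}, json=payload)
# Zenodo mints a persistent DOI for the deposition, returned in the response.
print(json.dumps(payload, indent=2))
```

The same metadata structure serves both source code and generated data, which is what allows a single workflow to make all "digital objects" of a study citeable.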
KTH Library is active in developing an Invenio/Zenodo-framework for supporting reproducibility.
Reproducibility in scientific modeling
With Invenio/Zenodo, DOIs can be acquired for both the source code and generated data for a scientific model, allowing this material to be easily cited. The material may then be shared while avoiding questions about ownership of the intellectual property.
However, merely publishing a “data set”, or even an archive of the source code, does not make scientific results reproducible. It may still take an enormous effort to actually re-run the computations (e.g. unfamiliarity with the required software, or lack of access to computing resources), and one does not know, before investing that effort, how reproducible the results are (e.g. because of limited or missing methodology sections).
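One minimal, language-agnostic step toward verifiable re-runs is to publish checksums of the generated data alongside the archive, so a recomputation can be compared bit-for-bit against the published results. A sketch in Python (the output strings are hypothetical placeholders):

```python
import hashlib

def sha256_of_bytes(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw bytes, e.g. a result file's contents."""
    return hashlib.sha256(data).hexdigest()

# Published alongside the archive: the digest of the original run's output.
published_digest = sha256_of_bytes(b"u_max = 0.0734\n")

# After re-running the computation, compare the new output bit-for-bit.
rerun_output = b"u_max = 0.0734\n"
reproduced = sha256_of_bytes(rerun_output) == published_digest
print("reproduced:", reproduced)
```

This only certifies bitwise identity of one run against another; it does not by itself lower the barrier of actually performing the run, which is the gap the framework below addresses.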
Reproducibility requires transparency. A lack of transparency in experiments creates a barrier to inclusivity and accessibility in science.
We present the Digital Math framework as the foundation for modern science based on constructive digital mathematical computation.
Ubiquitous Computing: "Copy a lab"
Easily accessible: “copy our lab at zero cost; rerun an experiment in seconds with one click in a web browser”
Advantage of digital simulation over physical experiments, with interactive, editable computation in Jupyter/FEniCS:
FEniCS is an open-source FEM framework for the automated solution of general mathematical models (PDEs). We started FEniCS in 2003; today it is a de facto world standard for mathematical FEM, with hundreds of co-authors at the highest level in academia.
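As a concrete illustration of the kind of model FEniCS automates, consider the classic Poisson problem used throughout the FEniCS tutorials; the variational (weak) form below is what is stated almost verbatim in code:

```latex
% Poisson's equation on a domain $\Omega$ with homogeneous Dirichlet data:
% find $u$ such that
\begin{align}
  -\Delta u &= f \quad \text{in } \Omega, \\
          u &= 0 \quad \text{on } \partial\Omega.
\end{align}
% Weak form: find $u \in V = H^1_0(\Omega)$ such that
\begin{equation}
  \int_\Omega \nabla u \cdot \nabla v \,\mathrm{d}x
  = \int_\Omega f \, v \,\mathrm{d}x
  \qquad \forall\, v \in V.
\end{equation}
```

Automating the step from this mathematical statement to a running finite element solver is precisely what makes the computation a publishable, re-runnable "digital object".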