The American Innovation and Competitiveness Act was passed unanimously by the U.S. Congress and signed into law by President Obama in January 2017.

The law contains a section called Research Reproducibility and Replication, which directed the Director of the National Science Foundation to enter into an agreement with the National Research Council to prepare a report on issues related to research reproducibility and “to make recommendations for improving rigor and transparency in scientific research”.

To fulfill this requirement, a consensus report of the National Academies of Sciences, Engineering, and Medicine, Reproducibility and Replicability in Science, was published in 2019. The report was summarized in a special issue of the Harvard Data Science Review in December 2020.

Among the recommendations:

All researchers should include a clear, specific, and complete description of how the reported results were reached. Reports should include details appropriate for the type of research, including:

  • a clear description of all methods, instruments, materials, procedures, measurements, and other variables involved in the study;
  • a clear description of the analysis of data and decisions for exclusion of some data or inclusion of other data;
  • for results that depend on statistical inference, a description of the analytic decisions and when these decisions were made and whether the study is exploratory or confirmatory;
  • a discussion of the expected constraints on generality, such as which methodological features the authors think could be varied without affecting the result and which must remain constant;
  • reporting of precision or statistical power (a brief sketch of a power calculation follows this list); and
  • discussion of the uncertainty of the measurements, results, and inferences.
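As a purely illustrative sketch of what reporting statistical power can look like in practice (the numbers below are hypothetical and not taken from the report), the following Python snippet uses statsmodels to compute the sample size needed for a target power and the power achieved by a given sample size:

```python
from statsmodels.stats.power import TTestIndPower

# Hypothetical two-group design: expected standardized effect size
# (Cohen's d) of 0.5 and a two-sided significance level of 0.05.
analysis = TTestIndPower()

# Sample size per group required to reach 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8,
                                    alternative="two-sided")
print(f"Participants needed per group for 80% power: {n_per_group:.0f}")

# Power actually achieved with the sample that was collected (say, 50 per group),
# which can be reported alongside the result.
achieved_power = analysis.solve_power(effect_size=0.5, nobs1=50, alpha=0.05,
                                      alternative="two-sided")
print(f"Achieved power with 50 participants per group: {achieved_power:.2f}")
```

Reporting either number, together with the assumed effect size, lets readers judge how informative a null or positive result is.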

Funding agencies and organizations should consider investing in research and development of open-source, usable tools and infrastructure that support reproducibility for a broad range of studies across different domains in a seamless fashion. Concurrently, investments in outreach to inform and train researchers on best practices and on how to use these tools would be helpful.

Journals should consider ways to ensure computational reproducibility for publications that make claims based on computations, to the extent ethically and legally possible.
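To make computational reproducibility concrete, here is a minimal, hypothetical Python sketch (the file name, recorded fields, and stand-in analysis are assumptions for illustration, not requirements from the report) that saves a result together with the fixed random seed and a machine-readable record of the software environment needed to rerun it:

```python
import json
import platform
import random
import sys
from importlib.metadata import distributions

SEED = 20190501  # fixed seed so the stochastic parts of the analysis rerun identically


def run_analysis(seed: int) -> float:
    """Stand-in for the actual computation behind a published claim."""
    random.seed(seed)
    sample = [random.gauss(0.0, 1.0) for _ in range(1_000)]
    return sum(sample) / len(sample)


result = run_analysis(SEED)

# Record the result together with everything needed to reproduce it:
# the seed, the interpreter, the operating system, and the installed packages.
provenance = {
    "result": result,
    "seed": SEED,
    "python": sys.version,
    "platform": platform.platform(),
    "packages": sorted(f"{d.metadata['Name']}=={d.version}" for d in distributions()),
}

with open("provenance.json", "w") as fh:
    json.dump(provenance, fh, indent=2)
```

Archiving such a provenance file (or a container image or lock file serving the same purpose) alongside the code and data is one low-cost way a journal could check that reported numbers can be regenerated.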