Scientists have long considered the research process to be self-correcting; we trust that, even if scientists may sometimes make errors in the lab, those errors will eventually be discovered and corrected as others try to substantiate and extend original research findings. However, as stated in a commentary by NIH Director Francis Collins and NIH Deputy Director Larry Tabak, “A growing chorus of concern, from scientists and laypeople, contends that the complex system for ensuring the reproducibility of biomedical research is failing and is in need of restructuring.”
Several examples indicate that our processes have room for improvement. For example, a 2008 study by the Amyotrophic Lateral Sclerosis (ALS) Therapy Development Institute examined the preclinical effects of more than 70 drugs on the SOD1 mouse model of ALS, and showed that the probability of seeing an effect by chance alone is significant even with only ten animals per group. A 2011 study in Clinical Cancer Research authenticated 122 head and neck cancer cell lines and found that 37 of them were misidentified and/or contaminated. Many articles report results from experiments using animals of only one sex; a 2005 study in the Journal of Cerebral Blood Flow & Metabolism, however, found that male and female mice responded differently to treatment with the same inhibitor, a difference that was evident only when the data were disaggregated by sex.
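To see why small group sizes invite chance findings, consider a quick simulation. The sketch below is purely illustrative (the group size of ten matches the point above, but the survival figures, effect threshold, and trial counts are assumed for illustration, not taken from the ALS study): it repeatedly "tests" a drug that has no real effect, using two groups of ten animals drawn from the same distribution, and counts how often the group means differ noticeably by chance alone.

```python
import random
import statistics

random.seed(42)

def null_trial(n=10, mu=130.0, sigma=8.0, threshold=5.0):
    """Simulate one drug test with NO true effect: draw a control and a
    'treated' group of n animals from the same survival distribution,
    and report whether their means differ by more than `threshold` days
    purely by chance. All parameter values are assumed for illustration."""
    control = [random.gauss(mu, sigma) for _ in range(n)]
    treated = [random.gauss(mu, sigma) for _ in range(n)]
    return abs(statistics.mean(treated) - statistics.mean(control)) > threshold

trials = 10_000
chance_hits = sum(null_trial() for _ in range(trials)) / trials
print(f"fraction of null trials showing a >5-day 'effect': {chance_hits:.1%}")

# Even a modest per-test false-positive rate compounds when screening
# many drugs: with ~70 independent tests at a per-test rate p,
# P(at least one spurious hit) = 1 - (1 - p)^70.
p = 0.05
print(f"chance of >=1 false positive across 70 tests: {1 - (1 - p)**70:.1%}")
```

With these assumed numbers, a sizable fraction of no-effect trials still shows an apparent multi-day difference, and across a screen of roughly 70 drugs at least one spurious "hit" becomes nearly certain — which is why rigorous design elements such as adequate sample sizes, randomization, and blinding matter.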
Could we be doing better? For the past several years, NIH has been engaging in many conversations with its stakeholders to discuss how the research community can collectively address issues of scientific rigor and transparent reporting, with the ultimate goal of producing the highest quality science and preserving public trust in the research process. As you may recall, in the summer of 2014 NIH held a workshop to discuss the issue of reproducibility and rigor of research findings with scientific journal editors. The goal of this meeting was to identify opportunities in the scientific publishing arena to support research that is reproducible, robust, and transparent. Since then, many journal editors have come to a consensus on a set of principles for reporting preclinical research, focusing on rigorous and transparent reporting of study design elements (e.g., statistics, randomization, blinding), data and material sharing, consideration of refutations, and more.
While publishing standards are an important step in the right direction, we at NIH wanted to encourage the research community to think about scientific rigor and transparency throughout the life cycle of the research process, beginning with the planning of research projects. By clarifying NIH’s expectations regarding rigor and transparency, and how we would like to see these elements described in grant applications, we hope to raise community awareness of these critical issues, continuing efforts previously described by Dr. Sally Rockey and Dr. Larry Tabak. Through extensive trans-NIH discussions, we have refined our application instructions, progress reporting, and review language to reflect the following four areas: the scientific premise of the proposed research; rigor of experimental design; consideration of relevant biological variables, such as sex; and authentication of biological and chemical resources. We hope that these changes will prompt applicants to consider experimental design elements or variables that they may previously have overlooked or left unreported. Of course, we are very cognizant of concerns about administrative burden for both our applicants and reviewers, and we are implementing these changes in a way that minimizes that burden.
Last week, we published updated application instructions and review questions that apply to research grant and career development applications with due dates of January 25, 2016 and beyond. Specifically, these updates include:
- New research strategy instructions
- A separate attachment for you to discuss authentication of key biological and/or chemical resources
- Additional rigor and transparency questions reviewers will be asked to consider when reviewing applications
We encourage you to read these notices carefully and consider how the updated instructions will change how you think about your project and prepare your grant application. Updated instructions for fellowships and training grants will also be published in the NIH Guide in the coming months.
Will this emphasis on rigor stifle innovation and creativity? We’ve heard concerns echoing this idea, but we don’t think that this should be the case. Innovative ideas should still be grounded in a scientific premise, and innovative science should be designed, executed, and reported in a rigorous and transparent way. Creative approaches to research can be just as rigorous as, or even more rigorous than, current standard practice. One example comes from the world of clinical trials, where randomized registry trials preserve the rigor of randomization and large sample sizes while offering a way to conduct large-scale trials orders of magnitude more efficiently than standard practice.
Our office has made many resources, including extramural staff training material, available on the OER webpage for rigor & reproducibility. We hope you find these resources helpful, and agree that these changes are a step in the right direction to uphold the public’s faith in scientific research.
It is our intent to foster more productive science, and better science, while maintaining enough flexibility for innovation and creativity. After all, nothing in science can be considered useful unless we can trust that it was held to rigorous standards.