R01 and R21 Applications & Awards: Trends and Relationships Across NIH

As described on our grants page, the R21 activity code “is intended to encourage exploratory/developmental research by providing support for the early and conceptual stages of project development.” NIH seeks applications for “exploratory, novel studies that break new ground,” for “high-risk, high-reward studies,” and for projects distinct from those that would be funded by the traditional R01. R21 grants are short in duration (project periods of up to two years) and lower in budget than most R01s (the combined budget over two years cannot exceed $275,000 in direct costs). NIH institutes and centers (ICs) approach the R21 mechanism in different ways: 18 ICs accept investigator-initiated R21 applications in response to the parent R21 funding opportunity, while 7 ICs accept R21 applications only in response to specific funding opportunity announcements. As mentioned in a 2015 Rock Talk blog, we at NIH are interested in how R01 trends compare with those of other research project grants, so today I’d like to continue and expand that analysis by looking at R01 and R21 trends across NIH’s extramural research program. …
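
One simple way to explore such trends from the outside is to count awards by activity code and fiscal year in NIH’s public ExPORTER files. Below is a minimal sketch in Python, assuming local copies of the ExPORTER project CSVs and the ACTIVITY and FY column names used in recent releases (adjust file and column names to match the files you download); note that ExPORTER contains funded projects, so this counts awards rather than applications.

import pandas as pd

# Hypothetical local copies of NIH ExPORTER project files, one per fiscal year.
files = ["RePORTER_PRJ_C_FY2013.csv",
         "RePORTER_PRJ_C_FY2014.csv",
         "RePORTER_PRJ_C_FY2015.csv"]
projects = pd.concat(
    (pd.read_csv(f, encoding="latin-1", low_memory=False) for f in files),
    ignore_index=True)

# Keep R01 and R21 awards and count them per fiscal year and activity code.
subset = projects[projects["ACTIVITY"].isin(["R01", "R21"])]
print(subset.groupby(["FY", "ACTIVITY"]).size().unstack(fill_value=0))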

Are You On the Fence About Whether to Resubmit?

When applicants receive the summary statement resulting from the review of an application that was assigned a score outside of the IC’s funding range, there are important decisions to be made that, ideally, should be based on evidence. What is the likelihood that an application like this one will be funded? If I resubmit the application, what changes might improve the chances for a successful resubmission?

Recall that in 2014, NIH relaxed its resubmission policy (NOT-OD-14-074) to allow applicants to submit a new (A0) application following an unsuccessful resubmission. We also recently posted a piece showing that new applications submitted following an unsuccessful resubmission had about the same funding success as other new applications. But some applicants may wonder: what is the funding success rate for a resubmission application? …
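
As a toy illustration of the comparison at stake, the sketch below computes award rates by submission type from a handful of made-up application records; the real analysis relies on NIH’s internal application data, since unfunded applications never appear in public ExPORTER files.

import pandas as pd

# Made-up application records (unfunded applications are not public data).
apps = pd.DataFrame({
    "appl_id":   [1, 2, 3, 4, 5, 6, 7, 8],
    "amendment": ["A0", "A0", "A0", "A0", "A1", "A1", "A1", "A1"],
    "funded":    [False, True, False, False, True, False, True, False],
})

# Award rate by submission type: the share of each group that was funded.
print(apps.groupby("amendment")["funded"].mean())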

Applying the Relative Citation Ratio as a Measure of Grant Productivity

Last April we posted a blog on the measurement of citation metrics as a function of grant funding. We focused on a group of R01 grants and described the association of a “citation percentile” measure with funding. We noted evidence of “diminishing returns” – that is, increased levels of funding were associated with decreasing increments of productivity – an observation that has been noted by others as well.

We were gratified by the many comments we received, through the blog and elsewhere. Furthermore, as I noted in a blog last month, our Office of Portfolio Analysis has released data on the “Relative Citation Ratio” (or RCR), a robust, field-normalized measure of the citation influence of a single article (and, as I mentioned, a measure that is available to you for free).

In the follow-up analysis I’d like to share with you today, we focus on a cohort of 60,447 P01 and R01-equivalent grants (R01, R29, and R37) that were first funded between 1995 and 2009. Through the end of 2014, these grants yielded at least 654,607 papers. We calculated a “weighted RCR” value for each grant, …
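
A minimal sketch of one natural reading of that calculation: if a grant’s weighted RCR is the sum of the RCRs of the papers it supported, the measure rewards both output and per-paper influence. The grant-to-paper link table below is invented for illustration; per-paper RCR values themselves are freely available from NIH’s iCite service.

import pandas as pd

# Illustrative grant-to-paper links; "rcr" holds per-paper Relative Citation
# Ratios (available free from iCite; the linkage here is made up).
papers = pd.DataFrame({
    "grant_id": ["R01-A", "R01-A", "R01-A", "R01-B", "R01-B"],
    "pmid":     [111, 222, 333, 444, 555],
    "rcr":      [0.8, 2.4, 1.1, 0.5, 3.0],
})

# Weighted RCR per grant: summing paper-level RCRs reflects both how many
# papers a grant yielded and how influential each one was.
print(papers.groupby("grant_id")["rcr"].sum())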

Measuring Impact of NIH-supported Publications with a New Metric: the Relative Citation Ratio

In previous blogs, we talked about citation measures as one metric of scientific productivity. Raw citation counts are inherently problematic – different fields cite at different rates, and citation counts rise and fall in the months to years after a publication appears. A number of bibliometric scholars have therefore focused on developing methods that measure citation impact while accounting for field of study and time of publication. We are pleased to report that on September 6, PLoS Biology published a paper from our NIH colleagues in the Office of Portfolio Analysis, “The Relative Citation Ratio: A New Metric that Uses Citation Rates to Measure Influence at the Article Level.” Before we delve into the details and look at some real data, …
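
The core idea can be sketched in a few lines: divide an article’s citations per year by the citation rate expected for its field. In the published method, the field rate comes from the article’s co-citation network, and the ratio is benchmarked so that a typical NIH R01-funded paper scores 1.0; the toy function below glosses over both of those steps.

def relative_citation_ratio(citations, years_since_pub, field_citation_rate):
    """Toy RCR: the article's citations per year (its ACR) divided by the
    expected citations per year for its field (its FCR). The real method
    derives the FCR from the paper's co-citation network and benchmarks
    the ratio against NIH R01-funded papers; both steps are omitted here."""
    acr = citations / years_since_pub
    return acr / field_citation_rate

# A paper cited 40 times over 5 years, in a field averaging 4 citations/year,
# scores 2.0: roughly "twice the influence typical of its field."
print(relative_citation_ratio(40, 5, 4.0))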

Model Organisms, Part 3: A Look at All RPGs for Six Models

We are most appreciative of the feedback we’ve received, through the blog and elsewhere, on NIH support of model organism research. In part 1 of this series, we mentioned that we asked two separate groups to analyze NIH applications and awards. In parts 1 and 2, we focused primarily on R01-based data that were curated and analyzed by our Office of Portfolio Analysis. In part 3, we show results from a broader range of research project grant (RPG) data that were prepared and analyzed by our Office of Research Information Systems. This group used an automated, thesaurus-based text-mining system that mines not only public data (project titles, abstracts, and public health relevance statements) but also the specific aims contained in RPG applications. …
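
To make the thesaurus-based idea concrete, here is a deliberately tiny sketch: the production system uses a large curated thesaurus with weighting and disambiguation rules, while this version simply flags a project when any synonym for a concept appears in its combined text. The two concepts and their synonym lists are invented for illustration.

# Invented mini-thesaurus mapping concepts to synonym lists.
THESAURUS = {
    "Drosophila": ["drosophila", "fruit fly", "d. melanogaster"],
    "Zebrafish":  ["zebrafish", "danio rerio"],
}

def tag_project(title, abstract, specific_aims):
    """Return the concepts whose synonyms appear anywhere in the project text."""
    text = " ".join((title, abstract, specific_aims)).lower()
    return [concept for concept, terms in THESAURUS.items()
            if any(term in text for term in terms)]

print(tag_project("Wing patterning genes",
                  "We study development in Drosophila melanogaster.",
                  "Aim 1: map enhancers active in the wing disc."))
# -> ['Drosophila']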

The Predictive Nature of Criterion Scores on Impact Score and Funding Outcomes

To develop and implement data-driven policy, we need to analyze our data carefully to understand the “stories behind our metrics.” Without analyzing our data to know what’s going on, we’re essentially flying blind! A group of authors from the NIH Office of Extramural Research sought to investigate the stories behind peer review scoring and why some grant applications are more likely to be funded than others, extending analyses previously reported by NIH’s Office of Extramural Research and the National Institute of General Medical Sciences. Last month, they published their analysis of over 123,000 competing R01 applications, describing the correlations of the individual peer review criterion scores (significance, investigator(s), innovation, approach, and environment) with the subsequent overall impact score and funding outcome. …
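
The flavor of such an analysis can be sketched with an ordinary least-squares fit of overall impact scores on the five criterion scores. The numbers below are fabricated for illustration only; the published work analyzed more than 123,000 applications and reported correlations, not this toy fit.

import numpy as np

# Fabricated criterion scores (1 = best, 9 = worst) and overall impact scores.
# Columns: significance, investigator, innovation, approach, environment.
X = np.array([
    [2, 2, 3, 2, 1],
    [4, 3, 4, 5, 2],
    [3, 2, 2, 3, 2],
    [6, 5, 6, 7, 4],
    [2, 1, 3, 2, 1],
    [5, 4, 5, 6, 3],
    [7, 6, 7, 8, 5],
    [3, 3, 2, 4, 2],
], dtype=float)
impact = np.array([20, 40, 25, 60, 18, 50, 72, 30], dtype=float)

# Least squares with an intercept: which criteria track the impact score?
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, impact, rcond=None)
names = ["intercept", "significance", "investigator",
         "innovation", "approach", "environment"]
for name, b in zip(names, coef):
    print(f"{name:>12}: {b:+.2f}")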

A Look at Trends in NIH’s Model Organism Research Support

Wangler et al. recently published an article in Genetics on NIH funding for model organism research involving Drosophila. The authors extracted grant information from NIH ExPORTER and looked for the word “Drosophila” in either the title or the abstract. Using this approach, the authors found that NIH support for Drosophila-based research is declining.
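
Their approach is straightforward to reproduce in outline. The sketch below assumes local copies of the ExPORTER files for one fiscal year, where project records and abstracts ship as separate files keyed by APPLICATION_ID; the file and column names shown match recent ExPORTER releases but may need adjusting.

import pandas as pd

# Hypothetical local copies of ExPORTER files for one fiscal year; adjust
# file and column names to match the release you download.
proj = pd.read_csv("RePORTER_PRJ_C_FY2014.csv",
                   encoding="latin-1", low_memory=False)
abst = pd.read_csv("RePORTER_PRJABS_C_FY2014.csv",
                   encoding="latin-1", low_memory=False)
merged = proj.merge(abst, on="APPLICATION_ID", how="left")

# Flag projects mentioning Drosophila in the title or abstract, count per FY.
text = (merged["PROJECT_TITLE"].fillna("") + " " +
        merged["ABSTRACT_TEXT"].fillna("")).str.lower()
print(merged.assign(fly=text.str.contains("drosophila"))
            .groupby("FY")["fly"].sum())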

We chose to investigate trends in NIH support for Drosophila and other model organism research further. Two groups of NIH staff used two different approaches. Our Office of Research Information Systems (ORIS) used an automated, thesaurus-based text-mining system that mines not only project titles and abstracts but also the specific aims contained in the application; this is the system we use to generate “Research, Condition, and Disease Categorization” (or RCDC) tables, which are publicly posted to the NIH RePORT website. In a separate effort, our Office of Portfolio Analysis (OPA) supplemented a different text-mining algorithm with extensive manual curation. Both methods – the wholly automated thesaurus-based text-mining approach and the manually curated text-mining approach – yielded similar findings. In this blog, we present the results of the manually curated approach. …

Outcomes for R01 “Virtual A2s”

A few months ago, a researcher told me about his experiences with the relatively new NIH policy under which investigators are allowed to submit what we have come to call “virtual A2s.” Under NIH’s previous single-resubmission policy, if an investigator’s de novo R01 grant application (called an “A0”) was not funded, they had one chance to submit a revision (called an “A1”). If the A1 application was unsuccessful, the applicant was required to make significant changes to the application compared to the previous submissions; NIH took measures to turn away subsequent submissions that were materially similar to the unfunded A1. Under NIH’s current policy, investigators may resubmit a materially similar application as a new submission after the A1. We call these applications “virtual A2s.” The researcher told me that his virtual A2 did not fare well: although his A0 and A1 had received good scores (though not good enough for funding), the virtual A2 was not discussed. He wondered: just how likely is it for a virtual A2 to be successful? …

How Many Researchers are Seeking SBIR/STTR Funding?

We were pleased to see the interest in our recent blog on the number of unique investigators applying for and receiving NIH research project grants (RPGs). Some of you (through the blog page or through other media) have asked whether we have similar data for our Small Business Innovation Research (SBIR) and Small Business Technology Transfer (STTR) programs. We have generated analogous figures for SBIR and STTR grants, and today’s post shares this investigation of the question, “How many unique researchers are seeking SBIR/STTR funding?” …
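
Counting unique applicants is a one-line aggregation once each application carries a persistent investigator identifier. The records below are fabricated (application-level data, including unfunded applications, are not public); the point is simply the distinct count per fiscal year.

import pandas as pd

# Fabricated SBIR/STTR application records with a persistent investigator ID.
apps = pd.DataFrame({
    "fy":    [2013, 2013, 2013, 2014, 2014, 2014, 2014],
    "pi_id": ["p1", "p2", "p1", "p1", "p3", "p3", "p4"],
})

# Unique researchers seeking funding each year: distinct PI IDs per fiscal year.
print(apps.groupby("fy")["pi_id"].nunique())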
