Resubmissions Revisited: Funded Resubmission Applications and Their Initial Peer Review Scores

February 17, 2017

“My first submission got an overall impact score of 30. Is that good enough? What’s the likelihood I’ll eventually get this funded?” Or: “My first submission was not even discussed. Now what? Does anyone with an undiscussed grant bother to resubmit? And what’s the likelihood I’ll eventually get this funded?” In a past blog we …

Following up on the Research Commitment Index as a Tool to Describe Grant Support

February 15, 2017

Many thanks for your terrific questions and comments to last month’s post, Research Commitment Index: A New Tool for Describing Grant Support. I’d like to use this opportunity to address a couple of key points brought up by a number of commenters; in later blogs, we’ll focus on other suggestions.

The two points I’d like to address here are: 1) why use log-transformed values when plotting output (annual weighted relative citation ratio, or annual RCR) against input (annual research commitment index, or annual RCI), and 2) what is meant by diminishing returns. ….
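To see why log transformation helps here, consider a minimal sketch with invented numbers: grant-support (RCI) and productivity (RCR) values are both right-skewed, so the post plots them on log scales, and the slope of the log-log relationship summarizes diminishing returns. The data below are purely illustrative, not NIH figures.

```python
import math

# Hypothetical annual RCI (input) and annual RCR (output) values for
# four investigators; both measures are right-skewed in practice, which
# is why the post works with log-transformed values.
rci = [7, 14, 21, 42]
rcr = [1.2, 2.0, 2.6, 3.5]

x = [math.log10(v) for v in rci]
y = [math.log10(v) for v in rcr]

# Least-squares slope on the log-log scale: a slope below 1 means that
# doubling the input less than doubles the output ("diminishing returns").
mx, my = sum(x) / len(x), sum(y) / len(y)
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
print(f"{slope:.2f}")
```

With these made-up numbers the fitted slope comes out below 1, the signature of diminishing returns on a log-log plot.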

Research Commitment Index: A New Tool for Describing Grant Support

January 26, 2017

On this blog we previously discussed ways to measure the value returned from research funding. Several of my colleagues and I, led by NIGMS director Jon Lorsch – chair of an NIH Working Group on Policies for Efficient and Stable Funding – conceived of a “Research Commitment Index,” or “RCI.” We focus on the grant activity code (R01, R21, P01, etc.) and ask ourselves what kind of personal commitment it entails for the investigator(s). We start with the most common type of award, the R01, and assign it an RCI value of 7 points. Then, in consultation with our NIH colleagues, we assigned RCI values to other activity codes: fewer points for R03 and R21 grants, more points for P01 grants.
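In code, the tally described above is just a lookup table summed over an investigator's active awards. Only the R01 value (7 points) is stated in the post; the other point values below are illustrative placeholders, not NIH's actual assignments.

```python
# RCI points per activity code. R01 = 7 comes from the post; the rest
# are hypothetical stand-ins ("fewer" for R03/R21, "more" for P01).
RCI_POINTS = {
    "R01": 7,   # stated in the post
    "R03": 4,   # hypothetical
    "R21": 5,   # hypothetical
    "P01": 10,  # hypothetical
}

def investigator_rci(activity_codes):
    """Sum RCI points over all of an investigator's active awards."""
    return sum(RCI_POINTS.get(code, 0) for code in activity_codes)

print(investigator_rci(["R01", "R01", "R21"]))  # -> 19
```

An investigator holding two R01s and an R21 would, under these placeholder values, carry an RCI of 19.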

Characteristics and Outcomes of R01 Competing Renewal Applications (“Type 2s”)

December 22, 2016

An investigator’s long-term success depends not only on securing funding, but on maintaining a stable funding stream. One way to assure continued funding is to submit a competing renewal application. However, as we noted earlier this year, while new investigators were almost as successful as experienced investigators in obtaining new (type 1) R01s, the gap between new-investigator and experienced-investigator success rates widens for competing renewals (type 2s): success rates of new investigators’ first renewals were lower than those of experienced investigators. In addition, we know that since the end of NIH’s budget doubling in 2003, success rates for competing renewals of research project grants overall have decreased. To further understand trends in success rates for R01 competing renewals, I’d like to share some additional analyses looking at characteristics of type 2 R01 applications, and the association of their criterion scores with overall impact score and funding outcomes.

How Many Researchers Were Supported by NIH as Trainees?

November 28, 2016

Earlier this year we reported on the unique numbers of research project grant (RPG) awardees and applicants each year since the end of the NIH doubling in 2003. We described how the number of unique RPG awardees has remained relatively constant, while the number of applicants (as assessed over 5-year windows) has steadily and markedly increased.
A number of readers asked us about the prior NIH-supported research training and career development of these investigators. Among RPG awardees, what proportion had received prior fellowship, training, or career development (F, T, or K) awards? And perhaps of greater interest, among unsuccessful, unfunded applicants, what proportion had received prior fellowship, training or career awards?
To answer these questions, we start with a quick recap. ….

R01 and R21 Applications & Awards: Trends and Relationships Across NIH

November 4, 2016

As described on our grants page, the R21 activity code “is intended to encourage exploratory/developmental research by providing support for the early and conceptual stages of project development.” NIH seeks applications for “exploratory, novel studies that break new ground,” for “high-risk, high-reward studies,” and for projects that are distinct from those that would be funded by the traditional R01. R21 grants are short in duration (project periods of up to 2 years) and lower in budget than most R01s (the combined budget over two years cannot exceed $275,000 in direct costs). NIH institutes and centers (ICs) approach the R21 mechanism in variable ways: 18 ICs accept investigator-initiated R21 applications in response to the parent R21 funding opportunity, while 7 ICs only accept R21 applications in response to specific funding opportunity announcements. As mentioned in a 2015 Rock Talk blog, we at NIH are interested in trends in R01s in comparison to other research project grants, so today I’d like to continue and expand on looking at R01 and R21 trends across NIH’s extramural research program. ….

Are You On the Fence About Whether to Resubmit?

October 28, 2016

When applicants receive the summary statement resulting from the review of an application that was assigned a score outside of the IC’s funding range, there are important decisions to be made that, ideally, should be based upon evidence. What is the likelihood that an application like this one will be funded? If I resubmit the application, what changes might improve the chances for a successful resubmission?

Recall that in 2014, NIH relaxed its resubmission policy (OD-14-074) to allow applicants to submit a new (A0) application following an unsuccessful resubmission application. Also, we recently posted a piece showing that review outcomes for new applications submitted following an unsuccessful resubmission had about the same funding success as other new applications. But some applicants may wonder, what is the funding success for a resubmission application? ….

Applying the Relative Citation Ratio as a Measure of Grant Productivity

October 21, 2016

Last April we posted a blog on the measurement of citation metrics as a function of grant funding. We focused on a group of R01 grants and described the association of a “citation percentile” measure with funding. We noted evidence of “diminishing returns” – that is, increased levels of funding were associated with decreasing increments in productivity – an observation that has been noted by others as well.

We were gratified by the many comments we received, through the blog and elsewhere. Furthermore, as I noted in a blog last month, our Office of Portfolio Analysis has released data on the “Relative Citation Ratio” (or RCR), a robust field-normalized measure of citation influence (and, as I mentioned, a measure that is available to you for free).

In the follow-up analysis I’d like to share with you today, we focus on a cohort of 60,447 P01 and R01-equivalent grants (R01, R29, and R37) that were first funded between 1995 and 2009. Through the end of 2014, these grants yielded at least 654,607 papers. We calculated a “weighted RCR” value for each grant, ….
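One plausible way to compute a per-grant "weighted RCR" is sketched below. This is an assumption for illustration, not necessarily the exact method used in the analysis: each paper's RCR is split evenly among the grants that supported it, then summed per grant. The grant IDs and RCR values are invented.

```python
from collections import defaultdict

# (paper RCR, supporting grant IDs) - illustrative data only.
papers = [
    (2.0, ["R01-A"]),
    (1.5, ["R01-A", "P01-B"]),  # co-funded paper: RCR split across grants
    (0.9, ["P01-B"]),
]

# Credit each grant with its share of every paper's RCR.
weighted_rcr = defaultdict(float)
for rcr, grants in papers:
    for g in grants:
        weighted_rcr[g] += rcr / len(grants)

print(dict(weighted_rcr))  # R01-A: 2.75, P01-B: 1.65
```

Splitting credit this way keeps the total RCR across grants equal to the total RCR across papers, so co-funded work is not double-counted.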

Measuring Impact of NIH-supported Publications with a New Metric: the Relative Citation Ratio

September 8, 2016

In previous blogs, we talked about citation measures as one metric for scientific productivity. Raw citation counts are inherently problematic – different fields cite at different rates, and citation counts rise and fall in the months to years after a publication appears. Therefore, a number of bibliometric scholars have focused on developing methods that measure citation impact while also accounting for field of study and time of publication. We are pleased to report that on September 6, PLoS Biology published a paper from our NIH colleagues in the Office of Portfolio Analysis on “The Relative Citation Ratio: A New Metric that Uses Citation Rates to Measure Influence at the Article Level.” Before we delve into the details and look at some real data, ….
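The core idea can be shown in a highly simplified sketch. The actual method in the PLoS Biology paper derives the field benchmark from a paper's co-citation network; here that benchmark is simply supplied as a number, and the figures are invented.

```python
def relative_citation_ratio(citations, years_since_pub, field_rate):
    """Article citations per year, normalized by the field's expected
    citations per year (the co-citation-network benchmark in the real
    method; passed in directly in this toy version)."""
    article_rate = citations / years_since_pub
    return article_rate / field_rate

# A paper with 30 citations over 5 years, in a field averaging
# 3 citations/year, cites at twice its field's expected rate:
print(relative_citation_ratio(30, 5, 3.0))  # -> 2.0
```

An RCR of 1.0 means a paper is cited at its field's expected rate; values above 1.0 indicate above-average influence, independent of how citation-heavy the field is.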

Model Organisms, Part 3: A Look at All RPGs for Six Models

August 24, 2016

We are most appreciative of the feedback we’ve received, through the blog and elsewhere, on NIH support of model organism research. In part 1 of this series, we mentioned that we asked two separate groups to analyze NIH applications and awards. In parts 1 and 2 we primarily focused on R01-based data that were curated and analyzed by our Office of Portfolio Analysis. In part 3, we show results from a broader range of research project grant (RPG) data that were prepared and analyzed by our Office of Research Information Systems. This group used an automated thesaurus-based text mining system that delves not only into public data such as project titles, abstracts, and public health relevance statements, but also into the specific aims contained in RPG applications. ….
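At its simplest, thesaurus-based text mining of this kind means matching a curated list of synonyms against application text. The toy version below is only an illustration; the actual NIH system and its thesaurus are far more sophisticated, and the terms here are invented examples.

```python
# Toy thesaurus: each model organism maps to a set of synonym strings.
THESAURUS = {
    "zebrafish": {"zebrafish", "danio rerio"},
    "fruit fly": {"fruit fly", "drosophila", "d. melanogaster"},
}

def match_models(text):
    """Return the model organisms whose thesaurus terms appear in the text."""
    lowered = text.lower()
    return {model for model, terms in THESAURUS.items()
            if any(term in lowered for term in terms)}

print(match_models("We will use Danio rerio larvae and Drosophila lines."))
```

Running the matcher over titles, abstracts, and specific aims, an analysis like the one described can tag each application with the model organisms it mentions, even when the title alone gives no hint.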