Join us for this episode of the NIH All About Grants podcast to learn more about the format, frequency, and timing of instruction in the responsible conduct of research (RCR).
In earlier posts, like this one, we discussed the importance of moving towards “evidence-based funding.” NIH seeks to apply data-driven strategies to conceptualize, develop, implement, and evaluate policies, such as those that will affect the NIH-supported biomedical research workforce. Today, we’d like to spotlight a recently published analysis of an award program aimed at early-career investigators – a population that has received much attention at NIH and beyond in recent years.
We previously referenced Ioannidis and Khoury’s “PQRST” mnemonic for describing research impact: “P” is productivity, “Q” is quality, “R” is reproducibility, “S” is sharing, and “T” is translation. We wrote several blogs about “P,” productivity, focusing on publications, citations, and more recently the relative citation ratio. Now we’ll focus on a different kind of “P” for productivity, namely patents (which arguably are also related to “T” for translation). … Do NIH-supported papers that are cited by patents have a higher Relative Citation Ratio than those that are not cited by patents? As a refresher, the Relative Citation Ratio uses citation rates to measure the influence of a publication at the article level. … We identified 119,674 unique NIH grants that were funded between 1995 and 2007 and that generated at least one publication. …
Measuring the impact of NIH grants is an important input in our stewardship of research funding. One metric we can use to look at impact, discussed previously on this blog, is the relative citation ratio (or RCR). This measure – which NIH has made freely available through the iCite tool – aims to go further than just raw numbers of published research findings or citations, by quantifying the impact and influence of a research article both within the context of its research field and benchmarked against publications resulting from NIH R01 awards.
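To make the idea concrete, here is a minimal conceptual sketch of what field normalization means, not iCite’s actual algorithm: an article’s citation rate is divided by the expected rate for its field, then benchmarked so that NIH R01-funded papers average 1.0. The function name and inputs are illustrative assumptions.

```python
# Conceptual sketch only — NOT the iCite implementation.
# RCR ≈ (article citation rate / field-expected citation rate),
# benchmarked so R01-funded papers average 1.0.
def relative_citation_ratio(article_cites_per_year: float,
                            field_cites_per_year: float,
                            r01_benchmark: float = 1.0) -> float:
    field_normalized = article_cites_per_year / field_cites_per_year
    return field_normalized / r01_benchmark

# A paper cited twice as often as is typical for its field:
print(relative_citation_ratio(6.0, 3.0))  # → 2.0
```

An RCR of 1.0 thus means “as influential as the typical R01-funded paper in its field,” which is what makes the measure comparable across fields with very different citation norms.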
In light of our more recent posts on applications and resubmissions, we’d like to go a step further by looking at long-term bibliometric outcomes as a function of submission number. In other words, are there any observable trends in the impact of publications resulting from an NIH grant funded as an A0, versus those funded as an A1 or A2? And does that answer change when we take into account how much funding each grant received? ….
Many thanks for your terrific questions and comments to last month’s post, Research Commitment Index: A New Tool for Describing Grant Support. I’d like to use this opportunity to address a couple of key points brought up by a number of commenters; in later blogs, we’ll focus on other suggestions.
The two points I’d like to address here are: 1) why use log-transformed values when plotting output (annual weighted relative citation ratio, or annual RCR) against input (annual research commitment index, or annual RCI), and 2) what is meant by diminishing returns. ….
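The link between these two points can be sketched with simulated data (all values below are hypothetical, not NIH data): if output follows a power law of input, annual RCR ≈ k × (annual RCI)^b, then plotting the log-transformed values turns the relationship into a straight line whose slope is the exponent b, and a slope below 1 is precisely the signature of diminishing returns.

```python
import numpy as np

# Hypothetical illustration: simulate a power-law relation between
# input (annual RCI) and output (annual weighted RCR) with noise.
rng = np.random.default_rng(0)
annual_rci = rng.uniform(7, 70, size=500)               # simulated RCI points
annual_rcr = 0.5 * annual_rci**0.8 * rng.lognormal(0.0, 0.3, size=500)

# Fitting a line to the log-transformed values recovers the exponent b;
# b < 1 means each added unit of input buys less additional output.
b, log_k = np.polyfit(np.log(annual_rci), np.log(annual_rcr), 1)
print(f"fitted slope b = {b:.2f}")   # slope below 1 -> diminishing returns
```

This is why log-log axes are the natural choice here: on linear axes the same curve just looks vaguely concave, while on log axes the degree of diminishing returns is readable directly from the slope.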
On this blog we previously discussed ways to measure the value returned from research funding. Several of my colleagues and I, led by NIGMS director Jon Lorsch – chair of an NIH Working Group on Policies for Efficient and Stable Funding – conceived of a “Research Commitment Index,” or “RCI.” We focus on the grant activity code (R01, R21, P01, etc.) and ask ourselves about the kind of personal commitment it entails for the investigator(s). We start with the most common type of award, the R01, and assign it an RCI value of 7 points. Then, in consultation with our NIH colleagues, we assigned RCI values to other activity codes: fewer points for R03 and R21 grants, more points for P01 grants.
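The bookkeeping described above amounts to a lookup table plus a sum. In the sketch below, only the R01 = 7 anchor comes from the post; the other point values are illustrative placeholders, not NIH’s actual assignments.

```python
# Sketch of RCI bookkeeping. Only R01 = 7 is stated in the post;
# the other values are illustrative placeholders.
RCI_POINTS = {
    "R01": 7,   # anchor value stated in the post
    "R03": 4,   # illustrative: small grants get fewer points
    "R21": 5,   # illustrative
    "P01": 10,  # illustrative: program projects get more points
}

def annual_rci(activity_codes):
    """Sum RCI points over an investigator's concurrent awards."""
    return sum(RCI_POINTS[code] for code in activity_codes)

# An investigator holding one R01 and one R21 at the same time:
print(annual_rci(["R01", "R21"]))  # → 12 under these illustrative values
```

The point of the index is exactly this additivity: an investigator’s total commitment across concurrent awards can be summarized as a single annual number.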
Last April we posted a blog on the measurement of citation metrics as a function of grant funding. We focused on a group of R01 grants and described the association of a “citation percentile” measure with funding. We noted evidence of “diminishing returns” – that is, increased levels of funding were associated with decreasing increments of productivity – an observation that has been noted by others as well.
We were gratified by the many comments we received, through the blog and elsewhere. Furthermore, as I noted in a blog last month, our Office of Portfolio Analysis has released data on the “Relative Citation Ratio” (or RCR), a robust, field-normalized measure of the citation influence of a single article (and as I mentioned, a measure that is available to you for free).
In the follow-up analysis I’d like to share with you today, we focus on a cohort of 60,447 P01 and R01-equivalent grants (R01, R29, and R37) that were first funded between 1995 and 2009. Through the end of 2014, these grants yielded at least 654,607 papers. We calculated a “weighted RCR” value for each grant, ….
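Since many papers acknowledge more than one grant, aggregating article-level RCRs to the grant level requires some credit-splitting rule. The sketch below shows one plausible approach, splitting each paper’s RCR equally across the grants it acknowledges; the post does not spell out NIH’s exact weighting, so both the rule and the data here are illustrative assumptions.

```python
# Hedged sketch of one way to build a grant-level "weighted RCR":
# split each paper's RCR equally across the grants it acknowledges,
# then sum per grant. (Illustrative rule and made-up data.)
from collections import defaultdict

papers = [  # (paper RCR, grants acknowledging support)
    (2.0, ["R01-A"]),
    (1.5, ["R01-A", "P01-B"]),
    (0.5, ["P01-B"]),
]

weighted_rcr = defaultdict(float)
for rcr, grants in papers:
    share = rcr / len(grants)          # equal split among supporting grants
    for g in grants:
        weighted_rcr[g] += share

print(dict(weighted_rcr))  # → {'R01-A': 2.75, 'P01-B': 1.25}
```

Whatever the exact splitting rule, the resulting per-grant totals can then be compared against inputs such as funding level or RCI, which is what the analyses above do.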