New NIH Resource to Analyze COVID-19 Literature: The COVID-19 Portfolio Tool

April 15, 2020

In the past few months, the scientific community has ramped up research in response to the SARS‑CoV‑2 pandemic; dozens of peer-reviewed articles and preprints on this topic are being added to the literature every day (Figure 1). This rapidly expanding effort has created challenges for scientists and the medical community who need to analyze thousands of scholarly articles for insights on the virus.

Predicting Translational Progress from Citations of NIH-Supported Fundamental Research

December 17, 2019

By looking to the past, we may be able to better understand the flow of scientific knowledge going forward, and possibly even predict translational research outcomes. In their October PLOS Biology paper, Drs. Ian Hutchins and George Santangelo from the NIH’s Office of Portfolio Analysis devised a machine-learning strategy that taps into the trajectory of science by tracking knowledge flow from bench to bedside.

New NIH Resource to Analyze Biomedical Research Citations: The Open Citation Collection

October 16, 2019

My colleagues within the NIH Office of Portfolio Analysis sought to answer this call. Drs. Ian Hutchins and George Santangelo embarked on a hefty bibliometric endeavor over the past several years to curate biomedical citation data. They aggregated over 420 million citation links from sources like Medline, PubMed Central, Entrez, CrossRef, and other unrestricted, open-access datasets. With this information in hand, we can now get a clearer view of the relationships between basic and applied research, of how a researcher’s works are cited, and of ways to make large-scale analyses of citation metrics easier and free.
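For readers who want to work with these citation links directly, the same data are exposed programmatically through iCite. Below is a minimal sketch, assuming the public iCite API endpoint and JSON field names shown (relative_citation_ratio, cited_by, references); check the current iCite documentation before building an analysis pipeline on it.

```python
# Minimal sketch: pull citation links and RCR values for a few PubMed IDs
# from the iCite web service (assumed endpoint and field names).
import requests

ICITE_URL = "https://icite.od.nih.gov/api/pubs"  # assumed endpoint


def fetch_icite_records(pmids):
    """Return iCite records for a list of PubMed IDs."""
    resp = requests.get(ICITE_URL, params={"pmids": ",".join(map(str, pmids))})
    resp.raise_for_status()
    return resp.json().get("data", [])


if __name__ == "__main__":
    # Placeholder PMID; replace with PubMed IDs of interest.
    for rec in fetch_icite_records([12345678]):
        print(rec.get("pmid"),
              rec.get("relative_citation_ratio"),
              len(rec.get("cited_by", [])),    # papers citing this article
              len(rec.get("references", [])))  # papers this article cites
```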

Teaming with ORCID to Reduce Burden and Improve Transparency

November 15, 2017

As you know, our NIH Strategic Plan articulated an objective to “excel as a federal science agency by managing for results,” and to manage for results we must harness the power of data to drive evidence-based policies. Sometimes, however, our world can be complicated by requirements to enter the same types of data over and over again in one system after another. These situations do have an upside: they prompt us to look for ways to simplify.

Patents and the Relative Citation Ratio: Correlations to Assess NIH Impact

September 18, 2017

We previously referenced Ioannidis and Khoury’s “PQRST” mnemonic for describing research impact: “P” is productivity, “Q” is quality, “R” is reproducibility, “S” is sharing, and “T” is translation. We wrote several blogs about “P,” productivity, focusing on publications, citations, and, more recently, the relative citation ratio. Now we’ll focus on a different kind of “P” for productivity, namely patents (which arguably are also related to “T” for translation). …. Do NIH-supported papers that are cited by patents have a higher Relative Citation Ratio than those that are not cited by patents? As a refresher, the Relative Citation Ratio uses citation rates to measure the influence of a publication at the article level…. We identified 119,674 unique NIH grants that were funded between 1995 and 2007 and that generated at least one publication….
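As a rough sketch of that refresher (our notation; the published method estimates the expected rate from each article’s co-citation network), the RCR compares an article’s observed citation rate to the rate expected for its field:

\[
\mathrm{RCR} \;=\; \frac{\text{article citation rate (citations per year)}}{\text{expected citation rate for the article's field}},
\]

with the benchmark scaled so that the NIH R01-funded portfolio is centered at an RCR of roughly 1. Asking whether patent-cited papers are more influential then amounts to comparing the RCR distributions of papers that are, and are not, cited by patents.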

Applications, Resubmissions, and the Relative Citation Ratio

April 25, 2017

Measuring the impact of NIH grants is an important part of our stewardship of research funding. One metric we can use to look at impact, discussed previously on this blog, is the relative citation ratio (or RCR). This measure, which NIH has made freely available through the iCite tool, aims to go beyond raw counts of publications or citations by quantifying the impact and influence of a research article both within the context of its research field and benchmarked against publications resulting from NIH R01 awards.

In light of our more recent posts on applications and resubmissions, we’d like to go a step further by looking at long-term bibliometric outcomes as a function of submission number. In other words, are there any observable trends in the impact of publications resulting from an NIH grant funded as an A0, versus those funded as an A1 or A2? And does that answer change when we take into account how much funding each grant received? ….

Following up on the Research Commitment Index as a Tool to Describe Grant Support

February 15, 2017

Many thanks for your terrific questions and comments on last month’s post, Research Commitment Index: A New Tool for Describing Grant Support. I’d like to use this opportunity to address a couple of key points raised by a number of commenters; in later blogs, we’ll focus on other suggestions.

The two points I’d like to address here are: 1) why use log-transformed values when plotting output (annual weighted relative citation ratio, or annual RCR) against input (annual research commitment index, or annual RCI), and 2) what is meant by diminishing returns. ….
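A schematic way to see both points (our notation, not taken from the post): suppose output scales as a power of input,

\[
\text{annual weighted RCR} \;\approx\; a \,(\text{annual RCI})^{\,b}.
\]

Taking logs turns this into a straight line, \(\log(\text{output}) \approx \log a + b\,\log(\text{input})\), which is why log-transformed axes make the relationship easier to fit and to read. “Diminishing returns” then corresponds to a fitted slope \(b < 1\): each additional unit of grant support is associated with a smaller increment of citation output than the one before.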

Applying the Relative Citation Ratio as a Measure of Grant Productivity

October 21, 2016

Last April we posted a blog on the measurement of citation metrics as a function of grant funding. We focused on a group of R01 grants and described the association of a “citation percentile” measure with funding. We noted evidence of “diminishing returns”: increased levels of funding were associated with decreasing increments of productivity, an observation that others have made as well.

We were gratified by the many comments we received, through the blog and elsewhere. Furthermore, as I noted in a blog last month, our Office of Portfolio Analysis has released data on the “Relative Citation Ratio” (or RCR), a robust, field-normalized measure of the citation influence of a single article (and, as I mentioned, a measure that is available to you for free).

In the follow-up analysis I’d like to share with you today, we focus on a cohort of 60,447 P01 and R01-equivalent grants (R01, R29, and R37) which were first funded between 1995 and 2009. Through the end of 2014, these grants yielded at least 654,607 papers. We calculated a “weighted RCR” value for each grant, ….
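One natural reading of a grant-level “weighted RCR” (an assumption on our part; the post’s exact attribution rules may differ) is the sum of the article-level RCRs of the papers a grant supported:

\[
\text{weighted RCR}_g \;=\; \sum_{p \,\in\, \text{papers}(g)} \mathrm{RCR}_p ,
\]

so a grant is credited both for how many papers it produced and for how influential each of those papers was.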

Measuring Impact of NIH-supported Publications with a New Metric: the Relative Citation Ratio

September 8, 2016

In previous blogs, we talked about citation measures as one metric for scientific productivity. Raw citation counts are inherently problematic: different fields cite at different rates, and citation counts rise and fall in the months to years after a publication appears. Therefore, a number of bibliometric scholars have focused on developing methods that measure citation impact while also accounting for field of study and time of publication. We are pleased to report that on September 6, PLOS Biology published a paper from our NIH colleagues in the Office of Portfolio Analysis on “The Relative Citation Ratio: A New Metric that Uses Citation Rates to Measure Influence at the Article Level.” Before we delve into the details and look at some real data, ….
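To make the field- and time-normalization idea concrete before reading the paper, here is a toy sketch in Python. It is our illustration, not the authors’ algorithm: the published method derives the expected citation rate from each article’s co-citation network and benchmarks it against NIH R01-funded publications.

```python
# Toy sketch of a field- and time-normalized citation ratio.
# NOT the published RCR algorithm; the numbers below are hypothetical.

def citations_per_year(total_citations: int, years_since_publication: float) -> float:
    """Article citation rate: citations accrued per year since publication."""
    return total_citations / max(years_since_publication, 1.0)


def normalized_ratio(article_rate: float, expected_field_rate: float) -> float:
    """Observed citation rate divided by the rate expected for the field."""
    return article_rate / expected_field_rate


if __name__ == "__main__":
    # Hypothetical article: 45 citations over 5 years, in a field whose
    # comparable papers average 6 citations per year.
    acr = citations_per_year(45, 5.0)       # 9.0 citations per year
    print(normalized_ratio(acr, 6.0))       # 1.5: above the field's expectation
```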