Measuring the impact of research

From securing grants to demonstrating the societal value of science, measuring impact is crucial

In today’s world, as global challenges mount daily and economic resources decline, measuring the impact of research is becoming increasingly important. Whether you’re a scientist applying for grants or seeking collaborators; a university leader guiding research strategy or decisions about hiring, promotion and tenure; a government official deciding which research to fund; a policymaker justifying research investment to taxpayers; or a CEO seeking to support the UN Sustainable Development Goals as part of your organization’s growth, it’s crucial to understand the impact research has on your community and beyond.

Here, four members of the research community share their thoughts on research impact and how they measure it.

How research intelligence can help universities maximize their impact

With Dr David Weindorf | VP, Research and Innovation, Central Michigan University

© istock.com/SolStock

Dr David Weindorf, Vice President for Research and Innovation at Central Michigan University, speaks at a graduation ceremony.

Watch a video about Elsevier's Research Intelligence portfolio.

As Vice President for Research and Innovation at Central Michigan University, Dr David Weindorf has a challenging role.

“Sometimes I jokingly tell people I should list firefighter on my CV,” he quipped, “because it seems like I’m just dealing with whatever fire came up today!”

One of the greatest challenges he faces, as at many institutions in the US, is dwindling resources. State budgets for higher education continue to be stretched thin, making them difficult to rely on as sources of funding. As a result, David must take care to spend every dollar in the most impactful way while balancing the university’s commitment to equity and equality and its provision of services to faculties across the institution.

This is where research intelligence tools come in.

Research intelligence gives us new ways of looking at how we engage not only within our own university, but how we choose to partner with other universities.
Dr David Weindorf, VP for Research & Innovation, Central Michigan University

Finding impactful partnerships in not-so-obvious places

It can be tempting to focus resources on STEM and medical fields, but non-STEM fields are also valuable and essential to the academy writ large. Using research intelligence tools has given David a new understanding of how faculty can work together across disciplines to support each other.

“It might be something like engaging with somebody in English to facilitate the best technical writing,” he said. “It might be utilizing an animation professor to bring a scientific process to light in a new way for educational purposes. I think there are lots of ways that people can work across the STEM/non-STEM aisle and help each other. And I think these tools have brought that into specific light for me in a new way.”

Extending your university’s impact

In an environment where institutions compete for talent and funding, raising the profile of a university is crucial, and research analytics can help build that visibility. Using tools such as SciVal and Pure, David has been able to support his institution’s ability to differentiate itself from other universities in its contribution to the UN Sustainable Development Goals (SDGs).

“We can differentiate ourselves from other institutions, who either may not be so focused on those SDGs or may not have the insight to know how they’re contributing.”

Universities as tools of diplomacy

By Prof L Rafael Reif | President Emeritus, MIT

© Xavier Lorenzo/Moment via Getty Images

Prof L Rafael Reif, PhD, is President Emeritus and Ray and Maria Stata Professor of Electrical Engineering and Computer Science at MIT.

Before we block academic exchange for the sake of national security, we must ask ourselves: “What do we risk by not engaging?”

While the parallels are, of course, imperfect, even during the most fraught days of the Cold War, the governments of the United States and the Soviet Union saw reasons to cooperate in academic science. The Lacy-Zaroubin Agreement of 1958 tasked the US National Academy of Sciences and the Soviet Academy of Sciences with promoting a broad series of faculty visits. In the 1960s, the two academies also began sponsoring joint workshops on leading-edge scientific topics. In 1977, a National Academy of Sciences panel led by MIT economist Carl Kaysen found that the program had been a “striking, even spectacular” success in connecting the two countries’ scientists, in giving the US insights into Soviet capabilities in science and technology, and in improving relations between the two superpowers.

Given the brinkmanship last August on the part of the US government over renewing the landmark US-China Science and Technology Cooperation Agreement, it appears that the value of such scientific diplomacy has been somewhat forgotten. The agreement, which dates back to 1979, commits each country to promoting contacts between their people and organizations and clears the way for joint research and the exchange of scientists and students. Just days before it was set to expire on August 27, 2023, the Biden administration extended it by six months, and some lawmakers on Capitol Hill have expressed a desire to see it lapse.

I consider this extremely short-sighted. Past experience shows that scientific exchanges can be helpful in preventing tense relations between nations from going off the rails.

It is not only history that convinces me that scientific cooperation is a powerful way to generate open-mindedness, patience, and fellow feeling between the citizens of rival nations — I am also drawing on personal observation.

I have seen it often, at Stanford University, where I did my graduate studies, and at MIT, where I became a faculty member in 1980: Once faculty and students from countries with long-standing animosities begin working together for a higher cause, they can often overcome their cultural biases and learn to respect each other as peers. 

Please allow me to tell you a bit about my background to explain why I feel so strongly about the value of bringing people from different cultures together to learn from each other, to explore, and to solve problems.

Putting the “competitive” into research intelligence

With Connie Stovall | Director for Research Impact and Intelligence, Virginia Tech

Woman soccer goalie diving to stop the ball (© istock.com/vm)

Connie Stovall shows how her institution is using research intelligence to amplify its impact — and find the best collaborators

When it comes to research intelligence, Connie Stovall, Director for Research Impact and Intelligence at Virginia Tech, thinks in terms of competition and collaboration.

“I refer to my work as competitive intelligence because I’m looking at the market and research landscape,” she explains. “I’m looking at what other institutions are doing — not just scholarly output, but what grants they’re applying for, who they’ve worked with in the past.

“It’s not just about understanding the competition. It’s about understanding who might be a good collaborator.”

Many uses for research intelligence

However one describes the discipline, research intelligence has become an increasingly important tool for universities and individual researchers looking to make sense of our increasingly complex world:

  • It’s a way of honing grant applications for the greatest chance of success.
  • It can be used to foster inclusion and diversity in the world of research.
  • It’s also a component of research integrity, helping research institutions fill gaps in their expertise by finding the right research partners.
  • It can provide a way to track impact on initiatives such as the UN’s Sustainable Development Goals (SDGs).
  • And it can bolster the impact of research.

As Connie explains:

We have researchers who are using artificial intelligence along with other tools, and they’re working in a multidisciplinary fashion with people in medicine to predict pandemics. Using SciVal, I can map what other institutions are doing, what grants they’ve won, and review all these grants that have already been awarded to understand who was either a competitor or might be a good collaborator to fill the gap.

Finding collaborators

For people looking to expand their use of research intelligence tools, Connie has some advice:

I like the new tool in Scopus — Researcher Discovery — where you can put a particular researcher in (the search), and it gives you a list of similar researchers. That’s a great way of expanding your pool of collaborators beyond the obvious and even to find potential new hires.

Watch a video about how to use Scopus Researcher Discovery.

Connie Stovall is Director for Research Impact and Intelligence at Virginia Tech.

Watch Connie Stovall's webinar on using research analytics tools to conduct global competitive intelligence on water research.

What determines whether a research article is cited in policy?

By Linda Willems

Researcher Basil Mahfouz gives a presentation on "using cutting-edge methods to track the societal impact of research."

Researcher Basil Mahfouz hopes the answer will help him design a powerful tool for policymakers

How do governments select the research evidence they use to guide their policymaking — particularly when responding to a global crisis?

The obvious answer is that they draw their inspiration from current or highly cited publications. However, a new study by UCL (University College London) researcher Basil Mahfouz suggests the process is not so clear-cut.

Basil is a third-year PhD student in UCL’s Department of Science, Technology, Engineering and Public Policy (STEaPP). His PhD project is supported by Elsevier’s International Center for the Study of Research (ICSR) and explores how research impacts society.

As part of that work, Basil recently conducted a case study with his PhD supervisors, Prof Sir Geoff Mulgan and Prof Licia Capra, tracking the impact of research on education policy during COVID-19. As Basil explained:

The pandemic resulted in measures, such as school closures, that disrupted learning for more than 1.5 billion students. And education policymakers worldwide discussed these measures in thousands of policy documents, most of which referenced academic research.

He added:

There were 450,000 scholarly papers published about COVID-19 between March 2020 and December 2022. To put that in context, that’s almost as many as all the research papers on climate change ever published. So, we were interested to see how well education policymakers took advantage of that vast amount of literature.

What they discovered surprised them.

Key findings

  • Policymakers cited “older” research papers more often than newer findings.
  • The policies were more likely to cite recent medical research than recent education research.
  • Policymakers were more likely to cite sources from their own countries.
  • There was a weak relationship between inferred research excellence and citations in policy.

Combining data to gain new insights

For the project, Basil worked with Elsevier’s ICSR Lab, a cloud-based computational platform that researchers can use to analyze large structured datasets, including those that power Elsevier solutions such as Scopus and PlumX Metrics. He also drew on a partnership with Overton, a global database of policy documents and their citations.
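
To make the workflow concrete, here is a minimal sketch of the kind of join such an analysis involves: linking policy-citation records (of the sort Overton provides) to publication metadata (of the sort Scopus provides) to see how old the cited research is. The column names and sample rows are hypothetical, not the study’s actual schema.

```python
# Minimal sketch: join policy citations to publication metadata and
# measure how old the research cited in policy documents is.
# All column names and sample rows are hypothetical.
import pandas as pd

# Overton-style records: one row per (policy document, cited paper) pair
policy_citations = pd.DataFrame({
    "policy_id": ["P1", "P1", "P2", "P3"],
    "doi": ["10.1/a", "10.1/b", "10.1/a", "10.1/c"],
})

# Scopus-style metadata: publication year for each cited paper
publications = pd.DataFrame({
    "doi": ["10.1/a", "10.1/b", "10.1/c"],
    "pub_year": [2012, 2021, 2016],
})

# Link the two datasets on DOI
cited = policy_citations.merge(publications, on="doi", how="inner")

# Share of policy-cited papers published before 2020
share_pre_2020 = (cited["pub_year"] < 2020).mean()
print(f"Cited papers published before 2020: {share_pre_2020:.0%}")
```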

Next step — building a new tool for policymakers

Basil aims to continue refining the model he’s built for this case study: for example, by adding a comparative analysis between countries and continuing to optimize the parameters. He also plans to run new case studies on key policy topics such as climate change.

“Each year, government agencies globally invest around $600 billion into research. That’s public money meant to help improve the research ecosystem and also benefit the public. But what are they funding? How are decisions made? A tool like this would provide them with data to guide decisions about where to invest for more evidence-based public policy.”
Basil Mahfouz, PhD researcher, UCL

Researcher Basil Mahfouz of UCL

Basil and his co-authors focused on policies issued between March 2020 and December 2022 that contained recommendations or comments on COVID-19 measures for educational institutions. They found that more than 75% of the peer-reviewed papers cited in these policies were published prior to 2020. Using natural language processing, they also established that for 48% of those older research papers, there were newer articles available with comparable abstracts. Basil pointed out: “The average was 70 new papers for every old paper, although it’s worth noting that some papers had a lot while others had very few.”
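
The “comparable abstracts” step hinges on measuring textual similarity at scale. The study’s exact NLP pipeline isn’t described here, so the sketch below uses TF-IDF vectors and cosine similarity as one plausible approach; the sample abstracts and the similarity threshold are illustrative assumptions, not the authors’ method.

```python
# Illustrative sketch: for each older cited paper, count newer papers
# whose abstracts are similar enough to call "comparable".
# The abstracts and threshold below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

old_abstracts = [
    "School closures during the pandemic disrupted student learning.",
]
new_abstracts = [
    "Effects of pandemic school closures on student learning outcomes.",
    "A survey of reinforcement learning for robotic control.",
]

# Embed all abstracts in a shared TF-IDF space
vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(old_abstracts + new_abstracts)
old_vecs = matrix[: len(old_abstracts)]
new_vecs = matrix[len(old_abstracts):]

# Pairwise similarity between each old and each new abstract
similarity = cosine_similarity(old_vecs, new_vecs)

THRESHOLD = 0.5  # hypothetical cut-off for "comparable"
for i, row in enumerate(similarity):
    n_comparable = int((row >= THRESHOLD).sum())
    print(f"Old paper {i}: {n_comparable} comparable newer paper(s)")
```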

Related stories

Open access research that aims to change the world
Researchers reveal why they choose to publish open access

Expanding the research of research
4 ways universities are using research and scholarship to help their communities and the world

Research impact assessment: a mandate for change
A new survey of the research community shows strong support for assessing research by its real-world impact