News

Scientists at odds on Utrecht University reforms to hiring and promotion criteria

Not everyone wants to let the journal impact factor go.


9 August 2021

Dalmeet Singh Chawla

Credit: sorbetto/Getty Images

Academia in the Netherlands has erupted in disagreement over the move by a Dutch university to stop considering journal impact factors when deciding whether to hire or promote academics.

In June, Nature reported that Utrecht University had formally abandoned the journal impact factor (JIF) when making decisions on hiring or promoting staff. The JIF was initially developed to help librarians choose which journals to subscribe to, but researchers have repeatedly raised the alarm about it being misused to judge the calibre of individual researchers and research papers.

Researchers based in the Netherlands — including Nobel Prize-winning organic chemist Bernard Feringa, former education secretary Ronald Plasterk and 170 others — penned a critical open letter arguing that Utrecht’s move will result in more randomness and arbitrariness in promotions and hiring.

Raymond Poot, a cell biologist at Erasmus University Medical Center in Rotterdam, who co-authored the letter, argues for continued use of metrics such as the JIF alongside other factors when judging researchers. “We think the JIF is an imperfect but nevertheless useful metric,” he says. He wants Utrecht to reverse its decision.

Under Utrecht’s new policy, every department will judge its scholars by measures such as their dedication to teamwork, level of public engagement, leadership and the extent to which they practice open science.

But Poot argues that these factors are not scientific, but political, and are difficult to measure and use fairly when comparing scientists.

“Serious negative consequences”

“We are concerned that Utrecht’s new ‘recognition and rewards’ system will lead to randomness and a compromising of scientific quality, which will have serious consequences for the recognition and evaluation of Dutch scientists,” Poot and Willem Mulder, a professor of precision medicine at Radboud University Medical Center and Eindhoven University of Technology, write in their letter, a translated version of which has appeared in Times Higher Education.

“In particular, it will have negative consequences for young scientists, who will no longer be able to compete internationally,” they add. For the research track of the medical and life sciences, internationally recognized and measurable criteria must be paramount, they argue.

Stephen Curry, a structural biologist at Imperial College London in the United Kingdom and chair of the Declaration on Research Assessment (DORA), which advocates for bias-free and efficient alternatives to evaluate research, says Poot and Mulder’s letter is out of touch.

“Impact factors introduce many perverse incentives into research assessment,” Curry says. “They are contributing to a huge rise in stress and mental ill health among researchers, they incentivize fraud, [and] they delay the progress and publishing of science.”

More than 20,000 individuals and organizations from 148 countries, including several Dutch universities, have signed up to DORA.

Judith de Haan, program manager of open science at Utrecht, says that in addition to research, her university will focus on other academic activities, such as teaching and public engagement. She notes that researchers will still be able to point to their publications in high-impact journals in so-called narrative CVs, as long as they explain why the work is important. “It’s not easy, but the alternative to do it with a flawed measure like the journal impact factor is not what we want, either.”

“Change is always accompanied by uncertainty and resistance, especially when the stakes are high,” adds Sarah de Rijcke, a professor of science and evaluation studies and director of the Centre for Science & Technology Studies at Leiden University in the Netherlands.

“I think the real issue is not arbitrariness. It is that we should all unlearn to use unhelpful shortcuts and proxies, and re-learn how to undertake in-depth, contextual evaluation. And funders can help by not being overly bureaucratic, by taking more care to train evaluators, and by providing clear guidelines for researchers and evaluators.”

Out of the 172 researchers who signed Poot and Mulder’s letter, 143 are professors. Critics suggest that many of the signatories are bound to favour the old system, because it helped them get where they are today.

While some early-career scientists have voiced support for Poot and Mulder’s objections, many more have opposed them in a separate open letter, saying that Utrecht is right to move away from using impact factors to evaluate researchers.

Their letter, signed by 383 scientists at the time of writing, asserts that as other ways of assessing high-quality science gain recognition, a favourable bibliometric indicator is no longer needed as a universal sign of excellence.

Need for more transparency

Nearly 70% of respondents to the annual Responsible Metrics State of the Art Survey, released in April on The Bibliomagician blog, said they had at least considered developing a set of principles to guide the responsible use of metrics, up from 50% in the previous year’s survey.

Of the 139 respondents, 90% work at universities and research institutions. Commenting on the results, Nicholas Robinson-Garcia, a social scientist at the University of Granada in Spain who conducted the survey and specialises in bibliometrics, said that in the five years the survey has been running, institutions seemed to have progressed from gaining awareness of the issue to considering whether or how to implement responsible metrics use.

“DORA adoption continues to increase, but professionals indicate an ambivalent response from academic communities to these policies,” the survey report says. “Many of the malpractices surrounding the use of metrics seem to be quite entrenched in academic culture.”

Robinson-Garcia is concerned that widespread adoption of narrative CVs may not be a step forward. “The reason why metrics were introduced originally was actually to remove this kind of subjective thing,” he says.

Tokameh Mahmoudi, a biochemist at Erasmus University Medical Center and a co-signatory of Poot and Mulder’s letter, worries that a lack of transparency in how researchers will be judged could lead to a rise in nepotism and a drop in the quality of Dutch science.

“I do, of course, agree that this is an imperfect system that needs to be improved,” Mahmoudi says. “But doing away with [impact factors] completely without clarifying exactly how we’re going to be assessing researchers and research is uncomfortable for me. I prefer to have more transparency there.”