The Italian system has assigned a key role to bibliometric indicators in the hiring and promotion of professors. The reform suddenly turned these indicators into crucial gateways to career advancement, while leaving ample room for the strategic use of citations. The striking result is that the effects are now visible at national scale and across most scientific fields. The Italian scientific miracle is an illusion. (Article published in Times Higher Education)

In 2010, the Italian research system was profoundly changed by a landmark university reform.

In the name of efficient use of public money, research funding was cut, the recruitment of researchers was curtailed and a complex evaluation system was introduced to monitor and improve the quality of Italian research.

At the heart of that system are bibliometric indicators. They play a central role not only in the assessment of departments and research centres, but also in the recruitment, promotion and evaluation of individual researchers. For instance, researchers must now undergo a national scientific habilitation that cannot be passed unless their citations, h-index and output exceed the national median in their field.

On the surface, the reform appears to have had the desired effect. Notwithstanding the reduction in personnel and funding, Italian science achieved a kind of miracle, increasing both the volume and citation impact of its publications. On the latter measure, Italy overtook the US in 2012, rising to second place behind the UK among the G8 countries, according to Elsevier’s SciVal Analytics. In a 2016 report commissioned by the UK government, Elsevier predicted that, “based on current trajectories, [Italy is] set to overtake the UK in the near future”.

Meanwhile, a Nature editorial in August observed that “although Italy’s spending on research and development – at around 1.3 per cent of its gross domestic product – sits well below the EU average of 2 per cent, its research performance continues to improve”.

And, according to ANVUR, the Italian governmental research evaluation agency, “Italy is an international exemplar in the design and implementation of research evaluation exercises”. Its 2018 biennial report says that the “sophisticated use of multiple indicators” to “deal with the problem of comparison between disciplines” has meant that “in recent years Italian universities have shown a gradual convergence towards higher standards, both in teaching [and] research”.

However, bibliometric-centred policies have a dark side, which is increasingly being brought to light. Indicators invite gaming. Some of the highly cited researchers identified by Clarivate Analytics are, in fact, massive self-citers. And, in Italy, recent studies document that, in some fields, self-citation has increased significantly since the reform.

Our study, “Citation gaming induced by bibliometric evaluation: A country-level comparative analysis”, published in September in Plos One, confirms that the recent sharp rise in Italian citation impact is, in essence, a mirage, generated by a massive national change in citation habits after the 2010 reform.

To illuminate the real situation, we introduce a new indicator called “inwardness”, defined as the ratio between the total number of a country’s self-citations and the total number of citations received by that country in the same time frame. It is able to track not only individual self-citers but also intra-national citation clubs.
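To make the definition concrete, consider a purely illustrative calculation (the figures are invented, not taken from the study): if a country’s publications receive one million citations over a given window, and 200,000 of those citations come from papers authored in that same country, its inwardness is 200,000 / 1,000,000 = 0.20, or 20 per cent. A sustained rise in that percentage, with no comparable rise in other countries, is exactly the signal the indicator is designed to pick up.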

We observe that, after 2009, most scientific fields in Italy saw their inwardness jump: a trend unique among European members of the G10. In 2016, Italy became – both globally and for a large majority of research fields – the country with the second highest inwardness and the second lowest rate of international collaborations, behind only the US.

The likely explanation is that the bibliometric targets have stimulated large numbers of Italian scientists to both increase their level of self-citation and join citation-exchange clubs.

So what are the lessons of this huge case of collective citation doping? It might be claimed that the perverse effects of bibliometric targets could be countered by adopting better indicators. However, excluding self-citations from calculations would just reinforce the incentive to create citation clubs – which, by their nature, are hard to detect.

In the end, there is no escape from Goodhart’s famous maxim that when a measure becomes a target, it ceases to be a good measure. Our results show that any new indicator is doomed to be matched by new forms of strategic adaptation, feeding an endless perverse cycle.

Miracles do not happen. There is no magic wand – bibliometric or otherwise – able to transform the scientific performance of a country while also cutting investment.

Published on 7 October 2019 in Times Higher Education

Print Friendly, PDF & Email
