Albarrán, P., C. Herrero, J. Ruiz-Castillo and A. Villar
Journal of Informetrics, 11(2) (2017), 625–640.

Keywords: Citation impact; Additive rules; Tournaments; Relative research performance; Worth


Abstract: This paper focuses on the evaluation of research institutions in terms of size-independent indicators. There are well-known procedures in this context, such as what we call additive rules, which provide an evaluation of the impact of any research unit in a scientific field based upon a partition of the field citations into ordered categories, along with some external weighting system to weight those categories. We introduce here a new ranking procedure that is not an additive rule – the HV procedure, after Herrero & Villar (2013) – and compare it with those conventional evaluation rules within a common setting. Given a set of ordered categories, the HV procedure measures the performance of the different research units in terms of the relative probability of getting more citations. The HV method also provides a complete, transitive and cardinal evaluation, without resorting to any external weighting scheme. Using a large dataset of publications in 22 scientific fields assigned to 40 countries, we compare the performance of several additive rules – the Relative Citation Rate, four percentile-based ranking procedures, and two average-based high-impact indicators – and the corresponding HV procedures under the same set of ordered categories. Comparisons take into account re-rankings and differences in outcome variability, measured by the coefficient of variation, the range, and the ratio between the maximum and minimum index values.
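To make the contrast concrete, here is a minimal Python sketch of the two kinds of evaluation described above, on hypothetical data. The category distributions, the external weights, the tie-handling convention (ties counted as one half), and the characterisation of the HV worth as the dominant eigenvector of the pairwise win-probability matrix are all illustrative assumptions for this sketch; the precise definition of the worth is given in Herrero & Villar (2013).

```python
import numpy as np

# Hypothetical data: fractions of each unit's publications falling in
# four ordered citation categories (lowest to highest). Illustrative
# numbers, not taken from the paper's dataset.
units = {
    "A": np.array([0.50, 0.30, 0.15, 0.05]),
    "B": np.array([0.40, 0.35, 0.20, 0.05]),
    "C": np.array([0.60, 0.25, 0.10, 0.05]),
}

def additive_score(dist, weights):
    """Additive rule: apply an external weighting scheme to the ordered
    categories and sum, as in percentile-based ranking procedures."""
    return float(dist @ weights)

def win_probability(p, q):
    """Probability that a random paper drawn from p lands in a strictly
    higher category than one drawn from q, counting ties as one half
    (an assumed convention for this sketch)."""
    below = np.concatenate(([0.0], np.cumsum(q)[:-1]))  # P(q strictly below k)
    return float(p @ below) + 0.5 * float(p @ q)        # p @ q = P(tie)

def hv_worth(dists):
    """HV-style worth, sketched as the dominant eigenvector of the
    pairwise win-probability matrix, normalised to sum to one."""
    names = list(dists)
    P = np.array([[win_probability(dists[a], dists[b]) for b in names]
                  for a in names])
    vals, vecs = np.linalg.eig(P)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return dict(zip(names, w / w.sum()))

weights = np.array([0.0, 1.0, 2.0, 4.0])  # illustrative external weights
for name, dist in units.items():
    print(f"additive score {name}: {additive_score(dist, weights):.3f}")
print("HV worths:", hv_worth(units))
```

Note that the additive scores change if the external weights change, whereas the HV worths depend only on the category distributions. Because the worths are cardinal, ratios such as the maximum-to-minimum index value – one of the variability measures used in the comparisons above – can be read off directly.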