Leiden Ranking fully discloses method

Has the University of Amsterdam really published that many well-cited papers? If you don’t trust the new Leiden Ranking, you can verify it yourself from now on. The black box of rankings has been opened, says director Ludo Waltman.

For fifteen years now, research centre CWTS has been producing its own world ranking of universities: the Leiden Ranking. It lets you select certain criteria and thereby create alternative versions. The idea behind it: other world rankings reduce the achievements of universities to a single score, but there are all kinds of possible perspectives, and one criterion isn’t necessarily better than another.

Now, the makers have taken things to the next level. They’ve launched an ‘open edition’. This means you can not only choose your own criteria, but also verify the underlying data.

Black box 

The Leiden Ranking used to be a kind of ‘black box’, says Director Ludo Waltman in an explanatory note. But in the Open Edition, the data and algorithms used are public. In principle, anyone can now make their own ranking. Waltman: “Everyone can decide for themselves what they think is important to measure a university’s performance.” 

For the regular Leiden Ranking, which continues to exist alongside the open edition, CWTS uses data from the ‘Web of Science’ database, which isn’t accessible to everyone. The new edition is based on data from OpenAlex, which anyone can download.

This leads to differences between the open edition and the old one. Take, for example, the share of a university’s articles that ranked among the best-cited ten percent in the world in recent years: for the University of Amsterdam, that figure is 14.9 or 15.7 percent, depending on which edition you consult.
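As a rough illustration of what that criterion measures, here is a simplified sketch with made-up numbers; the actual Leiden Ranking indicator normalizes citations by field and publication year and treats ties more carefully:

```python
import numpy as np

# Made-up citation counts: one array for the world corpus as a whole,
# one for a single university's papers from the same period.
rng = np.random.default_rng(0)
world = rng.negative_binomial(1, 0.05, size=100_000)   # skewed, like real citation data
university = rng.negative_binomial(1, 0.04, size=2_000)

# A paper counts as 'top 10%' if it beats the 90th percentile of the world corpus.
threshold = np.percentile(world, 90)
share = (university > threshold).mean()
print(f"Share of papers among the world's best-cited 10%: {share:.1%}")
```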

Shifts 

This means the ‘charts’ shift a little too. In both versions the University of Amsterdam takes the top position, but Erasmus University Rotterdam is in sixth place in the old edition and in third place in the open edition. At least, if you look at this particular criterion. You could also choose the best-cited one percent, or the percentage of open-access publications, for instance.
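The open-access criterion, for instance, can be approximated directly against OpenAlex’s public API, which requires no account or key. A minimal sketch, assuming the top search hit is the right institution and using an arbitrary date range rather than the ranking’s actual window:

```python
import requests

API = "https://api.openalex.org"

def institution_id(name: str) -> str:
    """Look up an institution's OpenAlex ID (takes the top search hit)."""
    r = requests.get(f"{API}/institutions", params={"search": name})
    r.raise_for_status()
    # IDs come back as full URLs, e.g. 'https://openalex.org/I...';
    # the works filter accepts the short form after the last slash.
    return r.json()["results"][0]["id"].rsplit("/", 1)[-1]

def open_access_share(inst_id: str, start: str, end: str) -> float:
    """Share of an institution's works flagged open access, via one group_by query."""
    params = {
        "filter": (f"institutions.id:{inst_id},"
                   f"from_publication_date:{start},to_publication_date:{end}"),
        "group_by": "open_access.is_oa",
    }
    r = requests.get(f"{API}/works", params=params)
    r.raise_for_status()
    counts = {g["key"]: g["count"] for g in r.json()["group_by"]}
    total = sum(counts.values())
    return counts.get("true", 0) / total if total else 0.0

uva = institution_id("University of Amsterdam")
print(f"Open-access share: {open_access_share(uva, '2019-01-01', '2022-12-31'):.1%}")
```

Reproducing the citation-based indicators this way would take more work: setting the world’s top-ten-percent threshold requires the full corpus, which is why OpenAlex also offers its entire dataset as a downloadable snapshot.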

Rankings come in for a lot of criticism. Dutch universities have decided to place less emphasis on their ranking score, if only because the criteria raise questions: how much weight do you assign, for instance, to the reputation of institutions compared to their scientific impact?  

Besides, there’s more to a university than the number of citations. Articles on the earthquakes in Groningen or on Dutch healthcare are less likely to be published in the world’s most prestigious journals, but does that make them less valuable? And if a researcher shifts their focus to education, is that harmful to a university’s position in a world ranking?

Recognise and reward 

In the context of ‘recognise and reward’, the universities want to place more emphasis on the different tasks that staff can carry out: doing research, teaching, spreading knowledge, managing people, etc. This doesn’t gel with focusing on rankings. 

This is why Utrecht University is missing from the ranking of the British magazine Times Higher Education: the university no longer supplies its own data to the makers and is therefore no longer included. But Dutch universities aren’t in complete agreement about this; the other institutions are still participating.
