Leiden Ranking Reveals Changing Research Geographies


Now in its eighteenth consecutive edition, the CWTS Leiden Ranking was released on 29 October, covering 1,594 universities: 88 more than last year, and 418 more than the 1,176 covered in 2020.

This release also includes the new Open Edition, containing 2,831 universities. The Open Edition comprises 39 Australian universities, while 35 appear in the original ranking (now renamed the Traditional Edition).

Leiden has now joined THE and QS in producing mega rankings year-on-year.

The Leiden Rankings are produced by the Centre for Science and Technology Studies (CWTS) at Leiden University in the Netherlands. They derive a set of scientific impact, collaboration, and open access measures, but do not produce an overall rank.

In this commentary, I focus on some methodological nuances of both editions, noting variations in geographical coverage. I also highlight some opportunities and concerns for our universities, particularly in light of the extent to which open access is being embraced globally.

I conclude with some observations on the development of the Australian Research Council’s research evaluation capabilities model, which could further add to the administrative burden in universities.

Context and background

First published in 2008, the Leiden Ranking covered the 500 largest and most research-intensive universities in Europe and the wider world. Over the years, it has expanded considerably in geographical coverage. This year’s Traditional Edition includes universities across 77 countries and the Open Edition across 120 countries.

Leiden relies entirely on bibliometric measures and, unlike QS or THE, does not depend on reputation surveys or data submitted by institutions. The Traditional Edition relies on data from Clarivate’s Science Citation Index Expanded, the Social Science Citation Index, and the Arts & Humanities Citation Index. The Open Edition relies on bibliographic data from the OpenAlex database.

This year’s editions cover publications in the period from 2020 to 2023.

Both editions provide details at the institutional level but also for five main fields of science. The Leiden rankings are, like any other global ranking, a minefield of benefits: just about every university can extract a useful measure to boast about.

However, it is important to note that the Leiden Ranking is not intended to attract the attention of students; it is intended for policy makers, researchers, and research and institutional planners. It also only covers publications written in English, which inherently favours research aligned with the dominant Western academic and publishing culture.

The underlying data used in the ranking are made available so any user can analyse the results in depth. The data file structure has remained stable, making it easy to retrieve results, although file sizes have increased significantly as more years and institutions are added. The Traditional Edition file is 204.8 MB, compared with 3.2 MB for the 2013 edition; the Open Edition file is 589.2 MB and contains 15 years’ worth of data.
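Because the data file is openly published, slicing it takes only a few lines of, say, pandas. The sketch below uses made-up rows and a hypothetical filename, since the exact sheet and column names vary by edition; it shows the kind of country-level filtering and ranking discussed in this piece, not the official schema.

```python
import pandas as pd

# Illustrative rows mimicking the kind of fields in the Leiden data file:
# university, country, publication window, publication count (P), and
# mean normalised citation score (MNCS). All values here are invented.
rows = [
    {"University": "University A", "Country": "Australia",
     "Period": "2020-2023", "P": 15000, "MNCS": 1.20},
    {"University": "University B", "Country": "Australia",
     "Period": "2020-2023", "P": 13000, "MNCS": 1.35},
    {"University": "University C", "Country": "India",
     "Period": "2020-2023", "P": 9000, "MNCS": 0.95},
]
df = pd.DataFrame(rows)

# In practice you would read the downloaded spreadsheet instead, e.g.:
# df = pd.read_excel("leiden_ranking_open_edition.xlsx")  # hypothetical filename

# Filter to one country and rank institutions by publication volume.
aus = df[df["Country"] == "Australia"].sort_values("P", ascending=False)
print(aus[["University", "P", "MNCS"]].to_string(index=False))
```

The same pattern extends naturally to grouping by country or region to reproduce the kinds of shares quoted in the next section.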

Changing research geographies

The Leiden Ranking results provide a snapshot of how the knowledge production landscape is rapidly changing and show the increasing influence of open access on research endeavours. In the Traditional Edition, 47% (or 738) of the universities are from Europe and North America, compared to 41% (or 1,163) in the Open Edition.

In contrast, 44% of universities are drawn from Asia in both the Traditional (692) and Open (1,245) Editions.

All other world regions combined account for 9% (or 150) of universities in the Traditional Edition but 15% (or 423) in the Open Edition.

Open access deepening shifts

Let me focus on a comparison between institutions that appear in both editions (1,580 globally and 35 in Australia).

As expected, the Open Edition includes more publications than the Traditional Edition.

Among the world’s top 20 countries in scholarly outputs, Australia is eighth in both the Traditional and Open Editions. Australian universities have 21% more publications counted in the Open Edition than in the Traditional Edition. Australia is on par with the United States, Japan, and many other liberal economies, but there is a need for Australian universities to increase their uptake of open access publishing.

By contrast, India ranks tenth in knowledge production in the Traditional Edition but fifth in the Open Edition. Indian universities have 63% more publications in the Open Edition than in the Traditional Edition.

The world regions experiencing rapid growth in open access publishing are South-eastern Asia and Southern Asia, along with Northern Africa and Sub-Saharan Africa, albeit at lower volumes.

Universities from Australia, Europe, and North America are already well covered in Clarivate’s citation indexes, given their focus on publishing in high-impact, English-language journals.

From an economic perspective, it is not surprising that universities from low-income and middle-income countries have greater visibility in open access publications than in the subscription-based databases of Clarivate and Elsevier. Open access is based on the principle of publicly available information, allowing easy access to research discoveries. Therefore, low-income and middle-income countries are increasingly favouring open access over subscription-based models.

This contrasts with universities from Australia, the United States, the United Kingdom, and other liberal economies, which have constructed their research infrastructure in alignment with the international fee-for-service bibliometric conglomerates.

Australian standouts

It would be remiss of me not to highlight the performance of Australia’s leading universities.

However, the results from the Leiden rankings reinforce the view that we have a multi-tier system of universities in Australia. The group of five research-intensive universities (Melbourne, Monash, Queensland, Sydney, and UNSW) has the greatest annual volume of publications (between 13,003 and 15,085), compared to the other three universities in the Go8 (ANU, UWA, and Adelaide). The group of five also has the greatest share of total revenue.

The trajectory of various ATN universities suggests that they could overtake the latter three in publication volume a few years down the road. This middle group of institutions is shaped by the relative size of their student populations and their relative share of total revenue.

The other striking observation is that several of the Dawkins-era universities are showing higher rates of citation impact than the sandstone or traditional universities, in part driven by their areas of research specialisation. For example, Swinburne, Charles Darwin, Edith Cowan, and Southern Queensland rank in the top 50 globally on mean normalised citation score in both the Traditional and Open Editions.

Although the mean normalised citation score for Australian universities does not vary much between the Open and Traditional Editions, sandstone and pre-Dawkins universities rank lower in the Open Edition than in the Traditional Edition. This may be influenced by their uptake of open access publishing and by the fields in which their research investment is concentrated.

It will be interesting to see whether the institutional results of the Traditional and Open Editions continue to evolve over the next few years. Universities and national systems that make strategic decisions and maintain a clear view of long-term improvement are likely to be the primary beneficiaries.

Australia’s dormant research evaluation

A few months ago, the Australian Research Council (ARC) released a consultation draft for developing a new, more flexible ‘research insights capability’ model, after the previous national system of evaluation (Excellence in Research for Australia, or ERA) was discontinued.

ERA lived in the shadow of global rankings, and it was an exercise in academic rigour and debate. However, ERA was a costly exercise for all involved, with no rewards, no funding, and no appeal to students or researchers; beyond academia and research management, ERA made little sense.

The ARC needs to ponder its best path forward in building a new model for research evaluation. It may not end up developing one at all, as there is already a plethora of products and services that do so.

Australia’s higher education statistical data collections have served us well over the years. We need to ensure these are strengthened and are fit for purpose for Australian universities to remain relevant and competitive domestically and internationally.

The ARC may end up simply building a framework for universities to assess their overall performance. In doing so, it needs to assist universities in demonstrating how they contribute to society and the economy.

As there is no single database that will satisfy policy makers and university leaders, universities could well use a combination of resources to meet the ARC’s requirements.

The ARC has said that it is seeking new potential data sources and building AI capabilities. Australian universities are already well covered by the major bibliometric conglomerates, Clarivate and Elsevier, and are continuing to expand open access publishing.

Therefore, a new evaluation model will require new or redirected investment. Whatever form or shape the new research evaluation model takes, government and institutional funding will be needed. This prompts me to ask: for what purpose?

We need to develop an education and research evaluation framework that encompasses all aspects of university activities and that makes the holistic link across all areas.

Angel Calderon is Director, Strategic Insights at RMIT University.

