Australian Catholic University is the nation's top-ranked institution for the proportion of its research that is highly cited, placing 20th in the world, followed by the University of Technology Sydney (UTS) in 36th place, according to the world's first fully transparent ranking system.
The release of what is believed to be the world's first fully reproducible, transparent ranking system, by Leiden University's Centre for Science and Technology Studies (CWTS), sets new benchmarks for Australian universities.
A total of 17.4% of ACU publications released between 2018 and 2021 ranked in the top 10% worldwide for citations per paper, with 16% of UTS papers meeting the same highly cited threshold.
The metric is just one of several ratings within the new CWTS Leiden Ranking Open Edition framework, which is destined to generate new discussions about university performance and the use of rankings in university strategies and Vice-Chancellors' performance plans.
Many of the Open Edition's results are similar to those of Leiden's traditional ranking, which is based on Clarivate's Web of Science database. In contrast, the new CWTS Leiden Ranking Open Edition relies on data from the OpenAlex database and provides a multidimensional view of university performance – with indices across collaboration, scientific impact and open access.
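Because OpenAlex exposes its data through a public API, the ranking's inputs can be checked independently. As a rough illustration only, the Python sketch below counts an institution's 2018-2021 publications; the institution ID is a placeholder, and the actual Leiden methodology applies further filtering (publication types, fractional counting) that is not shown here.

```python
# Illustrative query against the public OpenAlex API (docs.openalex.org):
# count one institution's works published between 2018 and 2021.
# "I1234567" is a placeholder ID, not a real institution; look up real IDs
# via the /institutions endpoint.
import requests

resp = requests.get(
    "https://api.openalex.org/works",
    params={
        "filter": (
            "institutions.id:I1234567,"
            "from_publication_date:2018-01-01,"
            "to_publication_date:2021-12-31"
        ),
        "per-page": 1,  # we only need the metadata, not the records
    },
)
print(resp.json()["meta"]["count"])  # total matching publications
```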
Several rankings result in a familiar pecking order for Australian universities – with the University of Melbourne ranking 23rd and 24th in the world on scientific impact measures based on the volume of publications it produces. Other indices within the Open Edition framework prompt a fresh consideration of university performance.
The new approach, developed by CWTS in collaboration with the Curtin Open Knowledge Initiative (COKI) at Australia's Curtin University, enables universities to compare their performance against multiple metrics, opening the door to new conversations and setting a new benchmark for rankings integrity.
CWTS Director Ludo Waltman said the new rankings were a pilot project for an open-data-based rankings system that would ultimately replace the existing Leiden Ranking. CWTS has committed to developing systems based on open data to improve the integrity and relevance of university performance metrics into the future.
This follows long-standing criticism of other ranking systems for their reliance on metrics, such as peer perception, that can be manipulated by well-resourced universities.
“This new ranking offers a level of transparency that has never been available before,” Professor Waltman said.
“It gives us the opportunity to understand what the metrics, the statistics really mean.”
The Leiden Ranking Open Edition is a collection of multiple rankings, in which institutions are ranked not just by the sheer volume of their outputs but also by the proportion of those outputs that are high quality.
CWTS has resisted the impulse to compile these metrics into a single, overarching index, arguing that universities should be diverse, with performance evaluated against their strategic goals, rather than all being forced to aspire to the same metrics.
For example, a university seeking to produce a large volume of research publications, regardless of how much those publications are read or cited by others, could measure its output in terms of publication numbers, whereas an institution focusing on quality and impact could use the proportional measure, as the sketch below illustrates.
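To make the contrast concrete, here is a minimal, illustrative Python sketch of the two measures, assuming each publication already carries a field-normalised citation percentile. The real CWTS indicators are computed from full citation distributions with fractional counting, which this toy example does not attempt.

```python
# Toy contrast between the two measures discussed above: a raw count of
# highly cited publications versus the proportion of output that is highly
# cited. Percentile values are illustrative placeholders only.

def top10_measures(citation_percentiles):
    """Return (count, proportion) of publications in the top 10% most cited.

    citation_percentiles: one value per publication, where 95.0 means the
    paper is more cited than 95% of comparable papers in its field.
    """
    top = [p for p in citation_percentiles if p >= 90.0]
    count = len(top)
    proportion = count / len(citation_percentiles) if citation_percentiles else 0.0
    return count, proportion

# A large university can win on volume while a smaller one wins on proportion.
big = [88.0] * 9000 + [95.0] * 1000    # 10,000 papers, 1,000 highly cited
small = [88.0] * 830 + [95.0] * 170    # 1,000 papers, 170 highly cited

print(top10_measures(big))    # (1000, 0.10) -> leads on the count
print(top10_measures(small))  # (170, 0.17)  -> leads on the proportion
```

The larger institution dominates the count while the smaller one leads on the proportion – exactly the distinction the Open Edition keeps visible by publishing both.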
The Open Edition rankings appear likely to prompt a range of discussions, including whether universities with lower volumes of research can be compared with larger institutions, and whether high citation rates are a fair or valid proxy for research quality.
They are likely to be closely scrutinised by funders and regulators, raising questions about research quality and impact, the efficacy of collaborations, and commitment to open access.
The ranking provides the opportunity to analyse the performance of universities in a range of ways, using interactive tables that allow results to be filtered by country, field and time period.
Many parts of the ranking are published with confidence intervals for each data point, helping analysts to better understand statistical anomalies and identify areas where data quality could be improved.
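For analysts working outside the web interface, the same slicing can be done on a local copy of the published data. The sketch below is illustrative only: the file name and column names are assumptions for this example, not the actual Open Edition schema, which should be checked against the published data files.

```python
# Illustrative sketch of the filtering the interactive tables support,
# applied to a hypothetical local export of the ranking data. File name
# and column names are assumptions, not the real Open Edition schema.
import pandas as pd

df = pd.read_csv("leiden_ranking_open_edition.csv")  # hypothetical export

# Filter by country, field and time period, mirroring the web interface.
subset = df[
    (df["country"] == "Australia")
    & (df["field"] == "All sciences")
    & (df["period"] == "2018-2021")
]

# Rank institutions by the proportion of top-10% publications, keeping the
# reported interval bounds alongside the point estimate.
cols = ["institution", "pp_top10", "pp_top10_lower", "pp_top10_upper"]
print(subset.sort_values("pp_top10", ascending=False)[cols].head(10))
```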
“Ultimately this is a starting point for a transition we are going through, where we all become more used to not just blindly giving certain imputations to certain metrics,” Professor Waltman said.
Professor Waltman said universities that cite Leiden rankings as evidence of being among the world's leading institutions appeared to be ignorant of what the rankings really meant.
The Open Edition Rankings will help trigger new conversations about the meaning of rankings and how rankings data could be better used to measure university performance in ways that align with institutional strategy.
“If you are in the top 10 of a particular dimension that really matters to you (such as collaboration), it’s perfectly fine to advertise that; but pay attention – it should really matter to you and align with your particular mission,” Professor Waltman said.
“The dynamics of the reputation game will kind of change and I expect that ultimately we will push people into thinking more carefully about what they have interpreted,” he said.