There was euphoria at the University of Melbourne as it jumped from 33rd to 14th in the world in the QS 2024 World University Rankings, with champagne corks also popping among peers at the University of Sydney, which rose 22 places to equal 19th.
Uni Sydney shares the position with its auld enemy, UNSW, which improved even more (rising from 45th).
The changes arose from alterations to the QS ranking formula, which increased the weighting for sustainability, international research networks and employment outcomes, thereby diluting the emphasis on traditional metrics such as student-teacher ratios and academic reputation. While this technically means the 2024 rankings cannot be compared with those of 2023, the changes gave numerous Australian universities a one-off bounce up the QS rankings ladder – leading to a flurry of joyful press releases across the sector.
What are rankings measuring?
Curtin University researcher Professor Cameron Neylon, co-lead of the Curtin Open Knowledge Initiative, who consults to UNESCO and key university clusters in Europe and the USA on university rankings and performance, told Future Campus that while everybody loves a good news story, universities attempting to measure success through rankings risked falling behind the rest of the world.
“International rankings are a very narrow and a very biased perspective on what makes a good university, and it’s strategically poor planning to put them at the centre of what your institution should be doing,” Professor Neylon said.
“A lot of the research performance (evaluations in rankings) is based on the count of citations, which is all very well, but isn’t it actually more interesting to look at the diversity of where those citations are coming from? If it’s just the same person citing you over and over again, that’s just people patting each other on the back, but if you are reaching a wider number of people, that would be a more interesting exercise.
“The big questions about what is a university for in society, what is our level of public engagement, how many members of the public are coming on campus or coming on campus in a digital sense to engage in the work we do or tell us what work we should be doing, those are things that are really missing, and are things that some Australian universities do really well.”
The COKI team have developed one of the world’s leading databases for evaluating university performance and are increasingly being asked to advise on performance measurement systems focused on qualitative, rather than just quantitative, measures. Quantitative systems have also been developed to mine data from a far wider range of sources – providing far more detailed insights into university performance across a range of metrics.
Higher rankings make good headlines
Despite the advances in measurement capability, widespread acknowledgement of the limitations of rankings and the impact of changes in methodology, the rankings news drew glowing statements from many universities.
The University of Sydney was quick to acknowledge UNSW and pitch the result as a win for all in the emerald city. “It affirms Sydney’s reputation as being a great global city for higher education,” VC Mark Scott said. UNSW VC Attila Brungs was equally upbeat, saying the result, “enhances our power to facilitate innovative, world-leading research and provide education that transforms lives.”
All of the Group of Eight are in the QS global top 100, joined by UTS at 90.
Similar results on Leiden List
The new Leiden research ranking reaches similar conclusions to QS, with the top ten in Australia being the Go8 – Uni Melbourne in its usual top spot, followed by Uni Sydney, Uni Queensland, Monash U, UNSW, UWA, Uni Adelaide and ANU. The other two in the local top ten are Deakin U and Curtin U. Five of them make the world top 100 – Uni Melbourne (28), Uni Sydney (36), Uni Queensland (48), Monash U (49), UNSW (52).