Aussie Unis Improve in Global Ranking


Twenty-eight Australian universities improved their overall scores in the latest Times Higher Education World University Rankings (THE WUR), released today.

Australian universities improved their scores across 15 metrics, most significantly in teaching reputation and citation impact.

This is a sharp contrast to last year’s results, when 24 declined in overall score. The abysmal results seen last year were largely influenced by THE’s changes to its reputation methodology. In addition, the financial circumstances of Australian universities post-pandemic adversely affected their performance in various per capita measures.

Melbourne remains Australia’s best university in the 2026 edition. Melbourne is ranked 37, up two places from 39 last year, followed by Sydney ranked =53, up eight places from 61 last year.

Monash is ranked third in Australia at =58, unchanged in global rank compared to last year, followed by ANU at =73, which was also unchanged.

UNSW moved ahead of Queensland to rank fifth in Australia. UNSW is ranked 79, up four places from 83 last year, whilst Queensland is ranked =80, down three places from 77 last year.

UTS moved ahead of Western Australia to rank sixth in Australia. UTS is ranked =145, up nine places from last year. Western Australia is ranked 153, down four places from =149 last year, followed by Macquarie at =166, up 12 places.

Overall, Australia retains 10 universities ranked in the top 200. Another 15 are included among the world’s top 400 (i.e. 201-400 band) and seven more are ranked in the 401-500 band. These results demonstrate the homogenisation and lack of institutional diversity in Australia’s higher education system.

Are Aussie unis out of trouble?

While it is pleasing to see the improved performance of Australian universities in this year's THE WUR, it is important to put these results in context.

Last year’s results placed Australian universities in a perilous position: more than 60% of ranked universities experienced a decline, far more than in previous years.

The deteriorating performance recorded last year was influenced by inaccurate figures, which overstated the size of Australia's research workforce. As I noted last year, Australia’s statutory agency provided updated figures to UNESCO, and these were lower than previously reported. The size of the research workforce is used to normalise the reputation votes cast for institutions in each country. The outdated figures had led to Australian universities receiving higher scores than intended, and last year's correction contributed to the decline in their performance.
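To illustrate the mechanism, here is a minimal sketch of one way such a normalisation might work, in which each country's survey responses are re-weighted to match that country's share of the global research workforce. The formula and all figures below (world researcher and response counts, Australian workforce numbers) are illustrative assumptions of mine, not THE's published method.

```python
# Illustrative sketch only: a simplified reading of how reputation votes
# might be re-weighted by a country's share of the global research workforce.
# The formula and all figures below are assumptions, not THE's published method.

def country_vote_weight(researchers: float, responses: float,
                        world_researchers: float, world_responses: float) -> float:
    """Weight applied to each survey response from a country so that the
    weighted responses match the country's share of the research workforce."""
    researcher_share = researchers / world_researchers
    response_share = responses / world_responses
    return researcher_share / response_share

WORLD_RESEARCHERS = 9_000_000   # assumed
WORLD_RESPONSES = 55_000        # assumed
AU_RESPONSES = 1_500            # assumed

# Overstated workforce figure vs a corrected, lower figure reported to UNESCO
overstated = country_vote_weight(120_000, AU_RESPONSES, WORLD_RESEARCHERS, WORLD_RESPONSES)
corrected = country_vote_weight(100_000, AU_RESPONSES, WORLD_RESEARCHERS, WORLD_RESPONSES)

print(f"Weight per Australian response (overstated workforce): {overstated:.3f}")
print(f"Weight per Australian response (corrected workforce):  {corrected:.3f}")
# A lower workforce figure lowers the weight of each Australian response,
# and with it the reputation votes flowing to Australian universities.
```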

This year’s results reflect the financial position of universities in 2023, when 27 Australian universities reported a deficit (net result after income tax) for the second consecutive year, compared with only three reporting a deficit in 2021. Interestingly, 27 out of 42 Australian universities reported a positive net result for 2024.

These results also reflect a more subdued rate of growth in scholarly outputs compared to five or ten years earlier. Universities’ research income is not increasing year-on-year as much as necessary to offset competition from Asia's leading universities.

Although several universities are beginning to stabilise their financial position and leadership, others are currently undergoing structural change. The outcomes of these changes are likely to influence performance in global rankings over the next two to three years.

Furthermore, the growth in enrolments continues to be largely driven by international students, whilst domestic enrolment numbers are flat.

Although this year's THE results are favourable to Australian universities, the results released by QS in June and ARWU in August suggest their performance is declining. The improvement seen in THE, therefore, largely amounts to clawing back ground lost last year to methodological tweaks.

Ranking schemas measure different things and reflect past performance. I believe we are still in the process of recovery, and we are likely to see increased variability in results across global rankings.

Performance by pillar

Let’s briefly see how Australian universities performed on a pillar-by-pillar (or category) basis:

  • Teaching: 29 institutions increased in score and 6 decreased, compared to last year when only 8 institutions increased and 28 decreased in score.

Performance was largely influenced by the teaching component of the reputation survey, which accounts for 15% of the overall score.

This year, Notre Dame improved the most (4.4 weighted points), followed by Sydney (1.8 weighted points) and Western Sydney (1.7 weighted points). Australia’s best performers on this pillar are Melbourne, Sydney, and ANU.

  • Research environment: 17 institutions increased in score and 18 moved down, compared to last year when 12 institutions increased and 24 moved down in score.

This year’s performance was influenced by a combination of higher scores in per capita measures and the research component of the reputation survey.

Deakin and Charles Sturt improved the most (0.6 weighted points each), followed by Southern Cross (0.5 weighted points). Australia’s best performers on this pillar are Melbourne, Sydney, and Monash.

  • Research quality: 21 institutions increased in score and 10 moved down, compared to last year when 21 institutions increased and 15 moved down.

Performance was largely influenced by the citation impact measure, which makes up 15% of the overall score. Several institutions’ performance rested on the measures of research strength, excellence, and influence.

Once again, Charles Sturt improved the most this year (2.8 weighted points), followed by Notre Dame (2.0 weighted points) and James Cook (1.6 weighted points). Australia’s best performers on this pillar are ACU, UTS, and Monash.

  • International outlook: 19 institutions increased in score and 15 moved down, compared to last year when 15 institutions increased and 20 moved down.

Year-on-year movement was slight, given the already high volume of international students, a high proportion of international academic staff on campus, and academic staff co-publishing with international peers.

Institutions which improved the most were Central Queensland (0.8 weighted points), Swinburne (0.7 weighted points), and Griffith (0.4 weighted points). Australia’s best performers on this pillar are Curtin, Murdoch, and ANU.

  • Industry: 25 institutions increased in score and 9 moved down, compared to last year when 27 institutions increased and 8 institutions moved down.

Once again, we see institutions improve their performance in both measures of this pillar – patents and industry income per academic staff.

Institutions which improved the most were Western Sydney (0.5 weighted points), Deakin (0.4 weighted points), and Sunshine Coast (0.5 weighted points). Australia’s best performers on this pillar are Adelaide, Monash, and Melbourne.

Through the lens of university networks

Let us focus on the aggregated performance of universities through the lens of their strategic alliances. For this analysis, previous years’ scores for Adelaide and UniSA were excluded from the aggregated scores for the Go8 and the Australian Technology Network of Universities (ATN); a brief sketch of the aggregation appears after the list below.

Over the past nine years:

  • The overall score for members of the Regional Universities Network (RUN) has improved the most; however, only six of its seven members have been ranked in six of the nine years. In 2018, RUN’s aggregated overall score per institution was 31.6 weighted points; it now stands at 45.6 weighted points.
  • The next significant improvement is from the Australian Technology Network of Universities (ATN 5, excluding UniSA). In 2018, their aggregated score per institution was 43.4 and it now stands at 57.1.
  • The overall aggregate score per institution for the Innovative Research Universities (IRU) has increased moderately from 42.7 in 2018 to 52.4 this year. IRU’s rate of improvement is moderately below the rate of improvement for the unaligned institutions, excluding private universities.
  • As for the group of research-intensive universities (Go8, excluding Adelaide), aggregated scores per institution also increased, albeit modestly, from 67.0 in 2018 to 71.5 this year.
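As a small illustration of the aggregation used above, the sketch below treats the aggregated score per institution as the simple mean of members' overall scores (my assumption about the calculation). The six member scores are invented, chosen only so that the mean reproduces the RUN figure of 45.6 weighted points quoted above.

```python
# Minimal sketch: the "aggregated score per institution" is taken here to be
# the simple mean of members' overall scores. The member scores below are
# invented purely for illustration.

def aggregated_score_per_institution(member_scores: list[float]) -> float:
    """Average overall score across a network's ranked members."""
    return sum(member_scores) / len(member_scores)

# Hypothetical overall scores for six ranked members of a network
run_scores_2026 = [49.1, 47.3, 46.0, 44.8, 43.5, 42.9]
print(f"Aggregated score per institution: "
      f"{aggregated_score_per_institution(run_scores_2026):.1f}")  # 45.6
```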

It is worth noting that it is easier to make gains as institutions progress from lower bands (e.g. 501-600 to 301-400 or 401-500). The higher the standing of an institution, the harder it is to make more gains year-on-year.

Movers and shakers

Beginning from the top, Melbourne remains firmly Australia’s best university and is 4.0 weighted points ahead of Sydney. Interestingly, Sydney has the advantage of higher revenue from international students (and higher total revenue) to mount a challenge to Melbourne’s top position.

Monash maintains strong performance at =58 globally and is up 22 places from 80 in 2018. Meanwhile, ANU and Western Australia have weakened over the past eight years.

Outside the Go8, Macquarie has moved from outside the top 250 in 2018 to =166 this year. UTS has also moved from outside the top 200 to =145 this year.

Among the ATN over the past eight years, Deakin has moved up from outside the top 300 to the 201-250 band; RMIT has moved from the 401-500 band to the 251-300 band, and Curtin has moved from the 351-400 band to the 251-300 band.

Among the regional universities, Southern Queensland has moved from outside the top 600 in 2018 to the 351-400 band, a position which it has sustained over the past three years.

Among IRU members, James Cook ranked in the 201-250 band between 2018 and 2021, dropped to the 401-500 band last year, and has moved now to the 351-400 band.

Ranking methodology

This is the third consecutive year of THE’s new methodology, introduced with the 2024 edition, which includes 18 metrics across five pillars. In brief (a short sketch of how the pillar weights combine follows the list):

  • The teaching pillar accounts for 29.5% of the overall score and consists of five indicators. One of these is the academic reputation survey, which carries a weight of 15% of the overall score, along with four per capita measures covering student enrolments and completions, academic staff, and institutional income, derived from information provided by institutions.
  • The research environment pillar accounts for 29% of the overall score and consists of three indicators. One of these is the reputation survey, which carries a weight of 18% of the overall score, along with two per capita measures (research income and research productivity) derived from information provided by institutions.
  • The research quality pillar accounts for 30% of the overall score and consists of four indicators. Citation impact carries a weight of 15% of the overall score and is based on Elsevier’s field-weighted citation impact (FWCI). Last year, THE added three measures based on the FWCI with a twist: research strength (75th percentile), research excellence (top 10% of all publications by FWCI), and research influence.
  • The international outlook pillar accounts for 7.5% of the overall score and is based on three equally weighted indicators. One of these is international co-authorship, drawn from Elsevier’s Scopus database. The other two refer to the proportions of international students and staff; institutions provide the information for these two indicators.
  • The industry pillar accounts for 4% of the overall score and is based on two equally weighted indicators. The first measure refers to the proportion of income for research and consultancy drawn from industry and is provided by institutions. The second measure refers to patents, measuring the number of patents citing a university's published research.
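To make the weighting concrete, here is a minimal sketch that combines five pillar scores into an overall score using the pillar weights listed above. The example pillar scores are invented, and the sketch ignores THE's normalisation of individual indicators within each pillar.

```python
# Minimal sketch of the overall score as a weighted sum of the five pillar
# scores, using the pillar weights described above. The pillar scores for the
# hypothetical university are invented for illustration.

PILLAR_WEIGHTS = {
    "teaching": 0.295,
    "research_environment": 0.29,
    "research_quality": 0.30,
    "international_outlook": 0.075,
    "industry": 0.04,
}

def overall_score(pillar_scores: dict[str, float]) -> float:
    """Weighted sum of pillar scores (each pillar scored 0-100)."""
    return sum(PILLAR_WEIGHTS[p] * s for p, s in pillar_scores.items())

example = {
    "teaching": 60.0,
    "research_environment": 55.0,
    "research_quality": 85.0,
    "international_outlook": 90.0,
    "industry": 70.0,
}
print(f"Overall score: {overall_score(example):.1f}")  # 68.7
```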

THE uses bibliometric data supplied by Elsevier from peer-reviewed journals indexed in the Scopus database and includes all indexed publications published between 2020 and 2024. Citations to these publications cover the six-year period from 2020 to 2025.

In addition, THE conducts its own academic reputation survey annually. For this year’s edition, THE combines the responses gathered between November 2024 and January 2025 with those gathered during the previous survey cycle, giving more than 108,000 responses.

Participating institutions across the world

Over the past 10 editions, the number of universities ranked globally has increased by 160% from 844 in 2016 to 2,191 in the 2026 edition.

For this year’s edition, 3,168 institutions submitted data across 115 countries compared to 1,128 institutions across 70 countries in 2016.

The total number of universities from India included in the rankings increased from 42 in 2018 to 128. Mainland China has increased its number of participating universities from 63 in 2018 to 97. The United States continues to have the highest number of ranked universities at 171.

Parting thoughts

Over the years, I have argued that stability in results is important to the credibility of any ranking schema. Let us hope THE minimises the extent to which there are ongoing adjustments.

Undoubtedly, questions will be asked about the validity and integrity of the research quality pillar, which seems to reward outputs that attract high volumes of citations. THE could reduce the weight of this pillar and increase the weight of metrics in the teaching pillar, such as the doctorate-to-bachelor ratio and the doctorate-to-student ratio. It could even introduce a metric that speaks more broadly to learners’ lifelong learning. By doing so, THE would increase recognition of teaching endeavours, which are undervalued in global rankings.

We also need to be mindful when drawing inferences from single-year performance, as we often see methodological adjustments and changes to weightings across indicators, which can lead to perverse year-on-year results.

It is always helpful to look at the trend data over five, eight, or more years and examine the data as it appears in the various bibliometric databases, national repositories, and open-source databases.

Despite the criticism of global rankings, there is still a strong appetite from institutions to participate in global, specialised, or national rankings and ratings. We continue to see new schemas emerging year on year.

This is also a reminder that the world of higher education is increasingly being shaped by businesses and commercial practices which benefit from quantifiable metrics and rankings.

Angel Calderon is Director, Strategic Insights at RMIT University.
