by ANGEL CALDERON

Times Higher Education (THE) released the 2023 edition of its World University Rankings on 12 October. While it is a mammoth edition, the worry is that improvement for many universities rests on citation impact, a weak measure of research strength. Over the past eight editions, the number of universities ranked has increased by 125 per cent, from 800 in 2016 to 1799 in the 2023 edition.

The number of ranked institutions is up 8 per cent on last year's 1662, and the net addition of 137 institutions is the second-highest increase since 2016. The largest was in the 2019 edition, when THE added 221 institutions.

This ongoing expansion not only illustrates the appetite among institutions to participate in global rankings but is also a reminder that the world of higher education is increasingly being shaped by businesses and commercial practices.

Before we concentrate on the performance of Australian universities, let us focus on the changing global dynamics and the methodological construct of this ranking.

Changing geopolitical landscape

The 2023 edition includes universities from 104 countries, up by 34 from 70 countries in 2016.

Compared to last year, the number of Australian universities remains unchanged at 37. Last year, 29 were ranked in the world’s top 500. This year there are 31, with the inclusion of Bond University, Charles Darwin University and Murdoch University, whilst Victoria University dropped out of the top 500.

Among the top 500, there are 105 universities from the United States, down from 109 last year and 122 in the 2016 edition. There are 59 universities from the United Kingdom, unchanged from last year and one fewer compared to the 2016 edition. China increased to 27 from 24 last year, and is up 16 when compared to its 11 in the 2016 edition.

Australia is equal fifth with the Netherlands in the number of universities in the top 200 (ten), two fewer than last year but two more than the eight it had in 2018.

China is fourth with 11 universities and Germany is third with 22. The USA and the UK remain first and second. There is a noticeable long-term decline in the standing of universities from the USA and the UK, while there is an improvement in standing for universities from Asia and the Pacific, including Australia and New Zealand.

As a system, Australian universities have continued to progress in global rankings, even though their resilience and financial sustainability were tested by the pandemic years. As I have previously observed, the overall standing of Australian universities is not guaranteed (CMM, 2 September 2021), particularly for those institutions relying on citations or highly cited researchers. In any case, the imperative for change and long-term planning is paramount for university missions.

Ranking methodology

This is the last year in which THE uses the broad methodology it has had in place since 2011. At a later date, I will discuss the methodology to be used from next year. Currently, the ranking consists of 13 performance indicators grouped into five pillars. Briefly (a rough sketch of how these weights combine follows the list):

* the teaching pillar accounts for 30 per cent of the overall score and consists of five indicators. One of these is the academic reputation survey, weighted at 15 per cent of the overall score, along with four per capita measures derived from information provided by institutions.

* the research pillar accounts for 30 per cent of the overall score and consists of three indicators. One of these is the reputational survey, weighted at 18 per cent of the overall score, along with two per capita measures derived from information provided by institutions.

* citation impact accounts for 30 per cent of the overall score and is based on the Field Weighted Citation Impact (FWCI) as derived by Elsevier.

* international outlook accounts for 7.5 per cent of the overall score and is based on three equally weighted indicators. One of these is international co-authorship, drawn from Elsevier’s Scopus database. The other two refer to the proportion of international students and staff; institutions provide the information for these two indicators.

* industry income accounts for 2.5 per cent of the overall score. It refers to the proportion of income for research and consultancy drawn from industry and is provided by institutions.
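
The precise normalisation THE applies within each pillar is not spelled out here, but as a rough, hypothetical sketch of how the stated weights combine into an overall score (the institution and its pillar scores below are invented for illustration only):

```python
# Rough sketch of how THE's published pillar weights combine into an overall
# score, assuming each pillar score is already normalised to a 0-100 scale.
# The example pillar scores are hypothetical, for illustration only.

PILLAR_WEIGHTS = {
    "teaching": 0.30,
    "research": 0.30,
    "citations": 0.30,
    "international_outlook": 0.075,
    "industry_income": 0.025,
}

def overall_score(pillar_scores: dict) -> float:
    """Weighted sum of pillar scores (each on a 0-100 scale)."""
    return sum(weight * pillar_scores[pillar]
               for pillar, weight in PILLAR_WEIGHTS.items())

# A hypothetical institution with a strong citations score and middling
# scores elsewhere still lands a solid overall score.
example = {
    "teaching": 45.0,
    "research": 40.0,
    "citations": 90.0,
    "international_outlook": 70.0,
    "industry_income": 50.0,
}
print(round(overall_score(example), 1))  # 59.0
```

The point to note is that the citations pillar alone carries as much weight (30 per cent) as the entire teaching or research pillar, which is why a large jump in an institution’s citation score moves its overall score so sharply.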

performance of Australian universities

Compared to last year, 23 universities moved up in overall score, 13 moved down and one stayed the same. This year’s improvement was driven by the research pillar, followed by the citations pillar.

Let’s briefly describe how Australian universities performed on a pillar-by-pillar basis:

* teaching: 16 moved down, 20 went up and one remained unchanged. The University of Adelaide improved the most (by 2.1 weighted points), followed by the University of Newcastle and the University of Southern Queensland (both by 0.9 weighted points).

* research: ten moved down and 27 moved up. Monash University improved the most among Australian universities (by 2.1 weighted points), followed by Uni Adelaide (by 2.0 weighted points).

* citation impact: 20 universities went up and 17 went down. Bond University improved the most (by 11.3 weighted points), followed by Federation University and Charles Darwin University (up by 5.8 and 5.0 weighted points, respectively).

* industry income: 11 universities improved in score, 20 remained unchanged and six went down. Uni Sydney improved the most (by 0.2 weighted points). There is not much change on this pillar year on year.

* international outlook: four universities improved in score, 20 remained unchanged and 13 went down. James Cook University and Southern Cross University improved the most (by 1.1 and 0.7 weighted points, respectively).

Australia’s top one-year improvers

ranked in the world’s top 100

* Monash University moves up 13 places from 57th last year to 44th and becomes Australia’s second-highest ranked institution, behind Uni Melbourne (down one place to 34th). Monash U’s improvement is driven primarily by a higher score in the research pillar, followed by the teaching and citations pillars.

* Uni Adelaide returns to the top 100 at 88th after it dropped to the 201-225 band in 2012. Since then, it has made sustained progress. This year’s breakthrough was driven by improved scores in the teaching and research pillars.

ranked in the world’s top 200

* Macquarie U moves up 17 places from 192nd last year to 175th. This is the third consecutive year in which Macquarie ranks in the top 200, with improvement driven by citations.

* UTS moves up ten places from =143rd last year to 133rd, with improvement driven by citations.

ranked in the world’s top 300

* Bond U (251-300 band) moves up from outside the top 500 last year, with improvement driven solely by citations.

ranked in the world’s top 400

* Charles Darwin University (351-400) also moves up from outside the top 500 last year. Improvement is largely driven by citations and, to a lesser extent, by research.

* University of Southern Queensland (301-350) moves up from the 401-500 band last year, with improvement driven by citations.

going behind the citations pillar

The citation pillar is based on the Field Weighted Citation Impact (FWCI) as derived by Elsevier. For those unfamiliar with this measure, the FWCI score indicates how the number of citations received by an institution’s publications compares with the average number of citations received by all similar publications. A FWCI of 1.00 indicates the global average.
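
For illustration only, here is a minimal sketch of the idea, assuming the per-paper ratios are simply averaged; the figures are hypothetical and Elsevier’s exact aggregation and field baselines differ in practice:

```python
# Minimal sketch of the FWCI idea with hypothetical figures: each paper's
# citation count is divided by the citations expected for similar
# publications (same field, document type and publication year), and the
# per-paper ratios are then averaged across the institution's output.

papers = [
    # (citations received, expected citations for similar papers)
    (12, 6.0),   # cited at twice the expected rate
    (3, 4.0),    # cited below the expected rate
    (40, 8.0),   # cited at five times the expected rate
]

per_paper_fwci = [cited / expected for cited, expected in papers]
institution_fwci = sum(per_paper_fwci) / len(per_paper_fwci)
print(round(institution_fwci, 2))  # 2.58, where 1.00 is the global average
```

Because the measure is an average, a handful of extremely highly cited papers can pull an institution’s FWCI well above 1.00 even when the rest of its output is cited at or below the world average, which is the dynamic described below.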

Last year I observed that some universities experienced a decline in THE’s citation impact score because the first Global Burden of Disease studies (published in 2015), which carry very long author lists, were falling out of the five-year citation window.

how a high FWCI means a higher THE ranking

In 2021, Bond University had the highest FWCI (excluding self-citations) of all Australian universities at 5.74 compared to 1.30 in 2020. Swinburne University had the second highest FWCI in 2021 at 1.81 compared to 1.52 in 2020. Among the Go8, Monash University had the highest FWCI at 1.78 compared to 1.68 in 2020.

Typically, there is not much which separates an institution’s FWCI from one year to the next (scores may be separated by as little as 0.01 points). So, if an institution moves up the way Bond did in the FWCI, it is all but guaranteed to skyrocket in global rankings.

Bond University has had one Highly Cited Researcher for the past five years, and this researcher’s output helps to explain the exponential increase in its FWCI, which in turn explains why Bond has improved considerably in THE WUR.

In previous years, the University of Canberra had the highest FWCI of all Australian universities, but in 2021 it had one of the lowest. Driven by citations, Uni Canberra had outperformed other Australian universities in THE WUR: it first entered the world’s top 200 in 2020 when it ranked 193rd, and ranked =170th in 2022, but it is now in the 251-300 band.

This demonstrates the impact on a university’s ranking when top researchers move on.

Beyond Australia, there are several universities which have moved up in THE WUR on the basis of a select group of researchers (not necessarily Highly Cited Researchers). For example, Alfaisal University ranked outside the top 500 in 2018, moved up to the 301-350 band the next year, and ranked in the 201-250 band last year. Again, improvement was driven by citations: from 11.8 weighted points in 2018 to 29.7 in 2022. Over the same period, Alfaisal U’s score in the teaching pillar improved from 4.8 weighted points to 6.1, while its research pillar score declined from 8.2 to 6.4. Now, Alfaisal ranks in the 301-350 band because of weaker scores in citations.

This year, we see King Abdulaziz University (KAU) moving from =190th last year to =101st. In the four editions up to 2021, KAU ranked in the 201-250 band. KAU’s improvement is largely attributed to higher citation scores and, to a lesser extent, to the teaching and research pillars. KAU had 37 highly cited researchers in 2021, and they have largely contributed to the continued improvement in its FWCI.

Institutions can derive a large amount of success from a select group of researchers, as long as those researchers remain at the institution and continue to produce (or co-author) papers which attract citations at rates far above everyone else in a given discipline.

parting thoughts

It is preferable if the improved performance of an institution rests on improvements spread across the various pillars (and metrics within each), rather than solely on a single measure like citations.

What we are seeing with the THE WUR is that a small number of outliers are rising to the top, but they are at the mercy of time as their FWCI erodes or their highly cited researchers move on.

Over the long term, institutions which see improvement in per capita measures (e.g. doctorate-to-bachelor completions, research income per staff member), volume of scholarly outputs, proportion of papers in top-quartile journals, and collaboration with industry are more likely to secure higher rankings.

Bear in mind that none of these global rankings measures student experience, student satisfaction, or student access and success. If measures like these were considered, it is highly likely that the order of top-ranking institutions would differ from what we see currently in any of the major schemas (i.e. QS, ShanghaiRanking and THE).

Angel Calderon is Principal Adviser, Policy and Research at RMIT

