by ANGEL CALDERON

The eighth edition of the Best Global Universities by U.S. News & World Report, released on 26 October, marks the end of this year’s ranking season, in which Australian universities continued to perform well across all schemas, bearing in mind that it is still too early to see the effect of the pandemic on global rankings.

This year’s edition includes 1,750 universities across 91 countries, compared with 1,499 universities last year. This represents about 9 per cent of the world’s universities in 2021, up from 7.7 per cent last year.

U.S. News & World Report has been publishing rankings for more than 30 years in the United States but it was a late starter in world university rankings. The inaugural edition of the Best Global Universities in 2014 included 500 universities from 49 countries.

Let us first explore the methodological construct of this ranking before assessing why our universities performed as they did. I also discuss the overall usability of this ranking and offer some suggestions to improve its accuracy.

Source of data

The information used to construct this ranking is sourced entirely from Clarivate. Bibliometric information is taken from the Web of Science, whilst the two reputation indicators are drawn from the Academic Reputation Survey, which is conducted annually among a select group of researchers whose work is published in outputs indexed by Clarivate.

Clarivate’s data is also used to generate various indicators in the Academic Ranking of World Universities (ARWU), the CWTS Leiden Ranking, the Moscow-based RUR rankings and some others. QS and Times Higher Education (THE) use Elsevier to draw the bibliometric data required to construct their rankings. THE also uses Elsevier’s databases to draw the lists of researchers invited to respond to its academic reputation survey.

Information about the number of students and academic staff, which is used as context for the ranking, is also taken from Clarivate’s Global Institutional Profiles Project. Institutions do not have any direct interaction with U.S. News & World Report.

Bibliometric heavy

Eleven of the 13 indicators used to construct this ranking are bibliometric based, accounting for 75 per cent of the overall score. Of these, three relate to publications: the total number of outputs, which includes articles, reviews and notes, carries a 10 per cent weight, while books and conference proceedings each account for 2.5 per cent of the overall score.

Measures of citation impact account for 50 per cent of the overall score. Six indicators fall in this category: total normalised citation impact (10 per cent), the number and percentage of publications that are among the top 10 per cent most cited (22.5 per cent combined), total citations (7.5 per cent), and two indicators focused on the number and percentage of publications among the top 1 per cent most highly cited globally (the remaining 10 per cent).
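
For readers unfamiliar with the term, normalised citation impact is conventionally calculated by benchmarking each paper’s citations against the world average for papers of the same field, publication year and document type. The sketch below follows that general bibliometric convention, not a formula published by U.S. News:

$$
\mathrm{NCI} = \frac{1}{N}\sum_{i=1}^{N}\frac{c_i}{e_i}
$$

where $c_i$ is the number of citations received by paper $i$ and $e_i$ is the expected (world-average) citation count for papers of the same field, year and document type.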

There are two indicators focused on measuring international collaboration, accounting for 10 per cent of the overall score.

Research reputation

The last two indicators are about the perceived research reputation of institutions as rated by academics who responded to Clarivate’s academic reputation survey. Each of these carries 12.5 per cent of the overall score. One indicator reflects the research standing of an institution on the global stage, and the other its standing within its world region.
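
Taken together, the weights described above sum to 100 per cent. The sketch below is a minimal illustration of how a composite score could be assembled from those weights; the assumption that each indicator is standardised before weighting is my reading of the published methodology rather than a detail given in this article, and the example figures are entirely fictitious.

```python
# Illustrative only: indicator weights as described in the article.
# The top-1% weight is inferred as the remainder of the 50% citation block.
INDICATOR_WEIGHTS = {
    "publications": 10.0,
    "books": 2.5,
    "conference_proceedings": 2.5,
    "normalised_citation_impact": 10.0,
    "top_10pct_cited_number_and_share": 22.5,
    "total_citations": 7.5,
    "top_1pct_cited_number_and_share": 10.0,
    "international_collaboration": 10.0,
    "global_research_reputation": 12.5,
    "regional_research_reputation": 12.5,
}

# The thirteen indicators (some grouped above) sum to 100 per cent.
assert abs(sum(INDICATOR_WEIGHTS.values()) - 100.0) < 1e-9


def composite_score(standardised_scores: dict) -> float:
    """Weighted sum of standardised indicator scores (hypothetical sketch)."""
    return sum(
        weight / 100.0 * standardised_scores.get(indicator, 0.0)
        for indicator, weight in INDICATOR_WEIGHTS.items()
    )


if __name__ == "__main__":
    # Fictitious standardised scores for an illustrative institution.
    example = {name: 1.0 for name in INDICATOR_WEIGHTS}
    print(round(composite_score(example), 2))  # -> 1.0
```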

The total number of responses to the reputation survey was 26,521, which combines responses collected between 2017 and 2021. This represents about 0.4 per cent of the number of researchers globally. This year’s edition has 12 per cent fewer responses than the 30,201 considered for last year’s edition of the Best Global Universities.

Readers need to be aware that the number of responses to Clarivate’s survey varies from year to year. For example, Clarivate recorded 3,702 responses to its 2021 reputation survey, compared with 7,712 responses last year and 6,307 in 2019. In 2021, 32 per cent of respondents said they were most familiar with North America in terms of higher education and academic research, 24 per cent said they were most familiar with Europe, and 26 per cent were most familiar with Asia and the Middle East. Only 6 per cent of respondents said they were most familiar with Oceania. This breakdown helps demonstrate how geography shapes the world of global rankings. It also tells us that the perceived reputation of our universities rests on the responses of a select group of academics.

Lagging performance data

One of the many criticisms of rankings is that they reflect past performance. This ranking relies on such lagging performance data that it can be unclear which of the actions taken by universities had an impact on the reported outcomes. The bibliometric data used to inform this year’s Best Global Universities 2022 edition covers a five-year period from 2015 to 2019. Thus, no output included in the ranking reflects research published during the pandemic.

Two indicators (number of highly cited papers and percentage of total publications among the top 1 per cent most highly cited) are based on the most recent 10 years of publications. No details are provided on which years are covered by this period.

Data from the reputation survey, as noted above, includes responses collected between April and June of this year, combined with responses gathered from 2017 to 2020.

Standing of Australian universities

There are 39 Australian universities included in the ranking, of which 25 are included in the world’s top 500.

The University of Melbourne remains Australia’s highest ranked institution at 25th, followed by the University of Sydney (28th, down one place from last year) and the University of Queensland (36th, unchanged from last year).

Of the Australian universities included in the top 500, the fastest one-year improvers were Australian Catholic University, Swinburne University, La Trobe University, RMIT University, University of Technology Sydney and Western Sydney University. These six institutions have also been the fastest five-year improvers. This serves to illustrate that Australian universities are actively engaged in global rankings and have invested resources in increasing their research output and impact on the world stage.

The improvement of Australian universities in this ranking is driven largely by higher publication rates, which has included researchers targeting top-quality journals in key subject areas. It has also been driven by increased rates of citation of Australian outputs, although Australian universities lag behind other countries in specialised and emerging scientific fields.

Revenue generated from the recruitment of international students has greatly contributed to Australia’s research endeavours. Given that Australia’s international borders have been closed over the past twenty months, the performance of Australian universities is likely to be adversely affected in global rankings.

We eagerly await the next edition of the CWTS Leiden Ranking (next May) and the QS World University Rankings (next June), as we are likely to see some institutions move down and some up. Our future performance in rankings depends on the resourcing and staffing choices made over the past two years as a result of restructuring.

As more Asian countries continue to invest in research and development, Australian universities are likely to see increased competition to attract, retain and nurture academic talent. Increased investment in research and research training is therefore needed to remain competitive. Over the next few years, strategic decisions will need to be made about which disciplines and cross-cutting technologies to invest in, and how best to do so.

Usability of this ranking

As we have said previously, rankings appeal to different audiences. The Best Global Universities ranking is more like a consumer’s guide. It is targeted at prospective students who are focused on a specific location. One must also pay to retrieve information beyond what is displayed on the public website.

It requires time and effort to compile a list of the universities in the world’s top 100, top 200 and so on, or to explore how institutions perform on a specific indicator. There is also no way to see how institutions have performed over the years: previous years’ results disappear when the latest rankings are released. All other ranking schemas let you browse previous years’ rankings.

Academics, institutional researchers and strategic planners may find it more enticing to access the data using Clarivate’s InCites, a tool that enables users to analyse institutional data across a wide range of research metrics. InCites also includes data collected by Clarivate as part of the Global Institutional Profiles Project. InCites, too, requires a paid institutional subscription. In any case, those interested in diving deeper into the data that inform this ranking without access to Clarivate’s InCites are recommended to download the CWTS Leiden Ranking 2021 data file, which contains all kinds of bibliometric data, including historical performance.

I would say the usability of the Best Global Universities ranking is limited. It is a ranking which tries to be comprehensive and heterogeneous and yet leaves many dimensions out. It also reinforces the view that many universities and national systems outside the North America and Western Europe region and rising Asia are on the periphery of global rankings.

University rankings have been developed using their own individual methodologies, in some instances evaluating the same kinds of indicators from different perspectives. It seems the time is coming to translate the meaning of different rankings and to explore ways to develop an overall ranking evaluation frame, similar to a meta-analysis. It may also mean discarding unhelpful or proxy indicators of quality. These actions would help to develop a more in-depth and comprehensive understanding of universities’ present performance, and to develop the tools to evaluate the changes that individual universities are implementing right now that may affect future outcomes.

Angel Calderon is principal advisor, planning and research at RMIT

