by ANGEL CALDERON 

The 2022 edition of the Good Universities Guide (GUG), released on 2 August, provides yet another lens by which to assess the performance of Australian universities over the past two years. The GUG does not produce an overall rating; instead, it produces 13 ratings at the institutional level.

The GUG ratings reflect the relative performance of Australian institutions in the domestic sphere, in contrast to the various world university rankings, which emphasise the global nature of education.

The GUG publishes only five-star ratings: if an institution has not received five stars in a category, only its score is published. It is then up to readers to draw their own conclusions from these results.

First, we will discuss some of this edition’s results compared to last year. Then, we will consider its history and evolution, along with some methodological aspects. In the final section we will consider some possibilities for development.

Stable results despite COVID-19

Despite the disruption caused by COVID-19, the GUG continues to show year-on-year stability. In this year’s edition, we see that

* 11 institutions saw no change in their number of five-star ratings, compared to seven last year

* 13 institutions gained at least one five-star rating compared to 14 last year

* 16 institutions lost at least one five-star rating compared to 19 last year.

Although there have been no changes to methodology or rating categories, 193 institutions received five-star ratings at the overall level, compared to 217 last year. What we observe in this year’s edition is that the institutions sitting in the top 20 per cent are more clearly differentiated than last year, and there are fewer institutions with tied scores.
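For illustration only, a five-star band of this kind can be thought of as a simple top-quintile cut: rank institutions on a metric and award five stars to the top 20 per cent. The sketch below uses hypothetical institution names and scores, not the GUG’s actual data or code.

```python
# Illustrative only: hypothetical scores, not GUG data.
scores = {
    "Uni A": 91.2, "Uni B": 88.7, "Uni C": 85.1, "Uni D": 84.9,
    "Uni E": 82.3, "Uni F": 80.0, "Uni G": 79.5, "Uni H": 77.8,
    "Uni I": 76.4, "Uni J": 75.0,
}

ranked = sorted(scores, key=scores.get, reverse=True)   # best score first
cut = max(1, round(len(ranked) * 0.20))                 # size of the top 20 per cent
five_star = ranked[:cut]                                # institutions awarded five stars

print("Five-star institutions:", five_star)
# Tied scores around the cut-off would need a tie-breaking rule,
# which is why fewer ties make the top band easier to delineate.
```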

Last year’s top performer was Federation University (with 14 five-star ratings), followed by Bond University (13). Federation University dropped nine five-star ratings, mostly in the metrics related to the educational experience.

This year’s top performer is Bond U, with no change to its five-star ratings. Three other institutions received 11 five-star ratings: Edith Cowan (four more than last year), the University of New England (one more than last year) and Notre Dame University (unchanged from last year).

Aside from Federation U, the other universities which dropped the most five-star ratings are: Queensland University of Technology (from nine to three), Griffith University (11 to seven), Deakin University (10 to six) and Wollongong University (eight to four).

Of the 13 ratings at the overall level, the metrics most affected were those derived from the Student Experience Survey (SES) – 21 fewer five-star ratings (113 compared to 134 last year). The ratings most affected were Learner Engagement, Overall Quality of the Educational Experience, and Skills Development.

The institutions which experienced a decline in these ratings were mostly in Victoria and Queensland and to a lesser extent in New South Wales. This is unsurprising once we consider that the SES was administered nationally in August 2020, when various types of restrictions prevailed across states and there was no firm prospect of a return to on-campus activities.

Next year’s results are likely to be further influenced by the current lockdowns and restrictions as the administration of the SES commences this August.

Field-level ratings

The GUG also contains ratings across 21 fields of study, which are grouped at undergraduate or postgraduate level. These fields are consistent with the study areas as reported in the Quality Indicators for Learning and Teaching (QILT).

The field of study comparison is designed to draw attention to institutional performance on eight key metrics on a field-by-field basis. Two of these are drawn from the Graduate Outcomes Survey (full-time employment and median salary), and the remaining six are drawn from the SES and relate to the educational experience.

At the undergraduate level, institutions received a total of 1,045 five-star ratings. The institutions which perform best across the assessed fields are:

* Edith Cowan is assessed in 15 fields and received five stars in every field, totalling 72

* Uni New England is assessed in 16 fields and received five stars in 14 fields, totalling 53

* Bond U is assessed in nine fields and received five stars in every field, totalling 51.

By comparison, the institutions which performed less well are:

* University of Western Australia is assessed in 13 fields and received five stars in one field,

* University of Melbourne is assessed in 12 fields and received five stars in two fields.

An impressive milestone

The GUG was established 30 years ago, well before the first international ranking (Asiaweek in 1999) or the first global ranking (the Academic Ranking of World Universities in 2003).

The GUG was founded by Dean Ashenden and Sandra Milligan, who remained editors until 2002. They began the GUG with the aim of providing prospective students with information to help them decide which institution is best for them. Over the years there have been other domestic ratings and rankings, but these have not had the continuity of the GUG.

The first edition had 15 ratings, spread across categories designed to assess institutional standing in terms of prestige (including research activity); entry flexibility; the educational experience (such as student-staff ratio); characteristics of the student population; and graduate outcomes. In each rating the “best” score was given as five and the “worst” as one.

The Good Education Group acquired the GUG from Hobsons in 2015 and undertook a review of the methodology. Since the 2016 edition, the GUG has published only five-star ratings (see last year’s commentary, CMM 2 September 2020).

The GUG has been edited by Ross White since 2009, following the retirement of Richard Evered, who edited the ratings between 2003 and 2008. In the 2014 edition, the GUG introduced a new measure of student retention, examining the ATAR scores of domestic school leavers who commenced a bachelor’s degree the following year.

The rating showed which universities retained students at a higher rate than the national average for students enrolled in the same field of study, with comparable ATAR scores (grouped in bands). Where an institution had a retention rate considerably below the national average among students who achieved a high ATAR, it was more likely than not that those students transferred to another institution to study their course of choice. It also suggested that those high-ATAR students’ expectations were not met during their first year at university.
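As a rough illustration of how such a comparison could be computed from unit-record data, the sketch below groups retention by institution, field of study and ATAR band and compares each cell with the national average. The data, column names and banding are hypothetical, not the GUG’s actual method.

```python
import pandas as pd

# Hypothetical unit-record data: one row per commencing bachelor student.
students = pd.DataFrame({
    "institution": ["Uni A", "Uni A", "Uni B", "Uni B", "Uni C", "Uni C"],
    "field":       ["Science"] * 6,
    "atar_band":   ["90+", "90+", "90+", "80-89", "90+", "80-89"],
    "retained":    [1, 0, 1, 1, 1, 0],   # 1 = returned for a second year of study
})

# Retention rate by institution, field and ATAR band
inst = (students.groupby(["institution", "field", "atar_band"])["retained"]
                .mean().rename("inst_rate").reset_index())

# National average retention for the same field and ATAR band
national = (students.groupby(["field", "atar_band"])["retained"]
                    .mean().rename("national_rate").reset_index())

comparison = inst.merge(national, on=["field", "atar_band"])
comparison["gap"] = comparison["inst_rate"] - comparison["national_rate"]
print(comparison)   # a large negative gap in the 90+ band would flag likely transfers
```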

The results of this analysis confirmed the belief that the higher the ATAR achieved by students, the higher the probability that they are retained by the institution through to a second year of undergraduate study. However, this measure was not easily understood and was probably disliked by academics and university administrators. The retention measure has not been published in the past two editions.

It may be that the Good Education Group needs to organise regular roundtable discussions or university forums to discuss rating results, methodology and data issues. It may also be worth developing more editorial content on the data that sit behind the ratings.

The ratings produced by the GUG can be reproduced, as they are derived from publicly available data. This is something for which the Good Education Group is to be commended.

Methodological improvements

One positive aspect of the GUG is that its methodology has remained stable for some time, rather than being adjusted year after year.

There is room for the GUG to be further strengthened so that it continues to be an invaluable resource for prospective students and institutions. As a starting point, it may be worthwhile to reintroduce the retention (or a similar) measure, as it is a useful way to gauge student transition and progression.

It may also be useful to consider a measure of the scholarship opportunities provided to students, particularly those from disadvantaged backgrounds: for example, dollars spent on financial support for disadvantaged students as a proportion of total revenue.

The GUG does not have measures to gauge the opportunities provided to students who are not school leavers. Admission requirements for non-standard entry vary across universities. There is much improvement to be made in this domain, and it is a space in which the GUG can play a pivotal role in informing prospective students while also ensuring greater transparency and fairness across the system.

As the GUG also has ratings at postgraduate level, there is scope for measures designed to speak to this audience. For example, it could introduce a measure of the percentage of the student cohort with relevant employment and professional experience. Another possibility is a measure of the postgraduate student cohort as a proportion of the undergraduate cohort.

Finally, as there is growing focus on the sustainable development goals (SDGs), the GUG could introduce a measure which gauges how universities are working to address them.

Angel Calderon is principal advisor, planning and research at RMIT

