by JAMES GUTHRIE and BRENDAN O’CONNELL
We have come to that time again in the Australian public university life cycle when staff are busily preparing their submissions for the periodic national research evaluation, the Excellence in Research for Australia (ERA). While the results of this exercise do not directly affect university finances (unlike in the UK), vice-chancellors take them seriously as indicators of prestige. Consequently, university executives continually set ambitious targets to improve their scores in each discipline.
Many academics seem to regard ERA evaluations with trepidation: the benefits of a good score for their discipline are unclear and possibly non-existent, while the costs of missing the targets are potentially severe and, arguably, unfair. These can include the removal of internal research funding, reduced allocations for research in workloads, and performance management of academics deemed to be underperforming in research [1].
Despite the consequences flowing from ERA results, there is evident concern about the accuracy and appropriateness of the current assessment process, especially in disciplines where peer review, rather than metrics, is used to make these assessments. These concerns are manifest in several recent CMM articles; see, for instance, the commentaries by Sean Brawley, by James Guthrie and John Dumay, and by Garry Carnegie.
These authors’ scepticism of the ERA peer-review process comes from their observation of an apparent disconnect between increases in the quantity and quality of research outputs, including publications and income, and the score awarded within their disciplines.
In this contribution, we provide two case examples drawn from the accounting discipline that raise further concerns about the disconnect between the quantity and quality of research outputs and the scores awarded through the ERA's peer-review process. They are drawn from a paper that one of the present authors published in the Accounting, Auditing & Accountability Journal [2].
That study examined how the outputs and foci of research in highly rated accounting disciplines changed over 16 years (2004 to 2019). We analysed all papers published across this period in 20 highly ranked accounting journals (those ranked A* or A under the Australian Business Deans Council journal list) by academics at Australian universities whose accounting discipline was rated highly (three or better) in the research assessment exercise. We then compared the results for this group against the publications, in the same set of journals, of two case study universities whose accounting disciplines were not rated as "world class" (meaning they received an ERA score below three).
Our primary research findings can be summarised as follows:
* The results exhibit little change across the four ERA assessments of the accounting discipline, with between 11 and 13 universities rated three or above in each round; the vast majority were the same universities each time.
* Our two case study universities (the Macquarie and RMIT accounting schools), which received scores of two across all four rounds, compared favourably with the higher-rated universities. On total outputs in the same sample of journals over the period, they would have ranked in the middle of that group (seventh and sixth places, respectively).
* Research outputs increased significantly across the period for the two case study universities, but their ratings did not change.
* Further analysis of per capita research productivity placed Macquarie and RMIT at the higher end of the sample (fifth and fourth, respectively) for the same set of journals, compared with the higher-rated universities.
Seeking possible explanations for this apparent anomaly, we conducted further analysis and discovered that the two case study disciplines differed markedly from the highly rated disciplines in the type of research that dominates their outputs. While publishing in the same journals, the highly rated universities were far more likely to employ quantitative and positivist research approaches than the two lower-rated case studies.
Specifically, the percentage of quantitative and positivist papers ranged between 57 per cent and 88 per cent for the highly rated group, whereas the equivalent figures for the two case studies were 41 per cent for Macquarie and just 5 per cent for RMIT. The two lower-rated universities instead favoured qualitative approaches drawn from the sociological, interpretive and critical traditions.
Our findings also showed that both case study universities pursued a more eclectic set of research topics than the highly rated disciplines. The highly rated universities tended to focus on traditional areas of accounting research, such as financial accounting, managerial accounting and auditing. In contrast, the two case study universities published in social and environmental accounting, higher education accounting, performance management and control, business ethics, accounting history, accounting education and professionalisation.
So, what can we make of these findings?
In summary, the ERA peer-review process in accounting shows a strong bias towards awarding high ratings to sandstone universities that conduct quantitative and positivist research on mainstream topics, in many cases using North American data. ERA scores are also "sticky": significant increases in the quality and quantity of research outputs will not necessarily improve a university's score.
These results are particularly striking given that The Australian newspaper rated RMIT University as "the top publishing Australian school in the Accounting and Taxation Field" in 2019 and named one of its professors the top accounting professor [3]. These awards by The Australian were based on research metrics, not peer review. Two professors at Macquarie University are also rated among the most highly cited in the country in the accounting field [4].
This contribution is not a complaint about how the ERA appears to have got our ratings wrong. Rather, it shows how the peer-review practices used to judge the performance of particular academics and their publications are, at the very least, potentially biased and fragile. The situation is not helped by a marked lack of transparency in the ERA process: there is no requirement to justify scores externally. The ARC releases a report at the end of each round, but this provides only aggregated data for each discipline and the score awarded to each university discipline; no rationale is given for how a particular score was calculated. In our view, the process is akin to a student submitting an assignment and receiving only a mark, with no written comments, together with a report on how the class performed. What university would condone or support such behaviour?
We would not tolerate such a process in assessing our students, yet it is evidently accepted in a formal exercise with considerable implications for each discipline within our public universities. Where is the transparency, and where is the accountable and apparently desirable move to metrics that might escape potential peer bias and restore faith in the research assessment system? Research evaluation in Australia can surely do better than this.
Emeritus Professor James Guthrie AM, Professor of Accounting, Macquarie Business School
Professor Brendan O’Connell, Honorary Professor of Accounting, School of Accounting, Information Systems and Supply Chain, RMIT University
1. Martin-Sardesai, A., Guthrie, J. and Parker, L. (2020), “The neoliberal reality of higher education in Australia: how accountingisation is corporatising knowledge”, Meditari Accountancy Research, https://doi.org/10.1108/MEDAR-10-2019-0598
2. O’Connell, B., De Lange, P., Sangster, A. and Stoner, G. (2020), “Impact of Research Assessment Exercises on Research Approaches and Foci of Accounting Disciplines in Australia”, Accounting, Auditing & Accountability Journal, Vol. 33, No. 6, pp. 1277-1302, https://doi.org/10.1108/AAAJ-12-2019-4293
3. The Australian (2019), “Business, Economics & Management: Australia’s Research Field Leaders”, 25 September.
4. https://scholar.google.com.au/citations?view_op=search_authors&hl=en&mauthors=accounting+.au&btnG
5. Guthrie, J., Parker, L., Dumay, J. and Milne, M. (2019), “What Counts for Quality in Interdisciplinary Accounting Research: A Critical Review and Reflections”, Accounting, Auditing & Accountability Journal, Vol. 31, No. 1, pp. 2-25.