The much-anticipated Engagement and Impact (E&I) results were made public on Friday 29th March, offering engagement and impact supporters, naysayers and analysts new material to work with.

There are a few surprises in the new ‘league tables’. First, on sheer volume: which universities achieved the most ‘high’ rankings in total across the three measures of engagement, approach to impact and impact? (Here we see a combination of the familiar research juggernaut universities and some of the industry-savvy tech universities.)

Table 1: Universities with the most ‘high’ rankings in total across engagement, approach to impact and impact (click to see enlarged version)

The University of Sydney must surely be disappointed with this result: a wide-ranging research portfolio submitted, but proportionately little success in achieving high E&I rankings.

On that point, taking into account the actual number of research fields submitted, which institutions achieved the highest percentage of ‘high’ rankings? This, to my mind, is the true measure of success, capturing those institutions that uniformly succeed in achieving great end-user engagement and impact across all of their research activities. Here, the University of South Australia and the University of Technology Sydney must be thrilled, with Monash, the University of Queensland and the University of New South Wales also demonstrating both breadth and quality in engagement and impact.

Table 2: Institutions with the highest percentage of ‘high’ rankings relative to the number of fields submitted (click to see enlarged version)

A full table showing ranking results for all universities in the E&I exercise is provided at the end of this article.

The next key question must surely be: was there a relationship between an institution’s ERA result and its E&I result? Comparing, for each institution, the proportion of assessed fields rated ‘above’ or ‘well above’ world standard (at the 2-digit level) with the proportion of ‘high’ rankings that institution achieved across engagement and impact shows a significant positive relationship (r = 0.69, N = 39, p < 0.0001). Click on Figure 1 below to enlarge.

Figure 1: Proportion of fields rated above world standard (ERA) versus proportion of ‘high’ E&I rankings, by institution

This perhaps isn’t a surprise, but it does support the notion that universities producing world-class research are attractive to end-users for partnership and uptake of research outputs; and, arguably, that research undertaken with a deep understanding of the world around it, and of how the research affects that world, may be more likely to be of a high standard.
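For readers who want to reproduce this kind of comparison from the published tables, a minimal sketch in Python is below. It assumes the two proportions have already been extracted into one value per institution; the numbers shown are illustrative placeholders only, not the actual ERA or E&I figures, and this is simply a standard Pearson correlation rather than any official ARC methodology.

from scipy.stats import pearsonr

# One value per institution (N = 39 in the analysis above).
# Placeholder numbers only; the real inputs come from the ERA and E&I result tables.
era_high_share = [0.85, 0.60, 0.40, 0.72]  # share of fields rated 'above'/'well above' world standard
ei_high_share = [0.78, 0.55, 0.30, 0.70]   # share of 'high' engagement and impact rankings

r, p = pearsonr(era_high_share, ei_high_share)  # correlation coefficient and two-sided p-value
print(f"r = {r:.2f}, p = {p:.4f}")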

Some further interesting analysis by the Innovative Research Universities, shared with the Campus Morning Mail team and HECG, looked at the breakdown of which industries and socioeconomic objectives had the most ‘high impact’ assessments linked to them. Health proved to be the area with the most high-impact projects, with culture, environment, law and education forming the next largest cluster. Click on Figure 2 below to enlarge.

Figure 2: ‘High impact’ assessments by industry / socioeconomic objective

Another interesting question was the part played by funding initiatives that focus on the research/industry nexus in achieving impact success. The Impact Studies Library published by the ARC (https://dataportal.arc.gov.au/EI/Web/Impact/ImpactStudies) makes it possible to explore how many highly-ranked case studies refer to Cooperative Research Centres (19 individual case studies identified via the search term ‘CRC’ in abbreviated or full form) or to ARC Linkage (10 case studies).

It is important to note that the Library contains only case studies that received a high rating and had no sensitivities flagged. (The ARC has not published highly-rated case studies if any sensitivities were flagged, which is disappointing: the submission process allowed the specific sensitive material to be identified, and the publication process could surely have withheld the confidential material while still providing the case study summary.) So we can be reasonably sure that initiatives like the CRC, which are the powerhouses behind much of Australia’s commercialised and industry-relevant research success, have played an even stronger part than the 19 CRC-related case studies in the Library suggest.

It is difficult to get more granular about what the results are telling us. There is some evidence that small-scale, modestly funded research programs that generated proportionately very high end-value (commercial or otherwise) could be judged highly successful, rather than only the juggernaut programs (lots of dollars in, and correspondingly lots of value created).

And again, whilst we have not had comprehensive line of sight to the case studies submitted, HECG’s experience and involvement with our clients suggest that, when looking at the ‘Approach to impact’ results, those case studies that contained evidence against all or most of the suggested approach criteria listed by the ARC were more likely to be highly graded.

So, was the E&I exercise worthwhile? Will it help our national learning about what does and doesn’t work in achieving high-quality impact in our research whilst engaging effectively with industry end-users?

First, it is my belief that the universities that took this exercise seriously and invested significant time and effort in compiling a comprehensive evidence base and compelling impact narratives have benefited by achieving good results. There is no doubt that capturing and reporting impact and engagement well is a costly exercise. For example, one study investigating the UK’s experience of the Research Excellence Framework suggests an estimated cost to UK universities of £7,500 per impact case study produced and £4,500 per impact template describing the unit’s strategy for facilitating impact (the two together operating as a rough equivalent of our E&I submissions), and an estimated 8 to 30 days to produce a case study. One Australian university estimated that its portfolio of around 20 impact case studies took nearly 6,000 hours of effort to create, the combined input of dozens of university personnel, industry partners and other contributors.
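As a rough back-of-envelope check (my arithmetic, not a figure from the cited study or the university concerned), that Australian estimate works out at around 300 hours per case study, comfortably above the upper end of the UK time estimates:

total_hours = 6000        # reported effort for one university's portfolio
case_studies = 20         # approximate number of case studies in that portfolio
hours_per_day = 7.5       # assumed standard working day

hours_per_study = total_hours / case_studies      # 300 hours
days_per_study = hours_per_study / hours_per_day  # about 40 working days
print(f"{hours_per_study:.0f} hours, or about {days_per_study:.0f} working days, per case study")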

At this stage, no one knows precisely how (or how much) the engagement and impact assessment exercise might matter for the future, in terms of dollars, policies and new initiatives. Labor’s Senator Carr has made quite disparaging public comments about the exercise, putting its survival under a post-election Labor government in doubt. However, it is clear that the emphasis on achieving end-user value from public research investment, and on increasingly attracting industry investment into our nation’s research efforts, is here to stay.

Forward-thinking universities should not sit and await developments, but should drive and shape them. Universities that stand to benefit most from new funding flows that reward engagement and impact should be busy now, informing themselves about the returns on investment they have achieved and identifying how and where strategic government investment can leverage even better outcomes for taxpayers.

One of the most exciting opportunities arising from the creation of the engagement narratives and impact case studies is how they can be repurposed to inspire, inform and celebrate. The publicly released high-scoring case studies form the beginnings of a national library of inspiration and achievement for our research sector.

These success stories can help a potential industry partner understand what the road to impact looks like when working with universities; they should be broadcast to all levels of government to demonstrate how powerful and effective public policy can be developed when it is informed by rigorous research and expert advice; and they should inspire early-career researchers (and, for that matter, our schoolchildren) to imagine a fulfilling life making a difference through research, in partnership with others.

Dr Susie Robinson is the CEO of Higher Education Consulting Group (HECG), offering a range of advisory and implementation services within the higher education sector. We are proud to have co-developed impact case studies with university clients who secured outstanding E&I results. The HECG team can be contacted at [email protected].

Click the link below for full ranking results for all universities.
Table 3: Full ranking results

