Mixed reviews for research metrics

The Deans of Arts, Social Sciences and Humanities have strong, and sometimes sceptical, views of ARC research metrics

DASSH’s submission to the Australian Research Council’s review of Excellence in Research for Australia, and Engagement and Impact points to:

problems in ERA peer review: “Our members have found that many fields are reliant on narrow pools of reviewers, which casts doubts on whether meaningful conclusions can be drawn from the exercises.”

why EI: “University researchers who collaborate with industry already had strong engagement with a pathway to impact. The EI exercise simply encouraged better documentation of the activities, and there are likely better incentives for researchers to increase their collaboration with end-users rather than a stocktaking exercise.”

Overall: “Members generally doubt that the evaluations are well understood by government, industry and community. There does not appear to be much interest in research quality in policymaking, only in short term instrumentalist research. In any case, many members feel the year to year changes and trajectories implied by these are confusing and often artefacts of the process, so the exercise does not provide assurance of maintenance of growth in quality.”

There is way more here.

The Australian Academy of Technology and Engineering suggests the metrics are a solid base to build on

“ERA has largely succeeded in achieving its objectives, and is now an accepted benchmark for understanding and measuring Australia’s research achievement at the discipline level … EI is an important complement to the ERA, in capturing the impact of research and engagement with industry,” the AATE submission states.

The Academy proposes four enhancements:

* ensuring end-users “can effectively use and interrogate ERA and EI data”

* using the metrics’ data “to further promote impact and engagement”

* considering AI to automate the collection of research outputs (including data) and making them open access

* adopting open science to promote transparency of ERA and EI and improve measurement of impact and engagement. (“The rapid digitisation of science, technology and innovation is also driving change, leading to the emergence of the new ‘Open Science’ paradigm.”)

The AATE submission is here.

Get the word out early

The ARC plans to release submissions to the review after it is complete, which seems a bit late for a debate. So CMM will report on, and/or link to, as many submissions as it can – send them in, people.