Quality counts in computing metrics

The Australian Computing Research Alliance has made known its opposition to venue impact factors and rankings

“Venue rankings have limited value in comparing one research area with another; they do not discriminate between specialist and generalist venues, nor capture the distinct values of different venues, and they often replicate information contained in standard bibliometric tools,” the alliance of ANU, Uni Melbourne, UNSW and Uni Sydney announced the other day.

So what brought that on, CMM asked Uni Sydney, which issued the statement for the alliance.

“It was not in response to any disagreement in the discipline,” is the reply. “It simply reflects the commitment from the four computer science schools to not use metrics when evaluating performance in future and instead to focus on the impact and quality of the research.”

Good-o, although the Computing Research and Education Association of Australasia appears to disagree – it maintains databases ranking conferences and journals.

So what’s the issue? Emery Berger (U Massachusetts, Amherst), who runs a metrics-based computer science conference ranking, suggests, “while we might wish for a world without rankings, wishing will not make rankings go away.”

His ranking “is intended to be both incentive-aligned (faculty already aim to publish at top venues) and difficult to game, since publishing in such conferences is difficult. It is admittedly bean-counting, but its intent is to ‘count the right beans.’ ”

So, which unis make it in his ANZ rankings for 2011-22? Uni Sydney is first in the all-disciplines ranking, UNSW is second, Uni Melbourne fourth and ANU sixth.