
AUSTRALIA: First research exercise a mixed bag

Depending on who's doing the commenting, the research performance of Australian universities is either world-ranking or pretty damn miserable. The Excellence in Research for Australia report was released last Tuesday to a distinctly mixed reception, even though the assessment was a retrospective exercise and evaluated the performance of universities from 2003 to 2008.

Professor Margaret Sheil, Chief Executive of the Australian Research Council, which was the main agency for the ERA initiative, noted that the report offered for the first time a comprehensive evaluation of Australia's research achievements against those of its global peers.

Writing a commentary in The Australian newspaper, Sheil said: "The picture is impressive. In total, 65% of units were assessed as performing at world standard, including 21% above and 13% well above the rest of the world."

But the paper led its page one with a news story that declared: "More than two-thirds of Australia's universities have an overall research performance that doesn't reach international benchmarks, a finding that could have implications for more than A$1.5 billion in tertiary education funding..."

The Australian's higher education writers had done their own unusual analysis of the ERA report: by averaging scores across the different subject areas, they claimed that only 12 universities were performing research at or above international standard, with the top four well above that level.

"The remaining 29 out of 41 higher education institutions have an average research performance that does not meet international benchmarks," the reporters said.

The first of a large number of respondents to comment on the story, a 'David of Adelaide', pointed to what he saw as a fundamental flaw in the analysis: "The ARC has just spent years finding a way to judge university research by field of research so we can see where the real discipline-based excellence is located," he wrote. "And The Australian chooses to average the scores over all fields of research. You absolute cretins..."

Disagreement also arose over which universities topped the latest league table. The University of Melbourne was quick off the mark with a self-congratulatory release gushing that it had established itself "as the leading research university in Australia, topping the key indicators in the Excellence in Research for Australia report [which] shows Melbourne had the highest number of research disciplines ranked at the maximum possible - well above world standard".

"Of just over 100 research areas assessed by the ARC over a six-year period, 42 at Melbourne had the highest rating. Another 40 were rated above world standard and 20 at world standard; overall, 88 were above the national average."

The Australian National University, or ANU, which is funded separately by the federal government and has the biggest research base, had the next highest proportion of research above world standard at 79% followed by the University of Queensland, the University of New South Wales and the University of Sydney.

In its analysis, however, The Australian had the ANU at the top "with almost 70% of subjects rated at the highest possible level".

Whatever one's view of these differences in who outranks whom, Sheil was right to note that the ERA report drew together rich information about discipline-specific research activity at each institution, "as well as information about each discipline's contribution to the national landscape".

And, as she also said, it was a huge exercise that took into account the work of 55,000 individuals, collecting data on 333,000 publications and research outputs across 157 disciplines. In all, 2,435 areas in 40 institutions were assessed by committees composed of distinguished Australian and international researchers: "That is, those who know the field interpreted the data".

The committees had access to detailed metrics and a range of other indicators, including results of more detailed peer review of individual works held in online repositories.

Sheil also pointed to the fact that Australia has lagged behind its international counterparts in implementing a research evaluation system. South Africa has been running a ruler over its researchers for more than 20 years, the British first introduced theirs in 1986 and even New Zealand began one in 2003.

"Because of a long gestation, we have been able to use an Australian Bureau of Statistics classification system designed for Australasia and learn from problems elsewhere, consulting the best available expertise to assist in the design of the initiative, as well as using the latest advances in information tools and technology," she said.

As critics of these sorts of exercises have long pointed out, however, the winners nearly always come from older, high-status universities which deliberately attract top scholars who, in turn, win competitive research grants, especially in medicine and the sciences.

The Group of Eight research-intensive universities seemed to prove the truth of the old saying that "to him who has shall be given" by claiming the top eight rankings - at least according to The Australian's curious calculations.

Sheil admits there is a strong correlation between 'excellence' and areas that have won competitive research funding. As she says, the traditional strength of medical science is not surprising given that medical researchers have a separate funding council, a history of strong leadership and many successes, including most of Australia's Nobel laureates.

Debate over the value of such large-scale exercises and who really beat whom in these sorts of status stakes will continue. But the results of this assessment may have political and funding implications for all universities.

Releasing the report, federal Research Minister Senator Kim Carr said: "The ERA national report reveals for the first time exactly how our country's research efforts compare to the rest of the world. While we celebrate our successes, we must also acknowledge that we have areas where we could do better, and the government will use the report to identify ways to improve."

Last Thursday, however, Carr responded to The Australian's negative coverage of the report and the controversial averaging method it used to draw up a league table of universities.

In a commentary published in the paper, he declared the assessment had shown Australian researchers had won the equivalent of "239 gold medals and that gold medal equivalents were won by 25 institutions...while 22 institutions had more than half of their areas performing at or above world standard".

"We cannot just add up or average all ERA scores to rank a university," Carr wrote in answer to the paper's page one news story giving its own analyses of the results. "Averaging the ERA ratings and saying one institution is better at research is the academic equivalent of averaging the number of top five places in [Olympics] swimming with those in tae kwon do..."

Another critic was the deputy vice-chancellor for research at Deakin University in Melbourne, Professor Lee Astheimer. She joined a growing chorus of criticism of The Australian, saying the "quick and dirty analysis and the accompanying league table did not accurately reflect the significance of research in many of Australia's universities, including Deakin, that have strategically targeted resources towards research strengths".

"The Australian's analysis was trivial and simplistic," Astheimer said. "Most of our first year accountancy students would know that you don't just average averages, which is what The Australian did, ignoring the issues of weighted averages and research volume.

"Their faulty and ill-informed report has done a serious disservice to Australian universities. And, as Professor Margaret Sheil has already pointed out, ERA was never meant to be used to produce a league table and certainly not one as flawed as that produced by The Australian."

Astheimer also revealed the extent to which middle-ranking universities are striving to boost their research strengths - and their funding from government and competitive grant bodies - by filling dozens of new research-only positions.

"Since the last census date in March 2009, Deakin University has appointed over 200 research-only staff and a large number of excellent researchers into teaching-research positions," she said.

If they haven't already done so, other vice-chancellors and their deputies will be following suit very shortly.

geoff.maslen@uw-news.com

Comment:
The Excellence Research Branch of the Australian Research Council emailed the ERA Liaison Officers of all universities on the morning the ERA report was released saying, in part, "We would like to remind you that ... The 'ERA 2010 Citation Benchmark Methodology' remains a confidential document..."

Thus surely the only conclusion that can be safely drawn from the published data is that some Australian universities do more research in some subject areas than others! All the rest is speculation. Ho hum!!!

Adrian Gibbs

Comment:
It seems to me wonderfully ironic that the ERA, while adopting a model based heavily on the mystical value of numbers and statistics, can so patently misapply its bibliometric methodology to the making of judgements about the information and computing sciences.

While on the one hand ERA had previously acknowledged the acceptability of publishing in conferences for fast-moving disciplines with that publishing culture, such as computing, on the other hand it badges whole institutions as "below world standard" in Information and Computing Sciences, based on their not reaching critical mass in "SCOPUS-indexed" journal publications.

Its own data makes it clear that 75% of the reported computing publications were conference papers, so it is making judgements of quality based upon less than 25% of the output of that group of disciplines - given that SCOPUS does not index all computing journals either!

Rather than simply noting the exclusion of conference publications in the institutional analysis on page 271, it should have refrained from making any quality judgements for that set of disciplines on such a flawed basis.

"Even in New Zealand..." we have not treated our Information and Computing Sciences disciplines so shabbily in our Performance Based Research Fund ranking exercises.

Tony Clear
Associate Head of School, Computing and Mathematical Sciences & Associate Dean Research (acting), Faculty of Design and Creative Technologies, Auckland University of Technology, New Zealand