
Do we need to measure social and environmental impact?

Global university rankings have come a long way since the publication of the first Academic Ranking of World Universities, usually known as the Shanghai Ranking, back in 2003.

In addition to standard global rankings of research and teaching, we have seen an array of subject, age group, specialist and regional rankings. There is also a growing interest in measuring the social or environmental impact of universities.

It seems that universities are no longer expected to just provide instruction in academic subjects, to prepare students for professional and technical careers and to do research and innovation. They are now supposed to provide solutions to a vast array of social and economic issues, including, but not limited to, racial and gender equality, social justice and even the future of life on the planet.

The leading commercial rankers now seem eager to cater to these demands with the publication of university rankings that purport to measure universities’ commitment or contribution to sustainability.

GreenMetric and THE Impact Rankings

The pioneer of sustainability ranking was the UI GreenMetric ranking from Universitas Indonesia, which was first published in 2010. This has a fairly straightforward methodology with six indicators: infrastructure, energy and climate change, waste, transportation, water, and education and research. It is reliant on data submitted by institutions and participation is optional.

In comparison with the Times Higher Education (THE) Impact Rankings, it has received unfairly little attention and is largely ignored in East Asia. Last year, no Chinese universities were ranked and only one from Japan, although Ukraine, Kazakhstan, Iraq and Indonesia were well represented.

Nine years later, THE announced the first edition of its Impact Rankings, which were explicitly linked to the United Nations’ Sustainable Development Goals (SDGs). Participation was also optional and the rankings relied on institutional data, although there was also a substantial bibliometric element in each indicator.

It should be clear now that collecting data from institutions can be problematic. Even if everybody concerned is honest, there are many ways in which data can be distorted as it makes its way from institutes, departments and remote branches to be processed by the rankers, eventually emerging as misleadingly exact numbers.

The plausibility of some of the ranks in the Impact Rankings is further undermined by THE’s decision to count only each university’s top three SDG indicators, along with a mandatory partnership indicator, towards the overall score, so that universities can achieve a high overall score by emphasising a few of the SDGs and ignoring the others.
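A minimal sketch of that aggregation, for illustration only: the overall score counts the mandatory partnership indicator (SDG 17) plus a university's three best scores among the remaining SDGs. The 22% and 26% weights below are illustrative assumptions, not figures taken from THE's published methodology.

```python
# Illustrative sketch (assumed weights): overall score = mandatory SDG 17
# plus the three highest-scoring other SDGs; everything else is ignored.

def impact_overall(sdg_scores: dict[str, float]) -> float:
    """sdg_scores maps SDG labels (e.g. 'SDG3') to 0-100 scores."""
    partnership = sdg_scores["SDG17"]                # mandatory indicator
    best_three = sorted(
        (score for sdg, score in sdg_scores.items() if sdg != "SDG17"),
        reverse=True,
    )[:3]                                            # best three other SDGs
    return 0.22 * partnership + sum(0.26 * s for s in best_three)

# A university strong in just three SDGs outscores one with broad,
# middling coverage of all seventeen goals.
focused = {f"SDG{i}": 40.0 for i in range(1, 17)} | {
    "SDG3": 95.0, "SDG4": 93.0, "SDG5": 91.0, "SDG17": 90.0}
broad = {f"SDG{i}": 70.0 for i in range(1, 18)}
print(round(impact_overall(focused), 1))  # 92.3
print(round(impact_overall(broad), 1))    # 70.0
```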

The rapid expansion of these rankings, from 467 institutions in 2019 to 1,410 this year, combined with various tweaks and adjustments, means that they have been remarkably volatile. This year the University of Sydney has fallen from second place overall to 52nd, Hokkaido University has risen from the 101-200 band to 10th place and Western University from 52nd to third.

This sort of volatility casts serious doubt on the credibility of the whole enterprise. It does, however, seem that THE has found a very marketable formula and that many universities around the world are attracted by the idea of getting one or two scores that can be used for advertising or public relations.

Just a couple of hours before writing this, I noticed a big sign on the campus of the University of Leicester proclaiming that it was fourth in the United Kingdom for impact.

QS Rankings: Sustainability 2023

Now QS has boarded the sustainability train. It has already been publishing ESG ratings as an adjunct to its world rankings. The QS Sustainability Rankings, the first edition of which has just been announced, are less inclusive than the THE or GreenMetric rankings.

Universities need to be included in the QS World University Rankings, to produce a minimum level of research on sustainability-related topics and to have a clear climate action plan. Consequently, 700 universities are ranked, about half the number included in THE’s Impact Rankings.

The rankings have two sections. Environmental impact includes three ‘lenses’: sustainable institutions, sustainable education and sustainable research. Social impact covers equality, knowledge exchange and educational citizenship, impact of education, employment and opportunities, and quality of life.

QS uses data from a variety of sources for these rankings. Some data, such as faculty and gender ratios, are submitted by universities. Other sources include responses to its academic and employer reputation surveys and statistics from various government agencies.

This procedure is likely to lead to complaints of ‘data scraping’ and might be compared unfavourably with the alleged rigour of THE’s approach. But given the problems of institutional data collection in many countries, this approach does have something to recommend it. Also, it reduces the amount of work that universities are required to do for the ranking agencies.

However, there is a lot of national level data here such as UNESCO statistics on student mobility or OECD data on gender and pay and income inequality. In effect, it seems that universities are rewarded not for anything they do or do not do but for being located in a country that does or does not do those things.

Western bias

The published results of the new QS rankings show a strong bias towards Western universities. The United States, United Kingdom and Australia are there in disproportionate numbers while the number of Japanese, Egyptian, Indian, Iranian, Pakistani and Russian universities is much lower than in the THE rankings.

There are 37 Chinese universities compared to 13 in the most recent THE Impact Rankings and none in the GreenMetric rankings, but they are much lower down the ladder than in the QS World University Rankings. Peking, Tsinghua and Fudan universities are 12th, 14th and 34th respectively in the world rankings but in the sustainability rankings they are 118th, 171-180 and 261-280.

It seems one function of these rankings is to cover up the rapid advance of China, and perhaps a few other Asian countries, in research in the natural sciences and engineering.

Some of the metrics show signs of a distinct Western bias. One example is the impact of education indicator, which counts academic reputation scores in “education, politics, social policy, law, art & design”, which “were felt most keenly to align with the social progress and impact that a good education can advance”. One wonders exactly who was feeling this so keenly.

While the rankings generally favour English-speaking countries, within those countries they suggest areas where the established elite may be falling behind rising stars. Here, the University of California, Davis is ahead of Harvard, Edinburgh and Glasgow are ahead of Oxford, and University College Cork and the National University of Ireland Galway are ahead of Trinity College Dublin.

Rankings have justifiably come in for a great deal of criticism. While it is not a bad idea to look at university quality through different lenses, it might be advisable for the popular rankers to focus more on the fair and accurate assessment of the core missions of teaching and research.

It is hard to avoid the suspicion that products like the QS Sustainability Rankings are providing compensation for the failure of Western universities to meet the scientific challenge from China.

Richard Holmes is editor of the University Ranking Watch blog.