Don't take too much notice of rankings

Indian Prime Minister Manmohan Singh recently chastised Indian universities for having no institutions in the ‘top 200’ of the global higher education rankings. He sees this poor showing as an indication of the low quality of Indian higher education.

Indian authorities also said that only overseas universities in the global ‘top 500’ would be permitted to establish a branch campus or joint-degree programme in India.

Other countries use the global rankings for internal purposes. Singapore uses them as a benchmark and as an indicator of where scholarship students may be sent.

Russia has bemoaned its poor showing, has provided extra funding for selected universities and is considering major additional resources for a few more, in order to ensure that several reach the top ranks soon.

Kazakhstan is committed to having a university in the top tier and looks to rankings as a guideline. At least one American university president has been offered a salary bonus if his university improves its rank. The list goes on.

Anatomy and critique

There are, of course, many rankings. Most are national and some are specialised. The majority are sponsored by magazines and other for-profit organisations. Many, if not most, are worthless, because their methodologies are flawed or there is no methodology at all.

Prime Minister Singh and most of the countries mentioned here refer to the three well-known international rankings. Two of these – the Academic Ranking of World Universities, popularly known as the ‘Shanghai rankings’, and the World University Rankings of Times Higher Education – are methodologically respectable and can be taken seriously.

But these rankings are quite limited in what they measure and thus provide only an incomplete perspective on higher education and on the universities that are ranked.

The Shanghai rankings are quite clear in what is assessed – only research, research impact and a few variables related to research such as prizes awarded to professors and numbers of Nobel winners associated with the institution.

Times Higher Education measures a wider array of variables. Research and its impact are at the top of the list, but reputation is also included, as are several other variables such as teaching quality and internationalisation. But since there is no real way to measure teaching or internationalisation directly, weak proxies are used.

Reputation is perhaps the most controversial element in most of the national and global rankings. Even asking a selected group of academics and university leaders for their opinions about which universities are best yields questionable results.

How much will physicists in Bulgaria or university rectors in Germany know about the quality of universities in India or Russia? It is not surprising, therefore, that only the Indian Institutes of Technology are ranked. They are among the few Indian institutions receiving international attention.

In general, the more reputation is used as a key variable, the less accurate a ranking is likely to be. Further, respondents filling out reputational surveys for rankings will judge an institution on its research reputation – teaching excellence, national relevance or university-university linkages are not part of most respondents’ knowledge base.

In addition, certain kinds of research receive the greatest attention – the research that appears in recognised international refereed journals. The journals chosen for inclusion in the Web of Science citation indexes – principally the Science Citation Index and the Social Science Citation Index – and a few others are considered ‘legitimate’.

This limitation dramatically privileges publication in English, the language of the vast majority of internationally recognised journals. Further, research that adheres to the norms and values of editors and peer reviewers, who are mainly in the top Western universities, will tend to get published.

The hard sciences receive much more attention than soft fields such as the arts and humanities. Universities that are strong in technology, life sciences and related fields have significant advantages.

Distortions

Many outstanding institutions worldwide do not appear in the rankings because they do not fit the specific criteria measured. In general, specialised universities, other than those in technology, do not do well.

America’s elite liberal arts colleges, by most accounts offering some of the best quality education in the world, are nowhere to be found. Universities that do not have engineering or medicine are probably underrated.

Most important, perhaps, are the disadvantages faced by developing and emerging economies. Researchers there do not have easy access to the top journals, must write in English and, above all, must choose topics and methodologies that appeal to editors and reviewers in the central academic powers.

The usefulness of rankings

To an extent, the rankings provide a way of benchmarking for the small number of research universities worldwide. Looking carefully at the structures, governance, funding and academic cultures of the universities that do well in the rankings can yield useful lessons.

Even though the budgets of the research superpowers can seldom be matched, and few institutions can hope to equal their access to top international talent, there are global academic practices that may yield insights.

For India, or other developing countries, to obsess about the rankings is a mistake. There may be lessons, but not rules.

It is much more important that a balanced and differentiated academic system emerge; as part of such a system, a few universities may in time aspire to the middle or even the upper reaches of the rankings.

To limit academic cooperation to those universities that are listed in the global rankings is also a mistake, since many outstanding institutions do not fit the rankings model but nonetheless may be excellent global partners.

When it comes to universities, one size does not fit all. The global rankings measure just one kind of academic excellence, and even here the tools of measurement are far from perfect.

* Philip G Altbach is J Donald Monan, SJ University Professor and director of the Center for International Higher Education at Boston College in the United States. This article was first published in The Hindu.