On 8 March, the latest edition of the QS World University Rankings by Subject was released. This best-of list is compiled by comparing the evaluated universities across numerous academic subject areas.

Publisher QS has included the performing arts field since 2016. In that first year, the mdw placed 2nd among 100 arts universities worldwide; this year it is ranked 5th. These results make the mdw the best-ranked Austrian university in its field. It is certainly cause for pride that an Austrian university has been ranked so highly against so many international competitors, and one sincerely hopes that the mdw will continue to enjoy this kind of success. But for all the happiness about this great result, one should not fail to note several important limitations of such rankings.

International university rankings generate considerable mass-media resonance by seeming to provide a simple answer to the (actually quite complex) question of which universities are the world’s “best”. All too eagerly, their results are taken to be reliable indicators of universities’ academic performance and international standing relative to one another. A high ranking is viewed as a seal of quality for degrees as well as research output. In Austria, these lists still play a rather minor role in prospective students’ choice of study programme compared with the Anglo-American world and Asia. But policy initiatives such as the Austrian Federal Chancellor’s recent “Plan A”, which seeks to have at least three Austrian universities place among the worldwide top 100, show that rankings are indeed taken seriously here at the highest political level.

Even so, rankings depict only a very small cross-section of the broad spectrum of qualities that come together to make a university what it is. Those who compile them, after all, are attempting to describe complex institutions using just a few quantitative indicators. In the case of QS, these are derived largely from surveys among academics and employers. Much information is lost in the process, including the dimension of teaching as well as a university’s effects on the society in which it is embedded. Rankings structurally underrepresent such factors.

Above and beyond the oft-voiced criticism of their methods, there is the fact that rankings compare institutions that simply cannot be compared with one another. Small, specialised universities, for example, are compared with large institutions covering a broad range of subjects. Private universities financed largely by tuition fees are lumped together with publicly financed ones. And country-specific circumstances, such as Austria’s open-access policy, are ignored. All this amounts to a competition for the best rank under highly unfair conditions, as if a race car from a cash-flush factory team were entered in the same event as a commuter bus.

The publishers who compile these rankings are responding to such criticism, at least to an extent. The trend is now toward more detailed and differentiated lists that rank specific subjects, universities within specific geographic regions, and newer universities. In subject-based rankings such as the QS Ranking by Subject, the results are accordingly a bit less blurry: such a list refrains from throwing all of a university’s subject areas, with their respective strengths and weaknesses, into the same pot. A critical attitude toward international rankings and their frequently error-prone methods nevertheless remains appropriate. One should also pay attention only to the most current results and refrain from comparing them with past scores, because the editors frequently change how the various indicators are calculated, often without saying so. Such changes in methodology often cause year-to-year fluctuations in rank that have nothing to do with an institution’s actual academic performance. It is in this light that one should view the mdw’s current ranking.

The point, in any case, is to be circumspect and exercise some perspective when interpreting such results. And here, the universities are receiving support: a working group set up by uniko is now dealing with this topic in depth and has recently published a handbook and launched a website on the proper handling of rankings.


Johannes Sorz

Johannes Sorz, of the University of Vienna’s Office of the Rectorate, is the author of the uniko handbook on dealing properly with university rankings mentioned here. For further information on this topic, see: www.universityrankings.at
