
Darts and sciences

University rankings can lure international students, but some experts fear the league tables are unreliable, writes Yojana Sharma

PUBLISHED : Monday, 15 April, 2013, 12:00am
UPDATED : Monday, 15 April, 2013, 10:09am

Students intending to study abroad are so used to looking up where universities stand in global league tables that it is easy to forget the rankings are just a decade old - the first was released by Shanghai Jiao Tong University in 2003, creating a huge international buzz.

Since then there has been a profusion of international rankings, published mainly by newspapers and magazines - such as Times Higher Education (THE) and The Guardian in Britain, and US News & World Report in the United States - as well as by commercial organisations such as QS.

International rankings have become so popular that organisations vie with each other to release new lists every year, or sub-lists with a tweak - THE's first Asian regional ranking was released on April 10, and QS is planning a Latin American ranking based on "methodology more appropriate to that region".

The main global league tables are timed to coincide with the university applications season - some students wait for the rankings before deciding where to apply.

But do the different rankings, which come out year after year, actually tell students what they want to know? And are they reliable?

Most global rankings "are quite limited in what they measure and thus provide only an incomplete perspective on higher education and on the universities that are ranked," says Philip Altbach, director of the Centre for International Higher Education at Boston College in the US. "The Shanghai rankings are quite clear in what is assessed - only research, research impact, and a few variables related to research, such as prizes awarded to professors and numbers of Nobel winners associated with the institution."


So, for example, the London School of Economics is notably absent from the Shanghai rankings, and a Hong Kong student seeking an undergraduate place in design or architecture would not be well served by the lists.

Highly rated US liberal arts colleges also tend to be absent.

"Universities that do not have engineering or medicine are probably undercounted," says Altbach. "Universities that are strong in technology, life sciences and related fields have significant advantages."

The main reason is that research is easily quantifiable - measured in terms of publications in international journals, citations and patents. But even this is only a crude measure of university performance - some of the world's top scientists are not in universities but in research establishments, such as Germany's Max Planck Institutes - one reason German universities underperform in global rankings.

For some years, several Malaysian universities boycotted international rankings, not wanting to be judged by arbitrary criteria set by Western commercial rankings organisations. Nonetheless, the Malaysian government quietly took note of what pushed universities into the top echelons, and realised it meant funding international research collaborations. Malaysian universities have since rejoined the rankings, from a strengthened position.

Research does dominate, admits Phil Baty, rankings editor for THE, speaking at a recent conference in Dubai. "There is no such thing as a correct ranking - they are all subject to the subjective views of the compilers," he says unapologetically.

"We are aware of the fact that composite ranking scores are inherently crude" and they reduce "complex activities of universities to a single number," he adds.

Among the quantifiable criteria are teacher-student ratios, which favour big research universities; endowments, which favour the richest and oldest institutions, particularly the US Ivy League; and the number of foreign students and professors, which pushes some Australian universities higher in the rankings than comparable institutions in Japan. Even a university's visibility on the internet counts, gauged using webometric data.
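Baty's point about the crudeness of composite scores can be made concrete. The sketch below is a hypothetical illustration, not the actual formula of THE, QS or any other ranker: the indicator names, weights and normalisation bounds are invented for the example. It shows how a handful of normalised indicators and a compiler's chosen weights collapse into a single number - change the weights, and the "correct" ranking changes with them.

```python
# A minimal sketch of how a composite ranking score might be computed.
# All indicator names, weights and bounds below are hypothetical
# illustrations, not any ranking organisation's real methodology.

def composite_score(indicators: dict[str, float],
                    weights: dict[str, float],
                    bounds: dict[str, tuple[float, float]]) -> float:
    """Normalise each raw indicator to a 0-100 scale, then take a weighted sum."""
    score = 0.0
    for name, weight in weights.items():
        lo, hi = bounds[name]
        # Min-max normalisation against the cohort's worst and best values.
        normalised = 100 * (indicators[name] - lo) / (hi - lo)
        score += weight * normalised
    return score

# Hypothetical profile of one university.
university = {
    "citations_per_paper": 12.4,
    "staff_student_ratio": 0.09,
    "international_students": 0.31,
}

# Hypothetical weights - the compiler's subjective choice, as Baty concedes.
weights = {
    "citations_per_paper": 0.60,
    "staff_student_ratio": 0.25,
    "international_students": 0.15,
}

# Hypothetical cohort minima and maxima used for normalisation.
bounds = {
    "citations_per_paper": (0.0, 20.0),
    "staff_student_ratio": (0.02, 0.15),
    "international_students": (0.0, 0.50),
}

print(f"Composite score: {composite_score(university, weights, bounds):.1f}")
```

Shifting even a few percentage points of weight from citations to internationalisation would reorder a table built this way, which is why compilers' choices, rather than any objective standard, decide who comes out on top.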

But aspects such as teaching quality, graduate employability or student quality - measured by entry grades, which in turn depend on the quality of the school system - are less easy to measure and harder to compare across countries and cultures.

"The first thing everyone asks for is subject rankings. Students, for example, want to know what the best university in the world is for mechanical engineering," says Martin Ince, convener of the QS advisory board. QS published its first global subject rankings in 2011, two years after its first Asian ranking.

QS was one of the first to start regional rankings using slightly different weightings for Asia, including employability based on recruitment surveys and the numbers of exchange students.

Despite the tremendous focus on the institutions in the top 10 or 20 of world rankings, "a regional ranking that is not dominated by the Harvards and Oxfords is more relevant to students who want a quality international education but don't want to go around the world", says Ince.

Hong Kong University of Science and Technology topped the QS Asian ranking, even though the University of Hong Kong was the highest-ranked Asian university in the QS global rankings the same year - which may well confuse rather than help students.

But even in Asia it is debatable whether universities in Afghanistan can be compared using the same criteria as universities in Hong Kong, Singapore or Japan.

"Oxford can be compared to Harvard and can definitely be compared to Cambridge, but can it be compared to Polish universities?" says Jan Sadlak a former head of a university in Poland, and now head of the International Observatory on Academic Ranking and Excellence, which is conducting an audit of three of the rankings systems to ensure they are measuring what they say they are measuring.

That rankings need an audit should ring alarm bells for students. A number of US universities have already been accused of "gaming" the rankings by awarding more top-class degrees or massaging the student-professor ratio.

Rankings organisations have been forced to provide some of the less easily available data students seek, but they often fill the statistical gaps with less rigorous surveys of employers, students and academics, and these have come under scrutiny.

"There is this issue of how credible is the survey, how you select your sample and the questions you ask. Verification is not easy," says Sadlak.

Drew Faust, president of Harvard University, which regularly tops international rankings, says: "Different rankings measure different aspects of universities. We always recognise these rankings are set by groups that have particular perspectives and they are not necessarily always consonant with our perspectives.

"When we think about what is important at Harvard we think of how well we are teaching our students, how much they are learning; how much are their experiences [here] contribute to giving them the kind of foundation that enables them to be effective and productive citizens in the world and lead meaningful lives; how much research are we accomplishing in the variety of schools across the university that contributes significantly to human knowledge - what are we doing for our community and our nation and the world."

Institutions at the top of the rankings tree may be confident that they are getting it right, but other universities have come under pressure from policymakers to become more like the top universities.

The so-called "reputation rankings", based on surveys asking academics which universities they think are the best, are particularly heavily criticised.

"What the reputation ranking is measuring is does the university have a world brand," says Frank Ziegele at the Centre for Higher Education in Gütersloh, Germany. The centre is involved in developing U-Multirank, a ranking system funded by the European Union and one which relies less on research performance.

Rankings are influential. But reputation rankings may be unduly so. According to Ziegele, simply being high on a reputation ranking can increase a university's reputation even more, regardless of quality or performance.

life@scmp.com
