Recipe for a ranking

University rankings are treated as evidence of the excellence of each institution of higher education. But what are the rankings based on?

Whenever new ranking lists are published, universities are energised. Our university is this high on the list of the world’s top universities! Did we go up or down in the ranking?

The rankings are big news: they interest the media and influence the way people think about universities, whether we want it or not. There is less talk about what the rankings actually measure.

The best-known and oldest university rankings are those produced by Shanghai Jiao Tong University, the Times Higher Education magazine and the QS company. The three are not easily comparable with one another. The only element they share is research citations, which account for between 20% and 30% of each university’s score.

The Shanghai list rewards universities for Nobel laureates and Fields Medallists, and emphasises the natural and medical sciences. Times Higher Education and QS, meanwhile, rely heavily on surveys of each university’s general reputation. They also award points for the international outlook of universities and monitor how many teachers they have in relation to students.

A newer and lesser-known listing is the National Taiwan University ranking. It is based exclusively on research success and is consequently particularly suited to comparing research-focused universities.

FOR WHOM AND WHY?

Who are the university rankings intended for? The first global university ranking, the Academic Ranking of World Universities, compiled by the Shanghai Jiao Tong University, was launched in 2003. China wanted to send its students abroad and needed to know what the best destinations would be. China was especially interested in the hard sciences and medicine, which is reflected in the results of the ranking: universities without particular competence in these fields can only dream of the top positions in the Shanghai ranking.

Since then, ranking lists have become interesting to students from other countries as well.

 “International student surveys have found that students choosing universities primarily refer to ranking lists. But at the end of the day, information from peer groups is more important,” says Markku Javanainen, advisor at the University of Helsinki’s Institutional Research and Analysis.

 “Rankings are also useful for the universities themselves. It’s good to know where you stand in the international arena.”

The Shanghai Ranking (ARWU)

Published by the Shanghai Jiao Tong University since 2003.

The University of Helsinki’s latest ranking: 56th in 2017.
 

Awards 30%:
Points are awarded to universities where Nobel laureates and Fields Medallists have studied or worked. The University of Helsinki still benefits from A.I. Virtanen’s Nobel Prize in 1945 and Lars Ahlfors’ Fields Medal in 1936, as well as from being the university where Bengt Holmström completed his Bachelor’s degree. Since 2017, Ahlfors’ points have been awarded to the University of Helsinki; they were previously allocated to Harvard by mistake.

Research citations 20%:
The number of times a publication is cited by other research indicates its significance. Citations for research publications from different universities are collected from the Clarivate Analytics (formerly Thomson Reuters) Web of Science and Elsevier Scopus databases; the Shanghai ranking uses Web of Science. Research in languages other than English is underrepresented in these databases. Nevertheless, citations are the University of Helsinki’s strong suit in all of the rankings.

Highly cited researchers 20%:
The top 1% of researchers in each field are listed in the Clarivate Analytics list of highly cited researchers. The University of Helsinki had eight researchers on the latest list. The list of highly cited researchers is constantly in flux. If a researcher switches universities, the original university loses the points. The University of Helsinki’s highly cited researchers: https://www.helsinki.fi/en/research/highly-cited-reseachers

Number of publications in Nature and Science 20%:
Nature and Science are among the world’s most famous scientific journals and authorities in the natural and medical sciences.

All previous elements in relation to the number of academic staff 10%:
The goal is to even out the impact of size differences among universities, as the sketch below illustrates. This criterion also assumes that every teacher conducts research of the highest possible quality.
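As a rough illustration of the per-capita arithmetic (with invented universities and figures, since the rankings keep their exact scoring formulas to themselves), the combined indicator score is divided by the size of the academic staff, so a small but productive university is not drowned out by a large one:

```python
# A minimal sketch of per-capita normalisation, using invented figures.
universities = {
    "Large U": {"indicator_total": 900.0, "academic_staff": 5000},
    "Small U": {"indicator_total": 300.0, "academic_staff": 1000},
}

for name, data in universities.items():
    # Divide the combined indicator score by the number of academic staff.
    per_capita = data["indicator_total"] / data["academic_staff"]
    print(f"{name}: {per_capita:.3f} points per staff member")

# Large U: 0.180 points per staff member
# Small U: 0.300 points per staff member -- higher despite a lower total
```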

TEACHING AT THE SIDELINES

Even though the rankings are interesting to students, none of them offer comprehensive information on the teaching provided at each institution. A few years ago, the OECD tried to draft indicators for teaching quality, but the project fizzled out as teaching is very difficult to measure.

According to Javanainen, the biggest problems with the rankings have to do with evaluating teaching. Because the number of research publications and their citations can be measured objectively, the rankings tend to emphasise research at the expense of teaching.

 “Good research and teaching do correlate to some degree, so the rankings do also say something about the quality of the teaching. A university with good research is also likely to provide good teaching. However, individual teachers may find themselves in a position where conscientiously preparing their teaching means less time for research.”

A hardworking teacher with no time to publish research is nothing but dead weight in most rankings: research achievements are divided by the number of academic staff, but they are not adjusted for the number of students.

 

Times Higher Education (THE)

Published by Times Higher Education, a commercial magazine. Published in its current format since 2010.

The University of Helsinki’s latest ranking: 90th in 2017.

Teaching 30%:
Includes a reputation survey in which researchers from different universities are asked which universities provide the best teaching in their field. The number of doctoral degrees and the university’s income are calculated in relation to the number of academic staff.

Research 30%:
Includes a reputation survey in which researchers from different universities are asked which universities produce the best research in their field. In addition, publications and the amount of external funding are calculated in relation to the number of academic staff.

Research citations 30%:
In 2015, Times Higher Education switched to Scopus as its ranking database, which had a favourable impact on the University of Helsinki’s ranking. Scopus covers somewhat more humanities and social sciences research than Clarivate Analytics’ Web of Science.

Research income from the private sector 2.5%:
Research income gained from the private sector in relation to the number of academic staff. Universities specialising in technology, economics and medicine tend to fare better in this area than a general university such as the University of Helsinki.

Number of international staff and students 7.5%:
The international outlook means diversity, new stimuli and influences. However, success in this criterion is heavily influenced by the university’s location and ability to attract international students through financial incentives. The University of Helsinki tends to rank poorly in this section.

COMMENSURATE MEASUREMENTS?

Another problem with the rankings is the lack of commensurate and reliable data. There are better indicators than the ones used today, but they would not provide data of equal quality from all countries. For example, the scope of Bachelor’s and Master’s degrees varies so much that counting such degrees reveals nothing. Student feedback is problematic because of cultural differences and discrepancies in the underlying information.

There is also a certain clandestine element to the rankings, as they are not based on open data and cannot be independently reproduced.

 “The ranking institutions try to keep their methodology and figures to themselves to stop others copying their ranking and to obscure any mistakes,” says Javanainen.

Wealthy American universities top the rankings year after year, but they are not very interested in the lists.

 “Rankings are particularly interesting among the mid-tier universities who are trying to boost their standing,” Javanainen muses.

The QS Ranking

Previously part of the Times Higher Education ranking; split off in 2010. Published by the company Quacquarelli Symonds.

The University of Helsinki’s latest ranking: 102nd in 2017.

Reputation 50%:
In the reputation survey, employers and researchers from other universities are asked their opinions of various universities. Some universities have tried to manipulate the survey by asking researchers to praise them. Reputation tends to accumulate with the better-known universities, and it is difficult to overtake them.

Citations 20%:
The ranking uses the Scopus database for calculating citations. In 2015, the methodology and point allocation were changed in a way that was not beneficial for the University of Helsinki.

Ratio of students to teachers 20%:
Even though the University of Helsinki has more students than the top American universities, it tends to fare well in this criterion.

Number of international staff and students 10%:
This is the University of Helsinki’s weakest suit in this ranking.

DIFFERENT FIELDS

In addition to the overall rankings, the ranking organisations publish field-specific lists. These say more about a university to prospective applicants than its placement in the overall ranking. However, field-specific results are not publicised as widely as the general ranking.

 “It’s easier to announce a single number that reflects the whole organisation. That’s an easy topic for news articles that get clicks. But it’s a good idea to check the field-specific ranking as well,” advises Javanainen.

Field-specific rankings should be compared between institutions, not across fields within the same university. It is more difficult to reach the top of a heavily researched field than of a less studied one, just as it is harder to win a world championship in football than in ice hockey.

What you measure is what you get, they say. Is this also true of university rankings?

 “If success in the ranking becomes an end in itself, it stops being a good indicator of success. This is known as Goodhart’s law, and it is also applicable to university rankings. We can’t let the rankings lead the University,” says Javanainen.

 

The Taiwan Ranking (NTU Ranking)

Published by the National Taiwan University since 2007.

The University of Helsinki’s latest ranking: 81st in 2017.

Research articles 25%:
Points are awarded separately for the number of articles over the previous 11 years and for the number published in the most recent year.

Research citations 45%:
This criterion tracks the number of citations over the previous 11 years, the number of citations over the past two years, and the average number of citations over the past 11 years. Points are also awarded for the h-index, which condenses a body of work’s citation record into a single number.
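The h-index itself is simple to compute: a researcher or institution has an h-index of h if h of its publications have been cited at least h times each. A minimal sketch, with made-up citation counts:

```python
def h_index(citations):
    """Return the h-index for a list of per-paper citation counts."""
    counts = sorted(citations, reverse=True)
    h = 0
    # The h-index is the largest rank h at which the h-th most cited
    # paper still has at least h citations.
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have 4+ citations each
print(h_index([25, 8, 5, 3, 3]))  # 3: no four papers have 4+ citations each
```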

Highly cited articles 15%:
Based on InCites’ Essential Science Indicators database, which covers approximately 11,500 key journals from various fields of science. The criterion highlights the most-cited 1% of articles for each year of the past 11 years.

Articles published in influential journals 15%:
The impact of the journals is determined with InCites’ Journal Citation Reports tool, which relates the number of citations received by articles published in each journal to the total number of articles published. Journals in the top 5% of their field are considered high-impact.
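The underlying arithmetic is a citations-per-article ratio with a percentile cut-off. A sketch of the idea with invented journals and figures (not the actual Journal Citation Reports calculation, which uses fixed citation windows and other refinements):

```python
from collections import defaultdict

# Invented journals and figures, for illustration only.
journals = [
    {"name": "Journal A", "field": "chemistry", "citations": 120000, "articles": 4000},
    {"name": "Journal B", "field": "chemistry", "citations": 9000, "articles": 1500},
    {"name": "Journal C", "field": "history", "citations": 800, "articles": 400},
    {"name": "Journal D", "field": "history", "citations": 300, "articles": 500},
]

# Impact: citations received relative to the number of articles published.
by_field = defaultdict(list)
for journal in journals:
    journal["impact"] = journal["citations"] / journal["articles"]
    by_field[journal["field"]].append(journal)

# Within each field, journals in the top 5% by impact count as high-impact.
for field, members in by_field.items():
    members.sort(key=lambda j: j["impact"], reverse=True)
    cutoff = max(1, round(0.05 * len(members)))  # at least one journal per field
    for journal in members[:cutoff]:
        print(f"High impact in {field}: {journal['name']} "
              f"({journal['impact']:.1f} citations per article)")
```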

This article was published in Finnish in the Yliopisto magazine in October 2017.

 

Question, don’t ignore

 “I normally say that we don’t really believe in the rankings, but it’s nice when we do well,” says Arto Mustajoki, professor emeritus of Russian language.

Mustajoki worked with rankings when he served as the vice-rector of the University of Helsinki and the dean of its Faculty of Arts. He was also involved in an EU project which evaluated the ranking lists and tried to develop better criteria for them.

Mustajoki says that university rankings should be viewed with healthy scepticism, but not ignored. Despite their problems, the current rankings have attained an established position.

 “We can’t wave these rankings aside and claim that they have no point. Internationally they have great significance. In Asia, a university’s ranking can determine whether anyone wants to work with them.”

Mustajoki’s favourite among the current rankings is the one released by U.S. News, even though it is less well known in Finland, because it tries to consider the specific characteristics of each field. The U.S. News ranking also includes books, a common form of publication among humanities scholars. In the Shanghai ranking, for example, research in the humanities is, to all intents and purposes, invisible.

Mustajoki would like rankings to admit openly that they only compare the research achievements of universities. Teaching and social impact are absent, as making equal comparisons of these factors between different countries would be practically impossible.

 “It would be amazing if we could find a way to measure the respective competences of a Finnish and German Master’s degree holder. Such methods have been attempted, but with poor results.”

The level of research can be reflected in the quality of teaching, if top researchers manage to bring their students to the forefront of research. A high level of research is also a good draw for the best students.

Rankings are largely based on research visibility. That isn’t purely a bad thing. Reaching the broadest possible academic audience is also an objective worth pursuing from the perspective of the researcher’s career and the general development of scholarship.

 “If I think back on my career and about the things I would do differently, I would publish less but in better journals. I would also advertise my publications more.”