Cambridge is the best university in the world, but Oxford is the best university in the UK. Bear in mind, however, that University College London is better than Oxford. Confused? Welcome to the world of university rankings.
The past fortnight has seen the publication of two worldwide university rankings that offer conflicting analyses of the state of higher education in the UK.
The first, published last week and compiled by QS, painted a rosy picture of British universities. Four of the top ten were British (with UCL above Oxford) and 19 of the world’s top 100 universities were from the UK. To top it all off, Cambridge knocked Harvard off the top spot.
One week later, and British universities were no longer feeling so smug. In the THES rankings, just five universities made the top 50 and only 14 were in the top 100. To compound the misery, Harvard was once more ensconced at number one.
To confuse matters further, both these rankings conflicted with the national university rankings. The Times and Guardian university rankings agree that Oxford is the best in the UK, even though it ranks behind Cambridge in both world rankings and behind UCL in QS’s.
According to the Times, Durham is one place better than UCL. But according to the THES rankings, UCL is better than Durham — by 88 places.
The reason behind these conflicting results is simple: the rankings use vastly different criteria. QS uses a survey of academics, the number of citations, graduate employment rates, student-faculty ratios and the number of international students to build its rankings.
Such an approach has been heavily criticised, not least by our own David Blanchflower:
Almost a third of the score is based on the student-to-faculty ratio and the proportion of both international faculty and overseas students, which is laughable as they tell us zero about quality. Other questionable measures that are used underweight the importance of current scholarship. This is an index that penalises the best to help the mediocre. We should judge our universities on the quality and quantity of the research that they produce. Period.
He’s scathing about the results of the survey, too:
The UK is not home to four of the top ten universities in the world, sorry.
Blanchflower favours the THES’s new approach, which relies heavily on citations. While citations are certainly indicative of research quality, research quality does not necessarily indicate a good university — at least not from the student’s view.
Having a world-class professor in your department does not necessarily equate to a world-class education. Being able to write a good book is no indication of whether or not a professor can give an excellent lecture or competently run a seminar.
It’s for this reason that the Times’s ranking takes the National Student Survey (NSS) into account. The NSS asks students how satisfied they are with their education. If a student is satisfied, the thinking goes, then they must have received a good education. Thus the university is deserving of a higher ranking.
But students at different universities have vastly different expectations. Those near the bottom of the satisfaction league — such as the London School of Economics and Manchester — are often at the top of overall rankings. Plus, students know that by criticising their university in the NSS, they are affecting its ranking and thus the reputation of their own degree. Professors have been known to pressurise students into giving good feedback for this very reason.
So, which ranking is best? Well, none of them. Each gives a broad idea of a university’s strengths or weaknesses, but should be taken with a wheelbarrow of salt. Publishing “woe is me” articles because only five universities made it into the top 50 is merely a way of ignoring the broader issues for higher education in Britain today.
If people stopped talking about rankings and concentrated instead on coming up with a viable funding model, our universities would improve massively — and the rankings would take care of themselves.
Duncan Robinson also blogs here.