If you’re thinking of attending the University of Toronto, or any Canadian university, beware of the 2021 QS World University Rankings, published by QS Quacquarelli Symonds (QS). In this year’s rankings, U of T earned 26th place in the world and first place among its Canadian peers. On one hand, these rankings are a reason for growing pride for the university; U of T President Meric Gertler recently described them as marks of the university’s “continued excellence.” On the other hand, a quick glance at the rankings’ methodology raises a number of questions.
There are six elements that are taken into account: academic reputation, weighted at 40 per cent of a university’s final score; employer reputation, weighted at 10 per cent; faculty-to-student ratio, weighted at 20 per cent; citations per faculty member, weighted at 20 per cent; and the ratios of international to domestic students and of international to domestic faculty, weighted at 5 per cent each. While arguments can be made for individual consideration of each of these factors, together — at their current weightings — they present a distorted and even misleading picture of the educational quality at Canadian universities.
Firstly, reputation — worth 50 per cent of U of T’s score — is a fuzzy, highly subjective measure of institutional quality. It is also unclear what, if anything, an institution’s reputation really means for students, particularly for undergraduates. In a 2014 survey of chief information officers hiring new IT employees at 270 Canadian companies, only 10 per cent of the officers placed significant weight on a candidate’s school. Troublingly, research suggests that employers only paid significant attention to a candidate’s school if their last name was perceived to have non-English roots.
Moreover, in contrast to the stratification of postsecondary schools in the United States and elsewhere — where, not incidentally, many of these rankings have their origins — there is no clear hierarchy of prestige among Canadian universities. There exist, of course, prestigious programs within certain universities — such as health sciences at McMaster University or engineering science at U of T — but the QS system has produced a meaningless analysis in Canada by so heavily weighing “reputation.”
Citations per faculty member are another curious and deceptive measure of educational quality. Even if we assume citation figures to be an appropriate metric of outstanding research, many university educators in Canada are primarily engaged in teaching and publish seldom, or never.
This is particularly true for smaller universities like Acadia University or Mount Allison University, which focus primarily on undergraduate education. As a result, the QS ranking methodology unfairly privileges large institutions with extensive research activity — like U of T — while punishing smaller, undergraduate-focused universities.
The 10 per cent combined weighting of the international-to-domestic student and faculty ratios may also disadvantage smaller Canadian universities. To be fair, this weighting likely stems from the often valid assumption that many of the world’s best universities attract a large percentage of their students and faculty from abroad. In Canada, though, when international students and faculty choose a university, they must consider a wide range of factors with little direct bearing on questions of academic excellence. These include language of instruction, cost of living, access to employment, climate, support for recent arrivals, and proximity to other members of their homeland's community.
On many of these counts, U of T excels largely due to its location at the heart of greater Toronto and southern Ontario. In contrast, Sudbury’s Laurentian University — with outstanding programs in earth sciences and human rights, but in a wintry and isolated location — attracts markedly fewer international students and faculty.
Those who trust the QS rankings — including prospective undergraduate students — are often unaware of these biases, which actively distort the strengths and weaknesses of Canadian universities. So if the QS rankings are clearly a poor tool for comparing educational quality across Canadian universities, what might help?
I would encourage prospective students of all kinds to search for three kinds of information.
Firstly, look for data on the amount of money a university invests per student in a program. If necessary, consider appropriate proxy information like the size per student of a program’s endowment. These figures will give you a clear sense of how much each program will spend — or is able to spend — on your university education.
Secondly, search for data that speaks to alumni satisfaction with your program. For instance, as a part of its rankings of colleges and universities, the U.S. News & World Report tracks the percentage of alumni who donate annually to their alma mater. Similar or equivalent data will offer a broad view of the extent to which graduates truly appreciate their education.
Thirdly, seek out data that describes the employment outcomes for recent graduates of your chosen program. This data will likely have little or nothing to do with the university’s score on the QS ranking. Instead, it’ll reflect the fundamental utility of the skills students gain over the course of their studies.
At a time of rapid change in our economy and in education, this is a useful reminder that, for all of the fretting over rankings and prestige, what remains the most important about a university is what we learn while we’re there.
Michael McCulloch is a third-year student in the Lassonde Mineral Engineering program.