The University of Toronto ranked 17th overall in the Times Higher Education Rankings and ninth overall in the rankings of the Higher Education Evaluation and Accreditation Council of Taiwan. Both rankings place U of T as the top Canadian university.

“Any institution that makes it into this table is truly world-class,” said Ann Mroz, editor of Times Higher Education. American and English universities dominate the rankings, joined by three Canadian universities: U of T, McGill, and UBC.

These rankings come shortly after McGill University was the only Canadian school to place in the top 25 of the Quacquarelli Symonds World University Rankings. U of T is not particularly fussed by this disparity.

“This year, QS is the only big international ranking where McGill outranked U of T,” said U of T President David Naylor in an interview. “The survey data from QS clearly show that U of T has a stronger reputation among academics globally than any other Canadian university.”

Naylor attributed McGill’s QS showing to two factors: McGill has more international students than U of T, having drawn for years from the eastern seaboard of the US, and U of T, unlike McGill, adjusts its faculty counts downwards for part-time faculty. “That gives them a much higher score on student-faculty ratio, and in itself accounts for the difference in our QS rankings.”

“Response biases in surveys are another problem,” added Naylor, who explained it is particularly difficult for smaller schools to receive attention in these ranking schemes. “Most systems under-recognize amazing small undergraduate universities, such as one finds on the east coast of Canada and the US.”

Naylor also pointed to missed opportunities abroad as another factor affecting rankings. “One thing that hurts us is […] that Ontario does not provide per-student grants to international graduate students. B.C. does it, and that really helps UBC in recruiting.”

Naylor acknowledged that additional discrepancies exist as a result of differences in methodology. “Every ranking system has biases as well as sources of imprecision […] Quirks illustrate why no one should take any individual ranking too seriously or get too excited about year-to-year variation. Small differences in methods of counting different items make big differences in how a school does.”

Naylor added that there also seems to be no standard way of normalizing the data, causing further discrepancies. “Some progress is being made by adjusting for citations of papers rather than numbers of papers, as well as adjusting for the size of the university in relation to the research outputs. But even then, there’s disagreement. Do we adjust for numbers of faculty alone, or should one also adjust for the numbers of graduate students and post-doctoral fellows?

“But in the final analysis, no single rank or score can ever reflect a big complex place like a university with all its strengths and weaknesses.”


Full interview with David Naylor

The Varsity: Why do you think McGill University beat the University of Toronto in this year’s QS report and in previous years’ THES-QS reports?

David Naylor: This year, QS is the only big international ranking where McGill outranked U of T. On the HEEACT rankings from Taipei, the ARWU from Shanghai, and now the new THES rankings, U of T stands first in Canada. The reason for McGill’s strong showing in the QS relates to two factors. A minor factor is the weighting of the number of international students. McGill has more international students than U of T, in part because it has drawn for years from the eastern seaboard of the US. The biggest factor by far, however, is how McGill counts faculty for the QS rankings. U of T and some other Canadian schools such as UBC adjust downwards for part-time faculty. For whatever reason, McGill does not. That gives them a much higher score on student-faculty ratio, and in itself accounts for the difference in our QS rankings.

If you check our website, you’ll see that we became aware of this difference and its impact a couple of years ago. We have worked with University of British Columbia and other universities, including University of Alberta, to make these counts as accurate as possible. That’s the right thing to do.

The Varsity: What can U of T do to overtake McGill?

David Naylor: In the QS rankings, nothing – unless QS tells them to adjust their faculty counts properly. The THES group insisted on adjusting faculty counts this year and that, along with some other improvements in methods, helped U of T’s score. In fairness, McGill probably has its own concerns about the THES methods this year. These quirks illustrate why no one should take any individual ranking too seriously or get too excited about year-to-year variation. Small differences in methods of counting different items make big differences in how a school does. That may also explain why, when you look at the University of Alberta or some of the other research-intensive universities in Canada, their THES ranks this year are much lower than in other years or in other systems.

The Varsity: Another model that has been historically mentioned is that of national versus international elite universities. Many use the analogy that Yale and Oxford are akin to domestic elite universities while Harvard and Cambridge are international elite universities (for instance, Yale is known for educating many American leaders, e.g. George Bush, while Harvard is known for educating many foreign leaders, e.g. Felipe Calderon). Furthermore, many would argue that historically, prior to the Quebecois separatist movement, McGill and U of T followed the same national and international roles for Canada. Do you agree with these classifications and statements, and why?

David Naylor: All these generalizations are fun, but they don’t hold much water. Oxford is every bit as international as Cambridge. The Yale-Harvard distinction is a bit more persuasive, but Yale is way more international than many other famous universities in the US and globally.

The Varsity: Some argue that today, U of T is merely known as Toronto’s city school. Do you see an opportunity for U of T to gain in brand reputation, given that the Towards 2030 reports and frameworks point to making U of T a national elite university as well, e.g. by recruiting the best and brightest from across Canada?

David Naylor: The survey data from QS clearly show that U of T has a stronger reputation among academics globally than any other Canadian university. Our international student recruitment is also growing steadily. However, as we’ve said in the Towards 2030 reports, U of T needs to do more pan-Canadian recruiting and keep working on our public profile in other countries. One thing that hurts us is the fact that Ontario does not provide per-student grants to international graduate students. BC does it, and that really helps UBC in recruiting international graduate students.

The Varsity: Do you believe these rankings are biased, based on their methodology? Some common criticisms include: bias towards research, not teaching; towards quantity of research, not quality; towards larger schools, not smaller schools; towards English and American universities; and towards the social sciences over the natural sciences, or even vice versa. Do you think these are pertinent?

David Naylor: Every ranking system has biases as well as sources of imprecision. Most systems under-recognize amazing small undergraduate universities, such as one finds on the east coast of Canada and the US. As to quantity versus quality of research, some progress is being made there by adjusting for citations of papers rather than numbers of papers, as well as adjusting for the size of the university in relation to the research outputs. But even then, there’s disagreement. Do we adjust for numbers of faculty alone, or should one also adjust for the numbers of graduate students and post-doctoral fellows? Then we’re into the combination of imprecision and bias, because some systems like QS don’t standardize the counting of faculty! The humanities are almost always underweighted because it’s much harder to count books and book chapters. Metrics of commercialization or industry funding are far easier to count than the social impact of big ideas from social scientists and humanists or business school professors. Response biases in surveys are another problem. And everyone can argue about how the different items get weighted. The Times effort this year is the best I’ve seen so far in terms of an international multi-dimensional survey. But in the final analysis, no single rank or score can ever reflect a big complex place like a university with all its strengths and weaknesses.

The Varsity: Thank you very much.