
Broken ladders? Why Rhodes snubs rankings

Rhodes University in South Africa doesn’t throw its own hat in the rankings ring – but that doesn’t mean you can’t find it there.

Drawing on publicly available data, many rankings bodies slot its name into their league tables, possibly above or below competitors who actively engage with the lists.

So why do so many universities see value in rankings while others avoid them?

Explaining Rhodes’ aversion to rankings in an article for The Conversation, Sioux McKenna, director of the Centre for Postgraduate Studies at the university, said the methodology underpinning university ranking systems “would be unlikely to pass as a third year student’s research project”.

“And yet high-status universities around the world spend time and money competing in this extravaganza rather than pointing out that the Emperor is wearing no clothes,” McKenna said.

Beyond reputation and prestige, and the associated marketing opportunities that come with rankings, there might also be stronger ties to international student revenue than once thought.

In 2018, consultancy firm Studymove reported that international student fees at Australian universities had risen in tandem with institutions' improvement in the QS rankings.

"It is not a surprise that there is a relationship but it is surprising how strong the correlation is," the authors wrote in their report.

They suggested that to international students, the rankings justify the fee hikes.

On top of this, rankings readily get media attention – each year, Campus Review publishes the results of the major rankings players and they sit among our most popular articles. But McKenna said she finds it worrying that the sector is “held captive by these glitzy spectacles”.

“Imagine if a student indicated that their research project would be to develop a ranking of all universities. They would allocate 20 per cent to whether current students and the general public thought the university was prestigious, 5 per cent for the number of Nobel Prize winners on the institution’s staff, 30 per cent for the number of research publications, and so on.

“Any academic advisor would throw the proposal out.

“Some of these criteria are subjective. The weightings are arbitrary, important aspects of many universities are missing and the averaging of unrelated aspects to a final number is simply poor science which does not tell us much about the institution at all.

“And yet this is exactly how rankings are determined.”

McKenna isn’t the only South African academic concerned with the role rankings play in the higher education sphere.

In an earlier Conversation piece, Ahmed Essop, a research associate in higher education policy and planning at the University of Johannesburg, cautioned the country’s universities against making rankings their raison d’être.

“They should focus on building a quality higher education system that is responsive to the challenges that face South Africa in the 21st century,” Essop said. “This requires a diverse and differentiated higher education system based on institutional collaboration rather than the market-driven competition that results from participation in global rankings.”

And the South African pair have allies in this stance in Singapore.

An article published early last year in Singaporean newspaper Today revealed how several academics in non-STEM fields at the National University of Singapore (NUS) and Nanyang Technological University (NTU) had left or were intending to leave their universities because of the “incessant pursuit of rankings and the relative lack of academic freedom when it comes to certain projects or research initiatives”.

NUS was this year ranked 25th in the world by Times Higher Education (THE) and 11th by QS. It slid from 67th to 80th on the ShanghaiRanking table. NTU was Singapore’s silver medallist on all tables, coming in at 47th according to THE, 13th according to QS and 91st according to ShanghaiRanking.

Quoted in the Today article, political scientist Woo Jun Jie, who left NTU in December 2018, said: “The real problem is a systemic overemphasis on research outputs over other forms of innovative academic activities, be they pedagogical innovation or community service.”

The article was soon removed by the publisher. In a statement, an NUS spokesperson said the article did not adequately represent the university's position on the matter and that it was seeking legal advice.

"Ranking is not a driver of change at NUS,” the statement read.

Opponents of universities’ heavy focus on rankings also have friends in the UK. A 2017 report from the Higher Education Policy Institute (HEPI) cautioned governments, universities and students against using international rankings to make decisions about higher education.

In International university rankings: For good or ill, HEPI argued that the data used to compile the three major world university rankings – THE, QS and ShanghaiRanking – was unreliable.

“Universities supply their own data and the compilers of the data accept the data as supplied,” wrote report author and HEPI president Bahram Bekhradnia. “Worse, there is no effective attempt by the compilers of the rankings to audit or assure the quality of the data.”

Among the potential benefits of rankings, Bekhradnia noted in his report that they provide information to potential students, policymakers and other stakeholders, and offer benchmarking information for universities themselves.

“[Rankings] are a stimulus to improvement both as a result of such benchmark information and because of the incentives to improvement that competition provides.

“For example, Jan Konvalinka, Vice-Rector of Charles University, Prague, has said that the availability of international comparisons might enable him to shake his staff out of their complacency by showing that, although they are first in their own country, they are way down in international terms – a sentiment often repeated by institutional leaders.”

Still, Bekhradnia said the only way universities improve their standing on league tables is by improving research performance, driving them to prioritise research at the expense of teaching, widening participation and outreach. He added that governments, including in France, Germany, Russia and China, have changed their policies explicitly in response to the performance of their universities in international rankings.

“One important – perhaps the most important – function of universities is to develop the human capital of a country, and enable individuals to achieve their potential," he said.

"Most institutions should be focusing on their students, and a ranking scheme that takes no account of that cannot, as all do, claim to identify the ‘best’ universities. In their anxiety to rise up the international rankings, universities have prioritised research over other activities.”

McKenna said many universities in South Africa are now chasing rankings, while others, Rhodes among them, have no choice but to be included.

Rhodes sits in the 800-1000 band in the QS World University Rankings 2021 and in the 901-1000 band in this year’s ShanghaiRanking’s Academic Ranking of World Universities. It doesn't appear on the 2021 THE table.

“Despite having among the highest undergraduate success rates and publication rates in South Africa, the lack of medicine and engineering programmes works against it,” McKenna said. “So too does its strong focus on community engagement and its small size – though these might be exactly why the university is a good fit for many.

“The dodgy methodology is stacked against the institution but far more problematically, it’s stacked against most of the purposes set for higher education in South Africa’s white paper of 1997.

“Nowhere do these systems concern themselves with transformation, social justice or the public good.”

And, if a presentation at the 2016 Australian International Education Conference is anything to go by, at least one former member of the THE team agrees that rankings don’t take all the strengths of a university into account.

Duncan Ross, data and analytics director at TES Global, the company that owned THE magazine at the time, said: “If the primary purpose of your university is teaching, you probably will never end up in any of these lists.

"It doesn't make you a worse university, it just makes you a different university. There is a good reason why international university rankings look at research. The answer is simply [because research data] is more easily obtained across international boundaries. Citations are international by nature. We can measure those relatively easily.”

Ross added that if THE stopped publishing rankings the following day, another player would readily take its place.

“Once rankings are out there, there's not much you can do about it in terms of turning the clock back. What you can do, however, is make sure that rankings are as accurate and as useful as possible.”

Some three years later, THE released a set of new performance tables that assessed universities against the United Nations’ Sustainable Development Goals (SDGs).

THE said the new table was the world’s first global attempt to document evidence of higher education impact and represented "a radical new way of looking at university excellence" that went beyond the focus of traditional rankings.

Phil Baty, chief knowledge officer at THE, said the work was more than just a ranking. “It is the start of an initiative that we will continue to develop in consultation with universities, academics and sector groups. The need and desire for clear, new metrics on impact is strong and, while this is a highly challenging area of data collection, it’s also an important and necessary step forward.”
