A critical analysis of the QS World University Rankings reveals the many flaws and blatant inconsistencies in the world's most popular university ranking system
A plethora of rankings exist that tell us where IIT Bombay stands among the universities of the world. One of the most popular, the QS World University Rankings 2013 by Quacquarelli Symonds (QS), recently ranked IITB the 233rd best university in the world overall and the 56th best in Engineering and Technology. Among the Indian rankings, we stand 3rd in the country according to The Outlook’s rankings, while we haven’t appeared in India Today’s rankings for the past 3 years.
The IITs’ failure to make the top 200 in the QS World University Rankings made headlines yet again, with media statements renouncing faith in the Indian education system. Amidst this, a recent IIT Council meeting resolved to pay more attention to the rankings and to concentrate on branding and marketing the IITs. We at InsIghT, however, took a step back to critically analyse whether the rankings really give us a true picture to judge ourselves by. We have come to realize that we have just opened a Pandora’s box.
[pullquote]How relevant is the proportion of international students to an institute, which is primarily funded by taxpayers with an expectation to serve national interest?[/pullquote]
What the Rankings look at
Different agencies use different parameters to arrive at their respective rankings. For our analysis in this article, we have just taken the most popular of them all, the QS World University Rankings. The various parameters that QS looks at and the respective weightages are as shown in the pie chart.
The first question one could ask is how good these parameters are at estimating the worth of a university. For instance, how relevant is the proportion of international students to an institute which is primarily funded by taxpayers with an expectation to serve national interest? Secondly, the relative weightage assigned to the various parameters is highly vulnerable to criticism. It wouldn’t be very wrong to assume that employability is the most important factor for most undergraduates. A point that was raised repeatedly in our discussions pertained to the disproportionate weightage of 20% to Student-Faculty Ratio as opposed to a meagre 10% to Employer Reputation, substantiating our assumption.
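For readers without the pie chart at hand, QS’s published 2013 methodology weights the indicators as follows: Academic Reputation 40%, Employer Reputation 10%, Student-Faculty Ratio 20%, Citations per Faculty 20%, International Faculty 5%, International Students 5%. A minimal sketch of how the composite score is assembled from these weights (the indicator scores below are hypothetical, for illustration only):

```python
# QS 2013 indicator weights, per QS's published methodology.
WEIGHTS = {
    "academic_reputation":    0.40,
    "employer_reputation":    0.10,
    "student_faculty_ratio":  0.20,
    "citations_per_faculty":  0.20,
    "international_faculty":  0.05,
    "international_students": 0.05,
}

def composite_score(indicator_scores):
    """Weighted sum of normalized (0-100) indicator scores."""
    return sum(WEIGHTS[k] * indicator_scores[k] for k in WEIGHTS)

# Hypothetical normalized scores -- not any real institute's figures.
example = {
    "academic_reputation":    60.0,
    "employer_reputation":    80.0,
    "student_faculty_ratio":  30.0,
    "citations_per_faculty":  40.0,
    "international_faculty":  10.0,
    "international_students": 10.0,
}
print(round(composite_score(example), 2))
```

Changing the weights, of course, reshuffles the final ordering entirely, which is precisely why the choice of weightage matters so much.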
Different people and universities have widely different priorities, and a one-size-fits-all solution is hardly appropriate. A ranking based on a priority list that is not our own may not be so relevant to us. However, in order to standardize, there seems to be no way out but to stick to some set of parameters and gulp down the bias of the rating agency that comes with it. Just to illustrate, it so happens that IIT Bombay’s world rank shoots up from 233 to 65, much ahead of all other Indian institutes, when just Employer Reputation is considered, and it goes off the ranking list altogether when considering just the Student-Faculty Ratio. But, before we begin to sound like the fox with the sour grapes, let’s move on and assume for now that the parameters and the weights used by QS are agreeable.
Reliable Data or Utter mess?
A recent news report carried by NDTV suggested that the IITs didn’t make it to the top 200 because they refused to pay a huge sum of money (in lakhs of dollars) to the ranking agency for participation, and that the rankings suffered because outdated publicly available data was used instead. Although no ranking agency was named, the reference to the top 200 seemed to strongly suggest QS. Intrigued, we investigated further and found the news report to be untrue. Neither is the institute charged by QS for participating in the rankings, nor is the data used by QS outdated publicly available data.
[pullquote]It so happens that IIT Bombay’s world rank shoots up from 233 to 65, much ahead of all other Indian Institutes, when just Employer Reputation is considered and it goes off the ranking list when considering just the Student-Faculty Ratio.[/pullquote]
QS obtains its data from three sources: the data for Academic Reputation and Employer Reputation comes from its own surveys conducted across universities and employers; citation data is sourced from Scopus; and the rest is sourced directly from the universities themselves. Among the four parameters that together contribute 90% to the rankings, Academic Reputation and Employer Reputation seem a complete black box, except that, at first glance, there appears to be a significant bias towards Western representation in the surveys conducted. Since we don’t really know much about the exact survey methodology, we will refrain from commenting any further on these two parameters.
The data for the calculation of Student-Faculty Ratio (SFR), which has 20% weightage, is sourced directly from the respective institutes. With our current SFR of 15.3, we are far away from the ideal 10 and thus suffer hugely in the rankings on that count. According to QS, “Student Faculty Ratio is, at present, the only globally comparable and available indicator that has been identified to address the stated objective of evaluating teaching quality.”
Whether SFR really is the best indicator of teaching quality is up for debate, but even assuming so, we noticed potential problems with this. Even with the guidelines given by QS, the definitions of student and faculty are extremely fuzzy and are prone to possible opportunistic interpretations by the respective institutes.
[pullquote]“It is common practice for some universities to include post-docs or research staff in the faculty count, resulting in a better student-faculty ratio,” says Prof. Devang Khakhar, our Director.[/pullquote]
To validate our guess, we checked the SFRs of NUS, IITD and IITB. According to QS, the normalised scores for SFR are 81.4, 35.4 and 28.9 respectively.* We also obtained the exact faculty and student counts from official sources+ and arrived at ratios of 17, 18 and 15 respectively. QS’s normalized scores and the actual raw data show no correlation whatsoever. Although QS’s data suggests otherwise, we are actually better off than both NUS and IITD in student-faculty ratio. “It is common practice for some universities to include post-docs or research staff in the faculty count, resulting in a better student-faculty ratio,” says Prof. Devang Khakhar, our Director. We rank significantly better than IIT Delhi in terms of both Academic Reputation and Employer Reputation; SFR is one parameter where IITD has an edge over us in the final rankings, which, as it turns out, is not entirely accurate.
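The mismatch can be checked mechanically: whatever QS’s exact normalization function is, it should decrease with SFR, so the institute with the lowest raw SFR should get the highest score. Using the figures above (2012 data; the check is a minimal sketch, not QS’s actual computation):

```python
# Raw SFRs from official sources vs QS's normalized SFR scores (2012).
# If the normalization is any decreasing function of SFR, the institute
# with the lowest SFR should have the highest normalized score.
data = {
    "NUS":  {"sfr": 17, "qs_score": 81.4},
    "IITD": {"sfr": 18, "qs_score": 35.4},
    "IITB": {"sfr": 15, "qs_score": 28.9},
}

best_by_sfr   = min(data, key=lambda k: data[k]["sfr"])       # best raw ratio
best_by_score = max(data, key=lambda k: data[k]["qs_score"])  # best QS score
print(best_by_sfr, best_by_score)
```

The two answers disagree: IITB has the best raw ratio, yet NUS gets the best score, which is impossible if all three institutes reported their counts on the same definitions.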
Citations per Faculty
‘Citations per faculty’ is another parameter where we lose out significantly to IIT Delhi. In fact, we rank lower than all the other old five IITs and IIT Roorkee. When asked about the surprisingly low figures, the Director pointed out that the QS figures hugely under-represent our actual figures. Since QS fetches the data independently through the Scopus database, the anomaly was rather puzzling.
A visit to the Dean R&D office offers a possible explanation. All papers, on submission, need to specify an affiliated institute where the research was undertaken. IIT Bombay does not enforce a standard convention for stating the name (which some of the other IITs do enforce), so variants of ‘IIT Bombay’ – such as I.I.T. Bombay, IIT Powai and IITB – appear as different institutes in a Scopus search. The Dean R&D office fetches all our citation data from Scopus using a query with 18 different variations of the name string ‘IIT Bombay’, while the same cannot be expected from QS. If true, this could have put our figures in serious jeopardy.
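To see how much the variant problem matters, consider a minimal sketch of unifying affiliation strings before counting citations. The variant list and citation counts below are illustrative only (the Dean R&D office’s actual query uses 18 variations, which we do not reproduce here):

```python
import re

# Hypothetical subset of name variants for 'IIT Bombay' -- illustrative,
# not the actual 18-variant list used by the Dean R&D office.
VARIANTS = [
    "IIT Bombay",
    "I.I.T. Bombay",
    "IIT Powai",
    "IITB",
]

def canonical_affiliation(raw):
    """Map any known variant to a single canonical institute name."""
    cleaned = re.sub(r"[.,]", "", raw).lower().strip()
    for v in VARIANTS:
        if re.sub(r"[.,]", "", v).lower() == cleaned:
            return "IIT Bombay"
    return raw.strip()

def citations_by_institute(records):
    """Sum citation counts per canonical affiliation."""
    totals = {}
    for affiliation, cites in records:
        key = canonical_affiliation(affiliation)
        totals[key] = totals.get(key, 0) + cites
    return totals

# Made-up citation counts, purely to show the effect of the variants.
records = [
    ("IIT Bombay", 120),
    ("I.I.T. Bombay", 45),
    ("IIT Powai", 30),
    ("IIT Delhi", 80),
]
print(citations_by_institute(records))
```

A naive search matching only the exact string ‘IIT Bombay’ would credit 120 citations in this toy dataset instead of the full 195 – exactly the kind of under-counting a variant-unaware agency query could produce.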
[pullquote]We rank significantly better than IIT Delhi in terms of both Academic Reputation and Employer Reputation. SFR is one parameter where IITD has an edge over us in the final rankings, which as it turns out is not entirely accurate.[/pullquote]
While we have analysed only the QS Rankings, the other rankings are likely to be very similar or, going by the popular sentiment on the Internet, much worse. Does this mean that they are absolutely useless and should be completely ignored? Perhaps not. In the absence of any alternatives, these rankings are indeed the best available indicators of the global standing of universities, especially when one is looking at applications for higher education and there is no other way to compare universities. However, in order to judge ourselves, these rankings should perhaps be taken with, not a pinch, but a handful of salt.
* This is the data for the year 2012; we were unable to obtain all the data for 2013. Scores are normalized out of 100. We also do not know anything about the normalization function except that it should be negatively correlated with SFR.
+ IIT Council website and PRO’s Office
Improve our rankings – at what cost?
Very recently, just as we were getting ready to publish this article, we came across an article in the Indian Express titled Rank Inconsistency by Prof. Gautam Barua, the former director of IIT Guwahati, where he expressed his reservations about using the rankings as a measure of our performance. With tongue firmly in cheek, he says, “If we want Indian institutions to get appreciably higher QS and THE rankings, we must allow the institutions to do the following:
- Spend heavily to aggressively market the institute among academia and corporations in the US and Europe;
- Substantially increase the number of foreign students. The government must allow undergraduate admissions, allow assistantships for foreigners and remove ceilings on incomes for foreign faculty;
- Hire a large number of temporary “teachers” to boost the faculty-student ratio (which counts the number of “academic staff”, and which apparently is done by many US universities); and
- Create a network among Indian institutions to encourage citations of papers of other Indian institutions, that is, scratch each others’ backs.”
None of these would perhaps be of any benefit to us except to improve our rankings. Is it still worth spending time, energy and money on them?
Rankings in the IIT Council meeting
The major issue discussed in the recent IIT Council meeting was the global standing of the IITs. A press release stated that although the undergraduate engineering programmes of the IITs are among the best offered globally, there is scope for improvement on composite indicator rankings. The IIT Council seems to be taking an active interest in understanding how we are ranked, so as to work on the areas where we lack. A committee of IIT Directors is looking into the issue and is already in touch with QS ranking officials to understand the methodology of the ranking agencies and systems, as per the press release.
The Gossip Section
IIT Bombay’s absence from India Today’s rankings puzzled many. In June 2012, when India Today published their rankings, there was a footnote: “IIT-Bombay is not featured in this ranking as it did not share the factual data on time.” The then-PRO, Ms. Jaya Joshi, registered a strong objection to the statement in a letter to the Editor of India Today. The letter stated that IIT Bombay was never approached by any representative regarding the data for the 2011 or 2012 rankings, and that the last time IIT Bombay was contacted was in 2010, when they were provided the required details. The Public Relations Office says they never received a reply to the letter, and IIT Bombay hasn’t participated in India Today’s rankings since.
References and Credits
Special thanks are due to Ms. Madhuri Wankhede at the PRO office for all the hours of digging through emails and documents to get all the required data.
Statistics of IITs – Website of the Council of Indian Institutes of Technology – www.iitsystem.ac.in
Statistics of NUS – www.nus.edu.sg/about-nus/overview/corporate-information
Quacquarelli Symonds (QS) – www.iu.qs.com/university-rankings/world-university-rankings/
QS World Universities Rankings – www.topuniversities.com/university-rankings/world-university-rankings/2013
Updates and Edits
September 2014: Reworded the introduction to Prof. Gautam Barua’s comments to indicate more clearly that his comments were meant not to be taken seriously, but only to point out how little it would tangibly benefit the IITs to direct their energies towards improving their rankings.