Tag Archives: university rankings

Ireland’s performance in QS World University Ranking, 2010 – 2015

So the QS World University Ranking 2015 is out, and for what it’s worth this is what it looks like for Ireland’s HEIs from 2010 to the present. The QS methodology is heavy on the reputation surveys, clocking in at 50% of the total, over which the institutions have very little influence themselves – and government, the HEA, the IRC, or whoever else has even less. The remaining 50% is made up of faculty/student ratio (20%), international staff and international student ratios (5% each), and citations per faculty (20%).
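To make the arithmetic of those weights concrete, here is a minimal sketch of a QS-style composite score computed as a weighted sum. The indicator values below are invented purely for illustration, and the real methodology normalizes each indicator against the whole field before weighting.

```python
# Minimal sketch of a QS-style composite score (weights as described above).
# The institution's indicator values are invented, purely for illustration.

WEIGHTS = {
    "reputation_surveys": 0.50,      # academic + employer reputation surveys combined
    "faculty_student_ratio": 0.20,
    "international_staff": 0.05,
    "international_students": 0.05,
    "citations_per_faculty": 0.20,
}

def composite_score(indicators):
    """Weighted sum of indicator scores, each assumed already normalized to 0-100."""
    return sum(WEIGHTS[name] * score for name, score in indicators.items())

example_hei = {
    "reputation_surveys": 60.0,
    "faculty_student_ratio": 70.0,
    "international_staff": 80.0,
    "international_students": 75.0,
    "citations_per_faculty": 50.0,
}

print(composite_score(example_hei))  # 61.75
```

Note how little of the total is open to quick improvement: even a large gain in citations per faculty only moves 20% of the composite, while half of it sits in reputation surveys that institutions can barely touch.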

Citations are probably the one thing institutions can do something about more readily than the other areas (i.e., improving them doesn’t require new hires), but note that the bibliometric indicators and databases used to calculate such scores are biased against some areas in which Ireland does well, notably the humanities.

[Figure: Irish HEIs in the QS World University Ranking, 2010–2015]

The University Times in Trinity has some brief coverage, noting the fall over time. What’s probably worth saying here is that in this case, as with the Times Higher Education World University Ranking (previous post), rankings are zero-sum: even if an Irish HEI’s performance stayed the same or even improved, that would not necessarily be sufficient to maintain or improve its ranking. What’s key is to improve faster than everybody else. It’s the archetypal Red Queen scenario from Through the Looking-Glass:

“Well, in our country,” said Alice, still panting a little, “you’d generally get to somewhere else—if you ran very fast for a long time, as we’ve been doing.”

“A slow sort of country!” said the Queen. “Now, here, you see, it takes all the running you can do, to keep in the same place. If you want to get somewhere else, you must run at least twice as fast as that!”

It’s not enough simply to change or improve, because this is only sufficient for survival. So how do Irish HEIs compete in a world like that?
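As a toy numerical illustration of that zero-sum point (all scores below are invented), an institution’s score can rise while its rank falls, because the rank depends entirely on how everyone else has moved:

```python
# Toy illustration (invented scores): rank can fall even when a score improves,
# because a ranking is relative to how fast everyone else is moving.

def rank_of(name, scores):
    """1-based rank of `name` when institutions are ordered by descending score."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(name) + 1

last_year = {"HEI_A": 50.3, "HEI_B": 49.0, "HEI_C": 48.0}
this_year = {"HEI_A": 51.2, "HEI_B": 53.0, "HEI_C": 52.5}  # everyone improved; B and C improved faster

print(rank_of("HEI_A", last_year))  # 1
print(rank_of("HEI_A", this_year))  # 3 -- score up, rank down
```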

[Note: Maynooth University is still labelled NUIM as I am slow to change, so apologies to Kildare-based colleagues. For DIT, Maynooth, and UL, the positions shown reflect the upper value of the bandings in which they sit.]


“The obsession with rankings”, presentation to the World Bank, 28/1/15

Presentation given by Prof. Ellen Hazelkorn to the World Bank in Washington, D.C., last week, on 28/1/15: The obsession with rankings in tertiary education – World Bank Presentation. It gives an overview of the most topical issues:

  1. Putting rankings into context.
  2. Ranking higher education – advantages and disadvantages.
  3. Do rankings measure what counts?
  4. What does the research tell us?
  5. Implications for policy.
  6. Dos and don’ts, specific actions, and alternatives to rankings…

Policy Brief: “Rankings and Quality Assurance: Do Rankings Measure Quality?”

Policy Brief by Prof. Ellen Hazelkorn on Rankings and Quality Assurance, produced for the Council for Higher Education Accreditation (CHEA) International Quality Group, Washington, D.C.: CIQG Policy Brief No. 4, January 2015. The brief covers:

  • The growing influence of academic rankings.
  • What are rankings?
  • Rankings and quality assurance.
  • What’s next?

Follow CHEA on Twitter: @ciqgnews.

Rankings in Institutional Strategies and Processes project report launched

The RISP report was launched in Brussels at the European University Association on Thursday 6th November, 2014. From the University World News article by Ellen Hazelkorn: “Rankings in Institutional Strategies and Processes, RISP, is the first pan-European study of the impact and influence of rankings on European higher education. Co-funded by the European Commission and led by the European University Association, the project provides an insight into how rankings impact and influence European universities’ behaviour and decision-making.”

The full text of the report is now available for download from Dublin Institute of Technology’s Arrow repository at this link.

Global University Rankings and national sovereignty

From HEPRU’s Prof. Ellen Hazelkorn on rankings and how they cannot capture the full complexity of higher education:

There are about 18,000 university-level institutions, according to the International Association of Universities. For Ireland, being in the top 500-600 represents the top 3 per cent of all universities worldwide. If we believed in rankings this is something of which we should be immensely proud.

It has been estimated that the annual budget of a top-100 university is about €1.7 billion. This would consume almost all the annual budget for Irish higher education, which is about €2 billion. To pursue a national strategy based on rankings would require diverting the entire budget to a single university on an ongoing basis – because one injection of funding would not be sufficient.

But money isn’t the only issue at stake. It’s not evident that the indicators used by rankings measure what is meaningful. Thus, to shape our national higher education policy and priorities according to indicators chosen by (commercial) ranking organisations would constitute the abandonment of national sovereignty. Why should we do this?

Full article on the Irish Times website: “Devil is in the detail of global university rankings.”

Some points on Ireland’s Times Higher Education World University Ranking ‘slide’

Andrew Gibson

First some facts, then a tiny bit of analysis. We’ll make this quick. We read that Trinity College Dublin has slipped in the ranking from 129th position to 138th. From the data that THE makes available, however, TCD’s overall score has actually improved, from 50.3 to 51.2:

[Screenshots: TCD’s scores in the THE World University Ranking, 2013–14 and 2014–15]

This is significant, especially when we consider that the overall score is not simply the aggregate of the five variables listed above. Alex Usher of Higher Education Strategy Associates notes in a recent post: “But this data isn’t entirely transparent. THE […] hides the actual reputational survey results for teaching and research by combining each of them with some other indicators (THE has 13 indicators, but it only shows 5 composite scores).” This opacity matters even more when we consider what has happened to UCD in the THE ranking. We go from last year’s result, when it was in the top 200:

[Screenshot: UCD’s THE World University Ranking entry, 2013–14]

To its latest rank, this year:

[Screenshot: UCD’s THE World University Ranking entry, 2014–15]

Notice anything? The overall score is withheld. Sure, there are clear differences in the individual indicators, but what do these mean? Did UCD’s researchers really publish 15.3 (%? notches? magic beans?) less this year (Citations)? The difference in Research is 0.9, so the “volume, income, and reputation” seems to be more or less intact. Teaching has actually improved by 4.6. At best (on a charitable interpretation of the ranking), TCD’s ‘improvement’ in overall score alongside its fall in rank indicates that other universities are also improving, and improving faster. This echoes the old truth about predators in the wild: you don’t need to be the fastest to escape a predator, you just need your neighbour to be slower than you.

An Irish Times article goes on about the above, saying that the “main impact of the cuts has been on the student-staff ratio, which is one of the major factors used by all the ranking agencies.” Which is true. But the OECD, in its recent Education at a Glance report, notes that staff-student ratio is not an indicator of teaching quality, teaching outputs, or results. It’s an indicator which has been jumped on because it is an intuition pump, in that it “elicits intuitive but incorrect answers.” There is as much evidence suggesting that large classes can lead to better learning outcomes as there is suggesting the opposite.

One may then be inclined to agree with Prof. Andrew Deeks of UCD when he says: “Our own analyses show that in terms of objective measures of teaching and research performance, we are performing well and making good progress.” The call to reverse cuts, in the hope that this will magically lead to improved performance in rankings, is a political argument. And that’s fine. But beware of rankings bearing ill tidings. Rankings measure what they measure, rather than measuring the objective reality of higher education – and what they claim to measure may be questionable in and of itself.