May 1, 2019

Probably Baseless Ranking Frenzy

Yesterday, the government released results from the 2018 PBRF round. PBRF is designed to allocate research funding to tertiary institutions. The idea is to do this so that more money goes to institutions that do more research: partly so that the status quo is upheld in the short term, and partly so that institutions such as AUT can make progress and get a larger share in the medium term. There’s also a goal of providing coarse-grained feedback to institutions on where their research has been most successful. You could disagree with these goals and argue that all funding should be based on competitive grant proposals, or that more research funding should go to institutions that involve under-represented groups in research, or that people who don’t get good research scores should be fired, or whatever, but I think PBRF does reasonably well at achieving the goals it has.

An unfortunate side-effect of PBRF is that the government produces a fairly large set of numbers about the tertiary institutions.  Given a large set of numbers about a small number of institutions, there’s a very high chance that you can find a somewhat sensible summary of those numbers that puts you on top.  This is a game everyone can play. And does.
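To see how easily this happens, here’s a small illustrative simulation (the numbers of institutions and metrics are made up for the sketch, and have nothing to do with the actual PBRF data): even when no institution is genuinely better than any other, most of them can still find some metric that ranks them first.

```python
# Illustrative sketch only (synthetic data, not PBRF results): with a handful of
# institutions and a dozen plausible summary metrics, most institutions come out
# "top" on something purely by chance.
import numpy as np

rng = np.random.default_rng(2019)

n_institutions = 8    # say, the eight NZ universities
n_metrics = 12        # score per staff FTE, per student, share of As, growth, ...

# Uncorrelated "performance" numbers: by construction, no institution is truly better
scores = rng.normal(size=(n_institutions, n_metrics))

# For each metric, which institution ranks first?
winners = scores.argmax(axis=0)
print("Institutions topping at least one metric:", len(set(winners)), "of", n_institutions)

# Repeat to estimate how often any given institution can claim a #1 ranking
n_sim = 10_000
hits = sum(0 in rng.normal(size=(n_institutions, n_metrics)).argmax(axis=0)
           for _ in range(n_sim))
print("P(a given institution tops at least one of 12 metrics) is about", hits / n_sim)
# With 8 institutions and 12 independent metrics this is 1 - (7/8)^12, roughly 0.80
```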

You don’t even need to go past the TEC information release. As Radio NZ says:

The report said Victoria University had the highest score (29.19) for the number and quality of active researchers on its teaching and research staff, followed by Otago (26.09) and Canterbury (25.92) universities.

However, Lincoln University had the highest score for active researchers compared to the number of students enrolled in courses at degree level and higher (2.14), followed by Otago (2.06), and Catholic theological institute the Good Shepherd College (2.00).

The results showed the largest university, Auckland, had 1744 FTE staff funded by the PBRF and 391 or 22 percent were classified as A grade or world-class, the highest proportion of any university.

At Victoria University 20 percent or 173 of 865 funded researchers received As, while at the University of Otago the figure was 17 percent or 229 out of 1358 researchers.

Moving further out, AUT has the fastest relative increase in funded researchers; Massey has a 40% increase in those at the top level; and Universities New Zealand astonishes everyone by pointing out universities did a lot more research than polytechs and wānanga.

The problem with these rankings …. the two problems with these rankings…. among the problems with these rankings are the fineness of the distinctions needed (well below the precision of the measurement process), the way everyone suggests the ranking they do best on is most important, and the effort wasted on tuning the system to target particular rankings.  Perhaps most importantly, though, there’s a corrosive effect of university leaders pretending that something is important and quantitatively well-founded when they know it isn’t, or at best, know they don’t have good evidence that it is.
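As a rough illustration of the precision point, here’s a back-of-the-envelope sketch using the Auckland and Victoria figures quoted above. It treats each researcher’s A/non-A grade as an independent noisy binary measurement, which is an assumption of mine for the sketch and not how the PBRF assessment actually quantifies uncertainty; even so, it gives a sense of the scale of noise relative to the differences being celebrated.

```python
# Back-of-the-envelope sketch (binomial model is my assumption, not PBRF methodology):
# is Auckland's 22% vs Victoria's 20% share of A grades a distinction the data
# can really support?
from math import sqrt

auckland_a, auckland_n = 391, 1744   # figures quoted by Radio NZ above
victoria_a, victoria_n = 173, 865

p1, p2 = auckland_a / auckland_n, victoria_a / victoria_n
se_diff = sqrt(p1 * (1 - p1) / auckland_n + p2 * (1 - p2) / victoria_n)

print(f"difference in share of A grades: {p1 - p2:.3f}")
print(f"standard error of the difference: {se_diff:.3f}")
print(f"difference is about {(p1 - p2) / se_diff:.1f} standard errors")
# roughly 0.024 with a standard error of 0.017: about 1.4 SEs, i.e. comfortably
# within the sort of noise a crude model like this implies
```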

Fred Vultee, an academic and former editor, has written about an analogous problem for newspapers:

It’s a free country, and Congress shall make no law abridging, &c &c &c, so you’re perfectly entitled to do whatever you want with your “analysis” time and your frontpage space. In some cosmic sense, sharing popular delusions about the lottery isn’t any worse than publishing the horoscopes. But if you’d like to be taken seriously when you proclaim “study says” or “poll reveals,” you need to run a disclaimer with your “want to hit the lottery” tales.


Thomas Lumley (@tslumley) is Professor of Biostatistics at the University of Auckland. His research interests include semiparametric models, survey sampling, statistical computing, foundations of statistics, and whatever methodological problems his medical collaborators come up with. He also blogs at Biased and Inefficient.