Research Intensity Rankings in Theology & Religious Studies

As all academics in the UK are aware, the REF results were published this week. The REF is a major research assessment exercise conducted periodically by the UK government and the research funding bodies; it takes into account research quality, but also newer measures such as environment and impact. The results were analysed almost immediately by numerous outlets, but the THES rankings and analysis are likely to prove an influential steer to how the data is publicly perceived, however that data might eventually translate into funding decisions.

In Theology & Religious Studies, there were some expected results – Durham did very well once more – and some surprises. Among the latter, I was surprised to see Oxford rank 12th in the THES table. One reason is that we had the largest submission of FTE staff (33) with a relatively strong research score overall, though we did less well on impact and environment, issues we’ll no doubt ponder over the course of the next REF cycle.

But I was also curious to see this article about research intensity. Some departments choose to submit only a portion of their staff – those with very strong research profiles – while other departments submit virtually all of their core people. I was curious how the rankings might look if we tested for ‘research intensity’, that is, a THES-like GPA calculated as the ratio of submitted to eligible staff multiplied by the research output score. So I thought I would do a back-of-the-envelope calculation to see how it would look for Theology & Religious Studies.

In the interests of ‘showing my work’, these are the numbers from which I worked. Taking the THES ranking for each university (given parenthetically after the institutional name in the list that follows), I noted the number of FTE staff submitted, followed by the number eligible (the latter using figures from HESA, the Higher Education Statistics Agency). I expressed this ratio as a percentage, multiplied it by the THES Output GPA (the blue second column in the THES table), and took the result as a new GPA for research intensity, for lack of a better term.

  • Durham (1); FTE submitted: 25; eligible: 27; percentage submitted: 93% x Output GPA (3.11) = 2.89
  • Birmingham (2); 9; eligible: 15; 60% x 3.15 = 1.89
  • Lancaster (=3); 22; eligible: 22; 100% x 2.80 = 2.80
  • Leeds (=3); 11; eligible: 12; 92% x 3.15 = 2.90
  • UCL (=3); 7; eligible: 9; 78% x 3.07 = 2.39
  • Cambridge (6); 24; eligible: 27; 89% x 2.87 = 2.55
  • Kent (7); 8; eligible: 9; 89% x 2.89 = 2.57
  • Edinburgh (8); 27; eligible: 30; 90% x 2.89 = 2.60
  • KCL (9); 26; eligible: 36; 72% x 2.84 = 2.04
  • Cardiff (10); 9; eligible: 13; 69% x 3.05 = 2.10
  • SOAS (11); 14; eligible: 19; 74% x 3.02 = 2.23
  • Oxford (12); 33; eligible: 32; 103% x 3.06 = 3.15
  • Nottingham (=13); 16; eligible: 16; 100% x 2.97 = 2.97
  • Exeter (=13); 11; eligible: 12; 92% x 2.90 = 2.67
  • Manchester (15); 15; eligible: 17; 88% x 2.71 = 2.38
  • St Andrews (=16); 14; eligible: 20; 70% x 2.90 = 2.03
  • Sheffield (=16); 4; eligible: 6; 67% x 3.00 = 2.01
  • Aberdeen (18); 19; eligible: 20; 95% x 2.54 = 2.41
  • Bristol (19); 9; eligible: 8; 113% x 2.91 = 3.29
  • Heythrop (20); 16; eligible: 40; 40% x 2.54 = 1.01
  • Open (21); 6; eligible: 9; 67% x 2.86 = 1.91
  • Wales Trinity Saint David (22); 8; eligible: 7; 114% x 2.47 = 2.82
  • Glasgow (23); 11; eligible: 13; 85% x 2.38 = 2.02
  • Canterbury Christ Church (24); 6; eligible: 8; 75% x 2.55 = 1.91
  • Roehampton (25); 7; eligible: 7; 100% x 2.64 = 2.64
  • Liverpool Hope (26); 15; eligible: 18; 83% x 2.41 = 2.00
  • Chester (27); 11; eligible: 17; 65% x 2.40 = 1.56
  • Winchester (28); 8; eligible: 8; 100% x 2.19 = 2.19
  • Gloucestershire (29); 5; eligible: 10; 50% x 2.17 = 1.09
  • St Mary’s, Twickenham (30); 5; eligible: 9; 56% x 2.29 = 1.28
  • York St John (31); 7; eligible: 14; 50% x 2.23 = 1.12
  • Leeds Trinity (32); 4; eligible: 8; 50% x 2.20 = 1.10
  • Newman (33); 2; eligible: 5; 40%; n/a
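The back-of-the-envelope calculation above is simple enough to sketch in a few lines of code. The snippet below is illustrative only, using a handful of the departments and figures from the list; it reproduces the rounding used there (ratio rounded to a whole percentage before multiplying by the Output GPA).

```python
# Research intensity as described above: round the submitted/eligible ratio
# to a whole percentage, then multiply by the THES Output GPA.
# Figures are (FTE submitted, FTE eligible, Output GPA) from the list above;
# only a few departments are included for illustration.
data = {
    "Durham":    (25, 27, 3.11),
    "Oxford":    (33, 32, 3.06),
    "Leeds":     (11, 12, 3.15),
    "Edinburgh": (27, 30, 2.89),
    "KCL":       (26, 36, 2.84),
}

def intensity(submitted, eligible, output_gpa):
    """Research-intensity GPA: whole-percent submission ratio x Output GPA."""
    pct = round(submitted / eligible * 100)   # e.g. 25/27 -> 93
    return round(pct / 100 * output_gpa, 2)   # e.g. 0.93 * 3.11 -> 2.89

# Sort descending by intensity score to produce the new ranking order.
ranking = sorted(
    ((name, intensity(*figs)) for name, figs in data.items()),
    key=lambda pair: pair[1],
    reverse=True,
)

for rank, (name, score) in enumerate(ranking, start=1):
    print(f"{rank} {name} ({score:.2f})")
```

Note that Oxford’s ratio exceeds 100% (33 submitted against 32 eligible), which the formula happily accepts; this is the same quirk discussed in the caveats below the ranking.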

If we then turn this into a new ranking order, we get the following:

  • 1 Bristol (3.29)
  • 2 Oxford (3.15)
  • 3 Nottingham (2.97)
  • 4 Leeds (2.90)
  • 5 Durham (2.89)
  • 6 Wales Trinity Saint David (2.82)
  • 7 Lancaster (2.80)
  • 8 Exeter (2.67)
  • 9 Roehampton (2.64)
  • 10 Edinburgh (2.60)
  • 11 Kent (2.57)
  • 12 Cambridge (2.55)
  • 13 Aberdeen (2.41)
  • 14 UCL (2.39)
  • 15 Manchester (2.38)
  • 16 SOAS (2.23)
  • 17 Winchester (2.19)
  • 18 Cardiff (2.10)
  • 19 KCL (2.04)
  • 20 St Andrews (2.03)
  • 21 Glasgow (2.02)
  • 22 Sheffield (2.01)
  • 23 Liverpool Hope (2.00)
  • =24 Open (1.91)
  • =24 Canterbury Christ Church (1.91)
  • 26 Birmingham (1.89)
  • 27 Chester (1.56)
  • 28 St Mary’s, Twickenham (1.28)
  • 29 York St John (1.12)
  • 30 Leeds Trinity (1.10)
  • 31 Gloucestershire (1.09)
  • 32 Heythrop (1.01)

* n/a Newman

I’ll be the first to admit that this system isn’t without problems. As HESA notes on its site, Oxford and Cambridge have the added complication of eligible academics who hold non-university (i.e., college-only) posts, and there is the further problem that in two cases (Bristol and Wales Trinity Saint David) small departments submitted one more FTE than the eligibility figures would suggest was possible, so their percentages exceed 100% and their scores are inflated. And of course the accuracy of HESA’s figures for FTE staff could certainly be disputed. Finally, it’s worth noting that the REF does not simply measure research outputs, so this calculation bears on only one portion of the exercise’s concerns.

Nevertheless, even bearing in mind all those caveats, I think looking at the data in this way still yields some interesting results.

UPDATE: In light of some pushback I’ve had, I’d like to clarify that for me these results serve to underscore the artificiality of the whole REF process, and the problematic way in which one interpretation of the data is used against other departments in publicity wars, etc. In my view, the REF problematically limits the types of projects people pursue, causes conflict in the small TRS sector, and places unnecessary strain on already thin resources. I understand that research councils wish to find ways of exercising accountability for their funding, but the whole sector would be better served by less heavy-handed methods and, in general, by much more extensive investment in research – not least in the neglected humanities – by the government as a whole.
