Research Intensity Rankings in Theology & Religious Studies

As all academics in the UK are aware, the REF results were published this week. The REF is a major research assessment exercise conducted periodically by the research funding bodies and the UK government; it takes into account not only research quality but also newer measures like environment and impact. The results were analysed almost immediately by numerous outlets, but the THES rankings and analysis are likely to prove an influential steer on how the data is publicly perceived, however that data might eventually translate into funding decisions.

In Theology & Religious Studies, there were some expected results – Durham did very well once more – and some surprises. Among the latter, I was surprised to see Oxford rank 12th in the THES ranking. One reason for my surprise is that we had the largest submission of FTE staff (33) and a relatively strong research score overall, though we did less well on impact and environment, issues we’ll no doubt ponder over the course of the next REF cycle.

But I was also curious to see this article about research intensity. Some departments choose to submit only the portion of their staff with very strong research profiles, while other departments submit virtually all of their core people. I was curious how the rankings might look if we tested for ‘research intensity’, that is, a THES-like GPA calculated by multiplying the ratio of submitted to eligible staff by the research output score. So I thought I would do a back-of-the-envelope calculation to see how it would look for Theology & Religious Studies.

In the interests of ‘showing my work’, these are the numbers from which I worked. Taking the THES ranking for each university (given parenthetically after the institution’s name in the list that follows), I noted the number of FTE staff submitted, followed by the number eligible (the latter using figures from HESA, the Higher Education Statistics Agency). Expressing this ratio as a percentage, I then multiplied it by the THES Output GPA (the blue second column in the THES table) to produce a new GPA for research intensity, for lack of a better term.
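
For those who want to check the arithmetic, here is the per-institution calculation in a few lines of Python – a minimal sketch, on the assumption (which matches how the figures below appear to have been derived) that the percentage is rounded to a whole number before multiplying. The sample figures are Durham’s, from the list that follows:

```python
# Research intensity = (FTE submitted / FTE eligible) x THES Output GPA,
# rounding the ratio to a whole percentage first (halves round up).
submitted = 25     # FTE staff submitted to the REF (Durham)
eligible = 27      # eligible FTE staff, per HESA
output_gpa = 3.11  # THES Output GPA (the blue second column)

pct = int(100 * submitted / eligible + 0.5)   # 93 (%)
intensity = round(pct / 100 * output_gpa, 2)  # 2.89
print(f"{pct}% x {output_gpa} = {intensity}")  # 93% x 3.11 = 2.89
```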

  • Durham (1); FTE submitted: 25; eligible: 27; percentage submitted: 93% x Output GPA 3.11 = 2.89
  • Birmingham (2); 9; eligible: 15; 60% x 3.15 = 1.89
  • Lancaster (=3); 22; eligible: 22; 100% x 2.80 = 2.80
  • Leeds (=3); 11; eligible: 12; 92% x 3.15 = 2.90
  • UCL (=3); 7; eligible: 9; 78% x 3.07 = 2.39
  • Cambridge (6); 24; eligible: 27; 89% x 2.87 = 2.55
  • Kent (7); 8; eligible: 9; 89% x 2.89 = 2.57
  • Edinburgh (8); 27; eligible: 30; 90% x 2.89 = 2.60
  • KCL (9); 26; eligible: 36; 72% x 2.84 = 2.04
  • Cardiff (10); 9; eligible: 13; 69% x 3.05 = 2.10
  • Soas (11); 14; eligible: 19; 74% x 3.02 = 2.23
  • Oxford (12); 33; eligible: 32; 103% x 3.06 = 3.15
  • Nottingham (=13); 16; eligible: 16; 100% x 2.97 = 2.97
  • Exeter (=13); 11; eligible: 12; 92% x 2.90 = 2.67
  • Manchester (15); 15; eligible: 17; 88% x 2.71 = 2.38
  • St Andrews (=16); 14; eligible: 20; 70% x 2.90 = 2.03
  • Sheffield (=16); 4; eligible: 6; 67% x 3.00 = 2.01
  • Aberdeen (18); 19; eligible: 20; 95% x 2.54 = 2.41
  • Bristol (19); 9; eligible: 8; 113% x 2.91 = 3.29
  • Heythrop (20); 16; eligible: 40; 40% x 2.54 = 1.02
  • Open (21); 6; eligible: 9; 67% x 2.86 = 1.91
  • Wales Trinity Saint David (22); 8; eligible: 7; 114% x 2.47 = 2.82
  • Glasgow (23); 11; eligible: 13; 85% x 2.38 = 2.02
  • Canterbury Christ Church (24); 6; eligible: 8; 75% x 2.55 = 1.91
  • Roehampton (25); 7; eligible: 7; 100% x 2.64 = 2.64
  • Liverpool Hope (26); 15; eligible: 18; 83% x 2.41 = 2.00
  • Chester (27); 11; eligible: 17; 65% x 2.40 = 1.56
  • Winchester (28); 8; eligible: 8; 100% x 2.19 = 2.19
  • Gloucestershire (29); 5; eligible: 10; 50% x 2.17 = 1.09
  • St Mary’s, Twickenham (30); 5; eligible: 9; 56% x 2.29 = 1.28
  • York St John (31); 7; eligible: 14; 50% x 2.23 = 1.12
  • Leeds Trinity (32); 4; eligible: 8; 50% x 2.20 = 1.10
  • Newman (33); 2; eligible: 5; 40%; n/a
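
And for anyone who wants to replay the re-ranking itself, a minimal sketch of the sort, under the same assumed rounding convention. Only four of the rows above are shown; in practice the full table goes in the same way (Newman would be skipped, lacking an Output GPA):

```python
# Re-rank institutions by research intensity, highest first.
rows = [
    ("Durham", 25, 27, 3.11),
    ("Birmingham", 9, 15, 3.15),
    ("Oxford", 33, 32, 3.06),
    ("Bristol", 9, 8, 2.91),
]

def intensity(submitted, eligible, output_gpa):
    pct = int(100 * submitted / eligible + 0.5)  # whole-number percentage submitted
    return round(pct / 100 * output_gpa, 2)

for rank, (name, sub, elig, gpa) in enumerate(
        sorted(rows, key=lambda r: intensity(*r[1:]), reverse=True), start=1):
    print(f"{rank} {name} ({intensity(sub, elig, gpa):.2f})")
# 1 Bristol (3.29)
# 2 Oxford (3.15)
# 3 Durham (2.89)
# 4 Birmingham (1.89)
```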

If we then turn this into a new ranking order, we get the following:

  • 1 Bristol (3.29)
  • 2 Oxford (3.15)
  • 3 Nottingham (2.97)
  • 4 Leeds (2.90)
  • 5 Durham (2.89)
  • 6 Wales Trinity Saint David (2.82)
  • 7 Lancaster (2.80)
  • 8 Exeter (2.67)
  • 9 Roehampton (2.64)
  • 10 Edinburgh (2.60)
  • 11 Kent (2.57)
  • 12 Cambridge (2.55)
  • 13 Aberdeen (2.41)
  • 14 UCL (2.39)
  • 15 Manchester (2.38)
  • 16 Soas (2.23)
  • 17 Winchester (2.19)
  • 18 Cardiff (2.10)
  • 19 KCL (2.04)
  • 20 St Andrews (2.03)
  • 21 Glasgow (2.02)
  • 22 Sheffield (2.01)
  • 23 Liverpool Hope (2.00)
  • =24 Open (1.91)
  • =24 Canterbury Christ Church (1.91)
  • 26 Birmingham (1.89)
  • 27 Chester (1.56)
  • 28 St Mary’s, Twickenham (1.28)
  • 29 York St John (1.12)
  • 30 Leeds Trinity (1.10)
  • 31 Gloucestershire (1.09)
  • 32 Heythrop (1.02)

* n/a: Newman

I’ll be the first to admit that this system isn’t without problems. As HESA notes on its site, Oxford and Cambridge have the added complication of eligible academics holding non-university (i.e., college-only) posts, and there is the further problem that in two cases (Bristol and Wales Trinity Saint David) small departments submitted one more FTE than the HESA figures would predict, pushing their percentages above 100%. And of course the accuracy of HESA’s figures for FTE staff could certainly be disputed. Finally, it’s also worth noting that the REF does not simply measure research outputs, so this calculation bears on only one portion of the exercise’s concern.

Nevertheless, even bearing all those caveats in mind, I think looking at the data in this way still yields some interesting results.

UPDATE: In light of some pushback I’ve had, I’d like to clarify that for me these results serve to underscore the artificiality of the whole REF process, and the problematic way in which one interpretation of the data is used against other departments in publicity wars, etc. In my view, the REF problematically limits the types of projects people pursue, causes conflict in the small TRS sector, and places unnecessary strain on already thin resources. I understand that research councils wish to find ways of exercising accountability for their funding, but the whole sector would be better served by less heavy-handed methods and, in general, by much more extensive investment in research – not least in the neglected humanities – by the government as a whole.

  1. #1 by vincent on December 19, 2014 - 11:59 pm

    Reblogged this on Talmidimblogging.

  2. #2 by petermhead on December 20, 2014 - 10:41 am

    Doesn’t this just show that the data can be tweaked and manipulated in so many different ways that it is possible to make almost any institution look good? Select the area within the REF in which Oxford scored best; multiply by a new factor in which Oxford scores best; come out with a new ranking which brings Oxford up the ladder.

    I wonder about a more fundamental critique of the whole thing.

    • #3 by David Lincicum on December 20, 2014 - 10:51 am

      Thanks for this comment, Pete. Don’t get me wrong, I wouldn’t want to be seen as endorsing the REF as a good system, or an accurate depiction, at least in any detail, of where good work is being done. I was aware that this could look like stacking the deck in Oxford’s favour, and in a way that’s a natural result, since my impetus to undertake this comparison was due to my surprise at Oxford’s not having been placed higher in the THES ranking. So as much as anything this is a critique of the THES ranking system and their interpretation of the admittedly highly imperfect REF exercise. There’s also the additional complication that I don’t know how many non-university post-holders inflated Oxford’s FTE numbers, though I *think* it’s the case that we submitted all our university people. I admit that I have lots of admiration for the excellent work done in places like Birmingham, but it also seems anomalous to have them rank second in the overall score while only submitting 60% of their eligible researchers. It’s a way to game the system, and that’s entirely fair given the system as it is. But there is a question of the public use and perception of THES rankings that troubles me. So I wouldn’t want to claim too much for Oxford based on this at all, though in general I still think it’s worth doing as a way of defusing some overhasty implications one could draw from a casual look at the results.

  3. #4 by petermhead on December 20, 2014 - 12:05 pm

    Thanks David. One of the problems with the REF lies in the various options available to institutions in deciding whom to enter, which outputs to submit, whom to exclude from the REF, etc. And a key part of gaming the entry is deciding how many people to submit. It would be interesting to calculate how much time was spent on this sort of question across all these institutions. (Well, not interesting enough to actually do it!) Well done to Oxford for putting in the most FTE of anyone. You’ll get the benefit in the actual funding provided (I presume this is a multiplier, as in previous RAEs). And it suggests that overall Oxford is the strongest faculty in the field.

    By the way, I’m not sure it is correct to say that Oxford is 12th in the THES interpretation of the REF, since it is the REF table itself that simply uses its GPA (without any factoring of the number of entrants) to rank the institutions. Although of course there are plenty of ways to “interpret” the results in ways that reflect positively on any particular institution. (Well, not every institution.) (And indeed it is hard to find a way to get Cambridge up much higher than 5th or 6th unless you only consider faculties which teach across the whole range of theology and religious studies.)

    • #5 by David Lincicum on December 20, 2014 - 1:38 pm

      Thanks so much for this excellent interaction. I agree entirely with your assessment of the problematic nature of the enterprise.

      And I might be confused, but I thought the GPA system was simply THES’s way of calculating a ranking, since the REF results themselves (http://results.ref.ac.uk/(S(3a0olm30xjpe5knwxeavvhju))/) don’t seem to have it. But I might be completely confused?

  4. #6 by Professor Richard King on December 20, 2014 - 2:54 pm

    Your statistics for the University of Kent Religious Studies are inaccurate. Kent got a GPA of 3.11, and with 89% submitted that gives an intensity rating of 2.77. I haven’t checked the others. Of course intensity still favours larger staffing units, because it is much easier for them to generate larger percentages, whereas small departments become adversely affected by the non-submission of one or two members of staff.

    • #7 by David Lincicum on December 20, 2014 - 2:58 pm

      Thanks very much for your comment, Professor King. I agree that this measurement also tends to favour larger units, and I wouldn’t at all wish to use this ranking style to argue any particular point, other than the artificiality of the whole process. As for the GPA, you are entirely correct that Kent’s overall THES GPA is 3.11, but I am only using the ‘Output GPA’ in the second blue column of the PDF of the THES results, since I have supposed that any measure of research intensity should primarily have that figure in view rather than the entire calculation – though perhaps that is an error in judgment. Thanks once more.
