A Note on Some of Our Autumn Pulse Data
- A change to our Pulse methodology temporarily inflated positive response rates for some questions between August and October
- Trends within these periods are still accurate
- This means that confidence in PG study was increasing (counter-intuitively) in the autumn, but wasn’t as high as it appeared
We’ve been running our Postgraduate Pulse tracker for almost two years now, and the thousands of monthly responses we’ve received have built up a unique longitudinal picture of the prospective Masters and PhD audience during a hugely important time for postgraduate study.
Most of the time that data tells us things we broadly expect to see – such as the cyclical shift from autumn to spring start dates or the (welcome) drop in Covid concern amongst prospective student audiences.
Sometimes it tells us things that surprise us or make us think, such as the persistent interest in online study for older UK students or the fact that people will travel much further to attend a PhD open day than they will for a Masters.
Either way, it’s my job to ensure the insights we extract from that data and the stories we tell with it are as accurate and useful as possible.
Which is why, in this case, I need to highlight something in our autumn data.
The cost of living crisis had a bigger impact than we thought
... but domestic PG confidence was increasing until recently.
From July to August we saw a sharp increase in the proportion of people feeling positive about UK Masters and PhD study. This seemed difficult to square with the current economic circumstances, especially as we’d seen confidence drop during the UK cost of living crisis.
Well, it turns out that a minor change in our survey methodology may have unwittingly ‘inflated’ these results by introducing a bias towards more positive answers. I’ve explained this below as a (hopefully) useful lesson for anyone designing their own surveys.
This temporary error means that the ‘jump’ in domestic confidence from July to August is inflated.
However, this doesn’t change the upward trend we’d seen during August, September and October: the UK domestic audience was getting more confident about postgraduate study in the autumn, even if they were never as confident as the inflated data suggested.
This means that my previous discussion and conclusions regarding the impact of the cost of living crisis on PG study are still broadly relevant (I’ve made updates to caveat some conclusions, where necessary).
So, what happened with the survey?
Pulse uses five-point Likert scales for a number of questions on attitudes to different aspects of postgraduate study. This allows us to build up a longitudinal picture of trends over time.
Likert scales aren’t perfect, however, with a number of biases at play depending on the order in which response options (ranging from 'very positive' to 'very negative', or similar) are presented.
There’s a wealth of academic literature on this topic (if you’re so inclined) but there’s a fairly simple summary here. Put simply, respondents can be influenced by:
- Primacy (we like to select the first thing we see)
- Left-side selection (people who read from left-to-right prefer options at the left)
- Acquiescence (we prefer to agree with statements)
- Social desirability (we pick responses we feel are normative or socially desirable)
The most neutral way to present a Likert is therefore in ascending order from negative to positive; this attempts to balance Primacy and Left-side selection (at the left) against Acquiescence and Social desirability (at the right).
A descending Likert puts all biases at the left and therefore inflates that end of the scale. This is what temporarily happened to Pulse between August and November. It’s likely to have inflated some responses during that period relative to other months. Again, the trend within those months is still accurate.
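To make the inflation effect concrete, here’s a minimal sketch (not based on actual Pulse data) that simulates primacy bias alone: a small, assumed fraction of respondents simply pick the first option they see, so a descending scale (positive options first) reports a higher positive share than an ascending one. All the numbers below are illustrative assumptions.

```python
import random

# Five-point Likert options, ascending from negative to positive
OPTIONS = ["very negative", "negative", "neutral", "positive", "very positive"]

def simulate(order, true_dist, primacy_rate, n=100_000, seed=42):
    """Share of 'positive'/'very positive' responses when a fraction of
    respondents exhibit primacy bias and pick the first option shown."""
    rng = random.Random(seed)
    positive = 0
    for _ in range(n):
        if rng.random() < primacy_rate:
            choice = order[0]  # primacy: first option in the presented order
        else:
            # otherwise respond according to the assumed 'true' attitudes
            choice = rng.choices(OPTIONS, weights=true_dist)[0]
        positive += choice in ("positive", "very positive")
    return positive / n

true_dist = [0.10, 0.20, 0.30, 0.25, 0.15]  # assumed underlying attitudes
ascending = OPTIONS                          # negative -> positive
descending = list(reversed(OPTIONS))         # positive -> negative

asc = simulate(ascending, true_dist, primacy_rate=0.08)
desc = simulate(descending, true_dist, primacy_rate=0.08)
print(f"ascending:  {asc:.1%} positive")
print(f"descending: {desc:.1%} positive")  # noticeably higher
```

This sketch only models primacy, so the ascending order slightly deflates positives rather than perfectly balancing them against acquiescence and social desirability, but it shows why flipping the order mid-survey breaks comparability between months.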
So, the moral of the story is: if you’re using Likert scales for your own surveys (e.g. student satisfaction), ideally present the options in ascending order – and don’t change it!