UK Election – Why did the polls get it so wrong?

For Labour it was a disappointment, for the Liberal Democrats a disaster – and for the pre-election pollsters a debacle. The UK Election campaign was awful but the results were spectacular, delivering a verdict as markedly different from expectations as in 1992.

I would suggest the polls got it wrong for a completely different reason from any I have hitherto seen suggested.

First, the back story…

The pollsters’ failure was surely even more stunning in 2015 than in 1992, as we had been led to believe (and I was as naive as any in this regard) that the pollsters had sorted it out since then, and that their projections – also the basis for “forecasts” from the academically minded, such as the famous US analyst Nate Silver – were now highly reliable. This point was not seriously questioned – indeed, the BBC even did a Panorama programme based on the assumption of reliability!

It turns out they were no more reliable than they were in 1992.

Indeed, albeit in retrospect, a detailed check through the records shows that pre-election polls have consistently overstated the Labour vote at every single election ever since 1992 (and indeed back at least to 1974) – regardless of incumbency. The same polls generally understate the Conservative vote, but not always. Not so much “Shy Tory” as “Excitable Labour”, then…

To some degree, we have simply missed this. Labour’s 43% in 1997 delivered the expected landslide – but in fact the polls, even the exit poll, suggested it would be significantly higher. Polls in 2001 suggested an increased majority even on that landslide, but the result was a slight decrease. Even in 2010, hardly a single poll put Labour below 30% – where they actually finished. In other words, the polls have been in error for some time but, because the outcomes themselves were so one-sided (and with some luck besides), 2015 happened to be the first time since 1992 that the error materially affected the implicit prediction about who could command a parliamentary majority.

That said, 2015’s polls (taking the poll-of-polls immediately before the election as the guide) were also the most wrong since 1992, suggesting a dead heat between the Conservatives and Labour when the former were actually six points ahead. The usual discrepancy is about half that.
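To put rough numbers on that – a minimal sketch only, where the poll figures are an approximate recollection of the final poll-of-polls and the result figures are the actual UK-wide vote shares:

```python
# Illustrative arithmetic for the 2015 polling miss.
# Poll figures are an approximate final poll-of-polls (an assumption);
# result figures are the actual UK-wide 2015 vote shares.
poll = {"Con": 34.0, "Lab": 34.0}        # final polls: effectively a dead heat
result = {"Con": 36.9, "Lab": 30.4}      # actual 2015 result

poll_lead = poll["Con"] - poll["Lab"]        # 0.0 points
actual_lead = result["Con"] - result["Lab"]  # 6.5 points

# The polling error is the gap between the predicted and actual leads.
error = actual_lead - poll_lead
print(f"Conservative lead understated by {error:.1f} points")  # ~6.5
```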

So, why were they so wrong a generation on from the 1992 debacle?

Having taken some time to check these things, I would suggest it is fundamentally to do with methodology.

In the UK, pollsters still get straight to the point. Generally what happens is that someone visits your home, or phones, or even contacts you over the Internet (Internet polls were particularly wrong, by the way) and more or less immediately asks you for your voting intention. It does not materially matter how they phrase this question (in fact, interestingly, the “Ashcroft Polls” which asked merely for preferred party were more accurate than those which asked respondents to bear in mind the likely candidates in their constituency); what matters is that they get straight to it.

I work essentially in market research (I call it “PR”, but increasingly it is more like market research), assessing how campaign messages or new services would be received. In that work, however, the last thing I ask is “What do you think of this?” (or, in political terms, “How are you going to vote?”).

Let us consider the development of Apple’s iPad. I did not do the market research for this, but I do happen to know those who did. Cast your mind back about ten years, pre-iPad. If someone had immediately shown you an iPad and said “Could you find a use for this?”, you would almost certainly have laughed – we already had laptops for desk work and mobiles in our pockets, so what on earth would we need that awkward-looking thing for? If they had suggested you might want to part with a small fortune in order to own one, you might have started to get quite worried about them…

So the researchers for Apple’s iPad did not ask the straight question. Instead, they spent time getting to know people – their lifestyle, their daily routine, how they relaxed, how they worked, how they interacted, and so on. Never once during this work would they ask anything even remotely related to technological equipment, report writing, online reading or whatever. They would spend entire focus groups, entire research days, entire mini-projects without going anywhere near any product (or service) that Apple offered.

Having done that “lifestyle research”, they then designed a device which they felt would fit into people’s daily lives, and which people would pay significant amounts of money for. They developed a marketing plan accordingly and, hey presto, we have the iPad.

I know rather less about polling specifically (I specialise only in exit polling – which came out rather well on 7/8 May, as it happens!), but I am led to understand that polling in North America follows a similar procedure to Apple’s research, albeit of course in a much shorter time frame. Rather than going straight out and demanding to know how someone is going to vote, the pollster asks some general lifestyle questions – designed, essentially, to get the respondent to be in the same frame of mind they will be in when they enter the polling booth.

For example – and, beware, one man’s educated guesswork is another’s ill-informed conjecture – they may find out that the respondent set up a mobile nail bar a few years ago but has just invested in an office to run it from; or that a builder is back to turning down work, having struggled to find any five years ago; or that a taxi driver is getting significantly more fares on an average Saturday night.

Let us then compare two pollsters. The typical UK pollster does not bother finding out all of that, and instead asks the nail bar owner/builder/taxi driver straight out who they are going to vote for – “Well, I don’t know, I mean, obviously, I’m not exactly one of those bankers, so, you know, I may give Labour a try.” However, a typical US pollster finds out all of that and then asks who they are going to vote for – “Well, you know, as I said things are getting better now so I thought, probably, I’ll just give the Conservatives another try.”

You can instantly see the difference. In the second case, the respondent is much nearer in attitude to where they will be in the privacy of the polling booth, and also feels more willing to declare for the Conservatives given that the reason is self-evident from the discussion they have just had.

If you ask the wrong question or, at least, omit to ask the right one, then the respondent will probably omit to give you the right answer. I am not a pollster, I emphasise, but I cannot help but think the answer lies in there somewhere.
