

Here Be Dragons

How Did Everyone Get the UK Election So Wrong?

The public, the media, and the pollsters all talked themselves into believing a lie.

Voters in Hackney. Photo via Wikimedia Commons user Alex Lee

This article first appeared on VICE UK.

Imagine being John Curtice. He was one of the five experts responsible for Thursday night's exit poll, which predicted a Tory win and flatly contradicted the 140 polls published by half a dozen pollsters this year. To be exact, he was the one who had to go on the BBC just after ten o'clock and defend it to incredulous pundits. He looked about as happy as a servant having to tell Kim Jong-un why his Pop-Tarts got burnt.


"Bollocks to it," was my reaction on Twitter, "I've never believed an individual poll before and I'm not going to start now." Dozens of opinion polls in recent years had shown a hung parliament with a Labour-SNP advantage, so for that to suddenly disappear on the night seemed pretty unlikely.

I was hilariously wrong. The Tories won a clear majority, in the biggest election night upset for at least a generation. Waking up in the morning, I had that disconnected feeling you get when you load a game and realize you're playing the wrong save file. I couldn't work out how we'd got to this parallel universe.

It's hard to grasp the scale of the failure here. Half a dozen independent companies had carried out almost 2,000 polls since the last election, 140 of them since Christmas, and in one stroke they all meant jack shit. Another half-dozen election forecasters, wannabe Nate Silvers trying to make a name for themselves, used those polls to predict who would win which seats. Their predictions were complete nonsense, as were the bookies'.

Hundreds of pundits and journalists took those figures and spun a "too-close-to-call" narrative out of them that literally shaped our understanding of reality for the last six months. Every news report, every editorial and opinion column, every tweet and Facebook post lined up behind the same prediction. Entire election campaigns were redirected. Journalists openly ridiculed David Cameron—now the smuggest man in the country—over his claims he could win a majority. After all of that, the exit poll was like physicists finding out that gravity goes upwards.


Where did it all go wrong?

I can't tell you. The truth is, nobody knows. Finding out could take months or years of research, and a major inquiry has already been announced. What we can do though is start narrowing it down, ruling some things out. And while I can't say exactly what went wrong, there is one theory that sticks out of all this like a sore thumb.

First, let's start with what it wasn't. The actual result of the election was 37 percent of the vote to the Tories and 30.5 percent to Labour. The polls had been steady at around 33 to 34 percent for both parties for months, even on election day, so it wasn't a last-minute swing. It wasn't random error, either, because hundreds of polls went the same way. It also wasn't about online polling versus phone polling: the results of both types of survey were pretty much the same, and both were completely wrong.

The exit poll gives us one clue, as Anthony Wells of UKPollingReport put it: "Given that the pre-election polls were wrong but the exit polls were right, how pollsters measure likelihood is definitely worth looking at. Exit polls obviously don't have to worry about likelihood to vote; they only interview people physically leaving a polling station." Maybe Labour supporters were far less likely to vote than the pollsters assumed, and more Tories turned out on the day?
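To make that turnout point concrete, here's a minimal sketch in Python of how a likelihood-to-vote weighting works, using invented respondents and invented numbers (it's not any real pollster's method). Each person is counted in proportion to how likely they say they are to vote, so if one party's supporters overstate their enthusiasm, the headline figure drifts:

# Hypothetical sketch of likelihood-to-vote weighting (invented data).
# Each respondent: (stated party, self-rated likelihood to vote, 0-10).
respondents = [
    ("Conservative", 9), ("Conservative", 10), ("Conservative", 8),
    ("Labour", 10), ("Labour", 9), ("Labour", 7), ("Labour", 6),
]

def weighted_shares(sample):
    # Count each respondent in proportion to their stated likelihood.
    totals = {}
    for party, likelihood in sample:
        totals[party] = totals.get(party, 0) + likelihood / 10
    grand = sum(totals.values())
    return {party: round(100 * w / grand, 1) for party, w in totals.items()}

print(weighted_shares(respondents))  # {'Conservative': 45.8, 'Labour': 54.2}
# If people inflate their stated likelihood, these shares are skewed.
# An exit poll sidesteps the question entirely: it only counts people
# who have demonstrably just voted.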

Polling companies know about these kinds of problems, though—raw numbers from polls are usually pretty wrong, so they "correct" them. For example, they know that fewer people say they'll vote Conservative than actually do, something called the "Shy Tory" effect. To get rid of that bias they tweak the numbers a bit, based on results from previous elections. They also have a rough idea of which way swing voters tend to break, so a share of the people who say "don't know" is reallocated to the different parties. Usually these tricks work pretty well, but at this election they failed miserably. The Number Cruncher Politics blog actually raised this possibility before the election, and they look deservedly smug now.
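As a rough illustration of those corrections—emphatically not any real company's house method, with every adjustment factor invented for the example—here's how a "Shy Tory" uplift and a don't-know reallocation might look:

# Sketch of two standard polling corrections (all numbers invented).
raw = {"Conservative": 32.0, "Labour": 34.0, "Don't know": 10.0}

# 1. "Shy Tory" correction: past elections suggest Conservative support
#    is understated, so apply an uplift estimated from old results.
SHY_TORY_BOOST = 1.05  # hypothetical 5 percent uplift

# 2. Reallocate the "don't knows" in a historically observed ratio
#    (here a made-up 60/40 break toward the Conservatives).
DK_SPLIT = {"Conservative": 0.6, "Labour": 0.4}

def adjust(poll):
    adjusted = dict(poll)
    adjusted["Conservative"] *= SHY_TORY_BOOST
    dk = adjusted.pop("Don't know")
    for party, fraction in DK_SPLIT.items():
        adjusted[party] += dk * fraction
    # Renormalize so the published shares sum to 100.
    total = sum(adjusted.values())
    return {p: round(100 * v / total, 1) for p, v in adjusted.items()}

print(adjust(raw))  # {'Conservative': 51.0, 'Labour': 49.0}
# The catch: both factors are calibrated on past elections, so when the
# electorate's behavior changes, all the "corrections" fail together.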


Why? Some of it could be that this election was so different from any other in recent history, with the rise of two new major parties—the SNP and UKIP—and a surge in votes for the Greens. The thing is, the results predicted for those parties (and the Lib Dems) were actually pretty accurate—it was the figures for Labour and Conservative vote shares that were completely wrong.

There's one last clue, though, which I think might turn out to be at the heart of all this, and it was revealed in a cowardly blog post published by Survation the day after the election. In it, they admitted they'd carried out a poll a couple of days before that showed almost exactly the right result—37 percent for the Tories, 31 percent for Labour. It was so different from other polls, though, including their own, that they were too scared to publish it.

My gut feeling—and this is really just a guess on my part—is this: sometimes a story gets so stuck in people's minds that it warps reality around it. Everyone assumed that the next election would be close and that we'd be entering a new era of coalition governments. I have a hunch that polling companies had the data they needed to contradict this, but they were stuck in a feedback loop with each other and the media, everyone persuading everyone else that the accepted wisdom had to be right. Everyone fell neatly into line, and anything contradicting the story was either explained away or, like Survation's "rogue" poll, just thrown in the bin.


There's a big irony in all this. In the wake of Nate Silver's success predicting the 2012 US election results, a crop of new sites like New Statesman's May2015.com appeared, where ambitious data journalists produced ever more convoluted essays about who was or wasn't winning the election. They were supposed to represent a new wave of more intelligent political analysis, but in reality all they did was recycle the exact same poll-driven bullshit we'd seen in the past.

May2015's final prediction showed the Tories with five more seats than Labour. In reality, they finished 99 seats ahead. The Guardian was even worse, predicting a dead heat. Even the best forecaster fell almost 50 seats short of where the Tories would end up. That's just embarrassingly wrong, and some serious groveling needs to happen if people are going to take these sites seriously in the future.

Then again, why do we give such a shit in the first place? This was the most unreal campaign I can remember in my lifetime. Two focus-group-obsessed candidates with the character of assembly-line robots toured the country in hermetically sealed buses, repeating meaningless phrases that nobody gave a shit about. They approached the public the way a germaphobe enters a pub toilet. Each question was evaded, every meaningful issue avoided like a plague pit.

Journalists could have tried to tackle that, to pierce the bubble and find some kind of meaning in all of it, but instead they spent six months obsessing over numbers that turned out to be bullshit. When you think about it, it explains quite a lot.

Follow Martin on Twitter.