How Britain's pollsters will fix 'shy Tory' problem after Cameron's shock election win
The voting polls in the months leading up to the UK general election on 7 May told a uniform tale of deadlock: the Conservative and Labour parties were tied, with around a third of the electorate each, and neither was making any headway in winning over the remaining third of voters, who insisted throughout that they would back one of the smaller parties.
Nothing in any of the party campaigns seemed to make any impact at all; there were nearly 500 voting polls published in the six months before the election and a trend-line through them would be completely horizontal.
The media – bored with an election campaign in which nothing seemed to be happening – switched their focus, weeks before the election, to the question of what would happen after it. Pollsters, psephologists and pundits agreed almost unanimously that Britain was headed for a 'hung' Parliament, in which no party could govern alone.
In many scenarios, the only combination of parties adding up to an overall majority in the House of Commons looked likely to be the Labour Party relying on the support of the Scottish National Party (SNP), a prospect many voters in England found distasteful, and one that arose only because the polls unanimously predicted the total evisceration of Labour in Scotland by the SNP.
Then, at the stroke of 10 p.m. on election night, as the polls closed, the exit poll (a collaboration of all the main TV stations) was released. To almost universal astonishment it predicted the Conservative Party winning 316 MPs – short of an overall majority, but so close to one that a Conservative government would be inevitable. Labour, the exit poll reported, had lost ground, falling to 239 MPs, while the Liberal Democrats, after five years in government as the junior party in coalition with the Conservatives, were on course to be almost wiped out – reduced from 56 MPs to just 10.
Numerous politicians and pundits instantly denounced the exit poll, some vowing to eat various items of clothing if it proved right. It wasn't exactly right – the Conservatives had done even better (finishing on 330 MPs, an effective majority of 12 over all other parties) and Labour and the Liberal Democrats had done even worse – but the exit poll had captured very accurately the basic contours of the result, which was utterly different from what the pre-election polls had suggested.
Far from the boring election everyone had been complaining about, it was a night of high drama, culminating the next morning in the resignation of three party leaders within an hour and an instant clamour of criticism directed at the pollsters, whose unchanging picture of voter opinion had proved almost completely wrong.
We have been here before. At the 1992 UK general election, the pre-election polls were also badly wrong, wrong enough to predict the wrong outcome. All of the 1992 polls exaggerated Labour support and understated Conservative support, by far more than their theoretical margins of error; 2015 produced exactly the same pattern of error.
After the 1992 disaster an inquiry was set up to find out what had gone wrong and what changes were needed in polling methods. By lunchtime the day after the 2015 election, an equivalent inquiry had been established, to be chaired by a world-leading academic expert in research methods.
The post-1992 inquiry identified four key reasons why the polls had got the election wrong: very late switching by voters; demographically biased samples (polling then was still mainly face-to-face); politically unrepresentative samples; and a 'spiral of silence' among what became known as 'shy Tories', voters who didn't want to admit they were going to vote Conservative because they perceived it as an unfashionable, minority view.
There's nothing pollsters can really do to account for voters who don't decide until the last moment which party they're going to back – so pre-election polls will inevitably miss very late swing. But major changes were introduced to address the other issues. There was a wholesale switch to telephone sampling, which could produce more demographically representative samples.
A new past-vote weighting was introduced to ensure that samples were politically as well as demographically representative, and an adjustment formula was developed to counter spirals of silence, should these be evident in the data.
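As a concrete illustration of how past-vote weighting works: each respondent is rescaled so that the sample's recalled vote at the previous election matches the actual result. The following is a minimal Python sketch, with invented shares and a simplified data structure; it is not the formula any particular pollster used.

```python
from collections import Counter

# Actual vote shares at the previous election. These figures are
# invented for illustration, not real election results.
ACTUAL_PAST_SHARES = {"Con": 0.37, "Lab": 0.30, "LD": 0.24, "Other": 0.09}

def past_vote_weights(respondents):
    """Give each respondent a weight so that the sample's recalled
    past vote matches the known shares in ACTUAL_PAST_SHARES.

    respondents: list of dicts, each with a 'past_vote' key.
    Returns one weight per respondent, in the same order.
    """
    counts = Counter(r["past_vote"] for r in respondents)
    n = len(respondents)
    weights = []
    for r in respondents:
        observed_share = counts[r["past_vote"]] / n
        # Target share / observed share: over-represented groups are
        # weighted down, under-represented groups weighted up.
        weights.append(ACTUAL_PAST_SHARES[r["past_vote"]] / observed_share)
    return weights
```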
20 years of success
These changes worked. For over twenty years – spanning four general elections, five European Parliamentary elections and countless by-elections and mayoral elections – the post-1992 methodology delivered accurate results. We can see clearly now that at some point between the 2010 election (which the polls got about right) and the 2015 election (which they got seriously wrong) it ceased to do so.
The review of what went wrong will be exhaustive and rigorous, and it is much too soon to state with certainty what caused the errors. There is, though, clear evidence that 'shy Tories' were again part of the problem. In the weeks before the election, some polls had identified a group amounting to around 8% of the electorate who said they weren't going to vote Conservative but, when asked a series of further questions, eventually admitted that even though they didn't want to vote Conservative, they thought they were quite likely to end up doing so as the least bad option.
It is likely that many of these people did indeed end up voting for David Cameron's Conservatives but never reached the point of being willing to say so. It is possible that additional questions will be added to voting poll questionnaires in future to give more context to declared vote intentions.
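One way an adjustment of this kind can work, in the spirit of the post-1992 spiral-of-silence correction, is to reallocate some fraction of respondents who won't state an intention to the party they recall voting for last time. A hedged sketch, in which the reallocation fraction is an assumed parameter rather than a published figure:

```python
from collections import Counter

def adjusted_shares(respondents, reallocate_fraction=0.5):
    """Estimate party shares, adding back a fraction of respondents
    who refused to state an intention by assigning them to the party
    they recall voting for last time.

    respondents: list of dicts with 'intention' (party name or None)
    and 'past_vote' (party name or None) keys.
    reallocate_fraction: an assumed tuning parameter for illustration.
    """
    counts = Counter()
    for r in respondents:
        if r["intention"] is not None:
            counts[r["intention"]] += 1.0
        elif r["past_vote"] is not None:
            # 'Shy' respondents are assumed to revert to their old party.
            counts[r["past_vote"]] += reallocate_fraction
    total = sum(counts.values())
    return {party: count / total for party, count in counts.items()}
```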
The inquiry will also look at the way polls currently estimate likelihood to vote. Because people are conditioned to think that they should vote, they tend to overestimate their own probability of doing so. If only two thirds actually vote, which is what happened at the 2015 UK election and the one before it, then which two thirds makes a huge difference to the poll numbers.
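To see why, consider one common approach: weighting each respondent's stated intention by their self-reported likelihood of voting, typically on a 0–10 scale. A minimal sketch with invented respondent data:

```python
from collections import defaultdict

def turnout_weighted_shares(respondents):
    """Estimate party shares, weighting each respondent's intention
    by their self-reported likelihood of voting on a 0-10 scale.

    respondents: list of (party, likelihood) tuples.
    """
    totals = defaultdict(float)
    for party, likelihood in respondents:
        totals[party] += likelihood / 10.0  # treat the scale as a probability
    grand_total = sum(totals.values())
    return {party: total / grand_total for party, total in totals.items()}

# Illustrative only: equal raw support, but Conservative respondents
# report a higher likelihood of voting, producing a Conservative lead.
sample = [("Con", 9)] * 100 + [("Lab", 7)] * 100
print(turnout_weighted_shares(sample))  # {'Con': 0.5625, 'Lab': 0.4375}
```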
The inquiry will also consider whether online polls capture a wholly representative slice of the population, and whether those willing to participate in polls, whether by phone or online, are politically representative of demographically identical voters who choose not to take part.
The product of these investigations will be a period of experimentation, as new methodologies are devised and tested. It is possible we will end up adopting a fundamentally different approach, and, initially at least, different polling organisations may pursue radically different methodologies. And we won't know, until these post-2015 methods are tested in battle at elections over the next few years, whether the industry has succeeded in fixing the problems and restoring its record of accuracy, as it did the last time it got a general election wrong.
Baron Cooper of Windrush is founder of the research and strategy consultancy Populus, which regularly uses polling to take the pulse of the British public. He is also a Conservative life peer in the House of Lords and previously served as an advisor to David Cameron on issues such as gay marriage. You can find him on Twitter @AndrewCooper__