General Election polling: a matter of opinion?
According to the polling industry, the 2015 General Election was supposed to be the closest in years, making a hung parliament a virtual certainty. The average of polls consistently gave the Conservatives and Labour a roughly even vote share of 33% to 35%; but in the end, the Conservatives emerged with a lead of 6.5 percentage points and an overall majority in the Commons. How did the pollsters get it so wrong and what lessons might be learned for next time?
How did the polls fare?
Since General Election polling started in 1945 there have been a number of upsets, most notably in 1992, when the Conservatives' lead over Labour was underestimated by nearly 9 percentage points: the worst performance in UK polling history.
The polls were not quite as far off the mark this time. For the smaller parties, including the Liberal Democrats, UKIP, Greens and 'Others', they performed reasonably well. However, it was in predicting the crucial Conservative and Labour vote shares that the polls were largely mistaken.
The chart below summarises the results of nine of the major polling companies, which between them published over 200 polls in the campaign period.
While there is evidently variation between the different companies, they were consistent in overestimating Labour's share and underestimating that of the Conservatives.
Chart 1: Vote shares predicted by each polling company
The bars represent the range of vote shares (in percent) for Labour and the Conservatives predicted by each polling company during the campaign period, with the average (median) represented as the white point within this bar.
In Scotland, most pollsters had the SNP winning almost all seats, which turned out to be entirely accurate. The clearer trends in Scotland since the independence referendum in 2014 may help to explain why the same companies using the same methods had far greater success north of the border.
Where did it go wrong?
The purpose of an opinion poll is to offer a snapshot of what the population thinks at any given time. It aims to do this through a sample (typically around 1,000 people) that, by design, should be representative of the entire population. Consequently, there could be two broad sources of error for the 2015 polls:
- Voters changed their minds in large numbers on election day
- There was systematic bias in the sampling, and/or in the weighting applied to make the data representative and to account for respondents' likelihood of voting and honesty
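The scale of the second problem becomes clear from the standard margin-of-error calculation for a random sample. The short sketch below (the function name and the figures of 34% and 1,000 respondents are illustrative, not drawn from any specific poll) shows why random sampling noise alone cannot explain the miss: a single poll of 1,000 carries roughly a ±3-point margin, but averaging 200 such polls shrinks that to a fraction of a point, so a consistent error across all of them points to systematic bias rather than chance.

```python
import math

def margin_of_error(share_pct, n, z=1.96):
    """Approximate 95% margin of error, in percentage points,
    for a vote share estimated from a simple random sample of n people."""
    p = share_pct / 100.0
    return z * math.sqrt(p * (1 - p) / n) * 100

# One poll of 1,000 people putting a party on 34%
print(f"single poll:  +/- {margin_of_error(34, 1000):.1f} points")

# Averaging many independent polls shrinks the random error:
# pooling 200 polls of 1,000 behaves like one sample of 200,000
print(f"pooled polls: +/- {margin_of_error(34, 200 * 1000):.1f} points")
```

On these assumptions the single-poll margin is about ±2.9 points, while the pooled margin falls to about ±0.2 points: far too small to account for the polls' shared overestimate of Labour and underestimate of the Conservatives.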
A poll by Lord Ashcroft suggested that only 11% of voters made up their mind on election day, pointing towards sampling bias as a more likely cause of error. But drilling down to the exact source of this bias is likely to take some time.
The British Polling Council, the association that counts most major pollsters as its members, has announced an independent inquiry into the possible causes of this "apparent bias".
The errors of 1992 resulted in a substantial rethink of how to conduct polls of people's voting intentions: it seems likely that those of 2015 will have a similar impact on the industry.
Chart 2: vote share predicted by the polls in the year leading up to the 2015 General Election
The polls predicted the UKIP, SNP and Green party shares fairly accurately, but they were largely wrong about the balance of support between the Conservatives and Labour.