Next Friday is the public meeting of the British Polling Council inquiry into the failure of the polls at the 2015 election, at which point I expect we’ll get some insight into what the different polling companies are thinking, though probably not many firm conclusions yet.

In the meantime the British Election Study team have published some thoughts from Jon Mellon about what the BES data could tell us about why the polls were wrong. The piece doesn’t reach any conclusions yet, but it goes through a lot of the thought processes and ways of identifying what went wrong, which I suspect may reflect what many of the pollsters are doing behind closed doors.

As yet only the online BES data from during the campaign is available for download, but in time it will be joined by the post-election online recontact survey, the post-election face-to-face survey and voter validation data for the people interviewed face-to-face. The article has some thoughts about what can be learnt from the data that’s already available and what the parts still to come will add:

1) The BES campaign data appears to show some movement towards the Tories over the last couple of days of the campaign, though not one that is beyond the margin of error (there is a rough sketch of that calculation after this list). This is in contrast with YouGov’s daily polling data, despite both coming from the same panel. This is interesting, but as Jon says, the real proof will come when the BES publish their post-election data, showing whether people actually did change their minds from their pre-election answers.

2) If you only take people who said they were very likely to vote it would have been more Tory… but that’s very much a “Pope is Catholic” finding (a toy illustration of that sort of filter is sketched after this list). The interesting bit here is what the BES team plan on doing in the future – they are once again going to validate their face-to-face data against the marked electoral register, to see whether people who claim they voted genuinely did, and how well people’s stated intention to vote compares with whether they actually did. They are also going to match the online respondents against the electoral registers from before and after the switch to individual electoral registration, to see if drop-off from the registers under the new system was a factor.

3) Sampling and weighting. Jon hasn’t really said anything about the data so far – he’s waiting for the face-to-face probability sample, so he can compare it with the results from the online polling and see whether it comes significantly closer to the actual result (a toy example of the kind of demographic weighting online polls rely on is sketched after this list).

4) Don’t knows. According to Jon the people who said don’t know before the election were a mixed bunch – their attitudes towards the leaders, the issues and party ID did not point to them being obviously likely to break for the Conservatives or for Labour. Again, the interesting bit will be seeing how they say they ended up voting in the post-election wave.

5) “Shy Tories”. Jon makes two interesting points. One is about question order. While the BES campaign data came from YouGov’s panel, its results seemed to show a movement towards the Tories that the main YouGov data didn’t show – in his article Jon presents Peter Kellner’s hypothesis that this may be down to question order. As regular readers will know, the published voting intention polls all religiously ask voting intention first, but the BES asks some questions about the most important issues facing the country and about the party leaders before asking VI. However, Jon also mentions what he judges to be “weak” evidence against the “shy Tory” hypothesis – the BES included a grid of questions aimed at identifying people who tend to give socially desirable answers, and Conservative support was higher, not lower, amongst those people (a sketch of that sort of check is included after this list).
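A rough sketch of the margin-of-error point in (1). The sample size and vote share below are illustrative assumptions, not the BES’s actual daily base, and the formula is the textbook one for a simple random sample; real panel designs will have somewhat wider effective margins:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative figures only: a Conservative share of 34% on a wave of
# roughly 3,000 respondents
moe = margin_of_error(0.34, 3000)
print(f"±{moe * 100:.1f} points")  # about ±1.7 points, so a one or two
                                   # point drift sits within noise
```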
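And a toy version of the likelihood-to-vote filter in (2). The column names and values are invented for illustration; the real BES file uses its own variable codes:

```python
import pandas as pd

# Invented respondents: stated likelihood to vote on a 0-10 scale plus
# vote intention (hypothetical column names, not the BES variable codes)
df = pd.DataFrame({
    "likelihood_to_vote": [10, 10, 7, 10, 4, 9, 10, 2],
    "vote_intention": ["Con", "Lab", "Lab", "Con", "Lab",
                       "Con", "Con", "Lab"],
})

# Headline shares versus shares among the "certain to vote" (10/10) group
all_shares = df["vote_intention"].value_counts(normalize=True)
certain_shares = (df[df["likelihood_to_vote"] == 10]["vote_intention"]
                  .value_counts(normalize=True))

print(all_shares)      # unfiltered figures
print(certain_shares)  # typically a touch more Conservative
```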
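On the sampling and weighting point in (3), here is a deliberately simplified, single-variable version of the kind of demographic weighting online polls rely on. The age targets and respondents are invented, and real schemes weight on several variables at once:

```python
import pandas as pd

# Invented population targets for age (in practice drawn from the census
# or similar sources) and a small toy online sample
population_targets = {"18-34": 0.28, "35-54": 0.35, "55+": 0.37}

df = pd.DataFrame({
    "age_band": ["18-34", "35-54", "55+", "55+", "35-54", "18-34", "55+"],
    "vote_intention": ["Lab", "Con", "Con", "Lab", "Lab", "Lab", "Con"],
})

# Weight each respondent so the sample's age profile matches the targets
sample_shares = df["age_band"].value_counts(normalize=True)
df["weight"] = df["age_band"].map(
    lambda band: population_targets[band] / sample_shares[band]
)

# Weighted vote shares: the headline figures after this single correction
weighted = df.groupby("vote_intention")["weight"].sum() / df["weight"].sum()
print(weighted)
```

The question the face-to-face probability sample should help answer is whether this sort of correction is enough, or whether the people online panels recruit differ from the wider population in ways the weighting variables don’t capture.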
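Finally, the social desirability check mentioned in (5), assuming it amounts to comparing Conservative support between high and low scorers on such a scale (a simplification of whatever the BES actually did); the scores and answers below are invented:

```python
import pandas as pd

# Invented data: a social-desirability score (e.g. summed from a grid of
# agree/disagree items, higher = more prone to giving the "respectable"
# answer) alongside stated vote intention
df = pd.DataFrame({
    "social_desirability": [8, 3, 9, 2, 7, 4, 6, 1],
    "vote_intention": ["Con", "Lab", "Con", "Lab", "Con",
                       "Con", "Lab", "Lab"],
})

# Split at the median score and compare Conservative support in each half.
# If "shy Tories" were a big factor, Conservative support should look
# lower among respondents most prone to socially desirable answering.
df["high_scorer"] = (df["social_desirability"]
                     >= df["social_desirability"].median())
con_share = (df.assign(is_con=df["vote_intention"].eq("Con"))
               .groupby("high_scorer")["is_con"].mean())
print(con_share)  # per the article, the high scorers were, if anything,
                  # more Conservative, which cuts against the hypothesis
```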


151 Responses to “The British Election Study on why the polls were wrong”

  1. I agree with the commenter who asked that a summary of what is said at Friday’s meeting of the British Election Study, along with any handouts, should be posted on this site. It may be that the BES will wish to post such material on their own site, in which case I hope that links can be provided from this site to theirs.

    I am not sure that Jon Mellon’s thoughts are the only ones, or even the most important ones, relevant to why the polls may have gone wrong, although they are certainly apposite and interesting. One point I would add is that British opinion polls have been conducted for many years now on what have become traditional lines. I think, however, that there is now a need to start with as nearly as possible a “blank sheet” to identify which factors account for most of the variance when it comes to predicting voting behaviour.
