We now have results from every constituency but Thirsk and Malton, where the election was delayed because of the death of a candidate. The final results in Great Britain are CON 37%, LAB 30%, LDEM 24%, Others 10%. Seats are Conservatives 306, Labour 258, Liberal Democrats 57, Others 28. The MORI/NOP exit poll, despite initial scepticism when it showed the almost total disappearance of the Lib Dem surge, turned out to be pretty much spot on. By the same token, however, it means the final polls tended to call the Lib Dems wrongly.
(UPDATE – This table includes only those companies that polled in the final 48 hours before the election – genuine eve-of-election polls. Three companies polled during the campaign but did not produce eve-of-election polls: RNB, BPIX and OnePoll all carried out their final polls on or over the final weekend of the campaign, meaning that strictly speaking they cannot be compared to the final result. For the record, however, RNB did very well indeed with figures of CON 37%, LAB 28%, LDEM 26%; BPIX had CON 34%, LAB 27%, LDEM 30%; and OnePoll produced a horrific CON 30%, LAB 21%, LDEM 32%. These three pollsters were also the only ones in the campaign not to fully disclose methodology and data tables, so we can draw little in the way of conclusions on what they got wrong or right.)
ICM was the closest to the final result, and was within 2 points for every party. Almost all the pollsters were within the margin of error for the Conservatives and Labour (the exceptions being TNS and Angus Reid), and for the first time in decades pollsters were underestimating Labour support! The average error of all but two of the pollsters was within the margin of error. However, this disguises the issue of the Liberal Democrats: every single pollster overestimated their support, by between 2 and 5 points. Something was wrong here (interestingly enough, the closest poll of all was the penultimate YouGov poll that showed the Lib Dems down at 24, which at the time looked like a rogue to me when Wednesday’s poll showed them bouncing straight back. Perhaps it was indicating something after all).
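For anyone wanting to replicate the "average error" yardstick used above, here is a minimal Python sketch. It uses only figures quoted in this post: the final GB shares (CON 37, LAB 30, LDEM 24) and, purely as worked examples, the three late non-eve-of-election polls; the function simply averages each poll's absolute deviation from the result across the three main parties.

```python
# Final Great Britain vote shares, as quoted above.
FINAL = {"CON": 37, "LAB": 30, "LDEM": 24}

# Example polls (the three late, non-eve-of-election polls quoted above).
POLLS = {
    "RNB":     {"CON": 37, "LAB": 28, "LDEM": 26},
    "BPIX":    {"CON": 34, "LAB": 27, "LDEM": 30},
    "OnePoll": {"CON": 30, "LAB": 21, "LDEM": 32},
}

def mean_abs_error(poll):
    """Average absolute deviation from the final result, across parties."""
    return sum(abs(poll[party] - FINAL[party]) for party in FINAL) / len(FINAL)

for name, figures in POLLS.items():
    print(f"{name}: average error {mean_abs_error(figures):.2f} points")
```

On these numbers RNB comes out with an average error of about 1.3 points, BPIX 4 points and OnePoll 8 points, which matches the verdicts given above ("very well indeed" versus "horrific").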
Before the election, most of the comments here expressing scepticism about the polls came from people saying they were underestimating the Conservatives (on average they did slightly, but not by much; ComRes and Populus got the level of Tory support spot on), or that the polls couldn’t cope with the huge surge of new support for the Liberal Democrats from new voters and were underestimating it. Reality turned out to be the opposite – the Lib Dem surge was an illusion that vanished when people arrived at the ballot box. We’ll get a better idea over the next few weeks as pollsters look at their data and recontact people they interviewed before the election to see how they actually voted. The basic question, though, will be whether the Lib Dem boost was a genuine surge of support that reversed at the last minute – after all, a lot of respondents were saying they might change their mind – or whether it was never really there to begin with and the pre-election polls were wrong.
Ben Page and Martin Boon have both already commented to Research Live – Ben says “On our final poll for the Evening Standard on Wednesday, we had 40% of Lib Dems saying they might change their mind. We’ll all want to look and see what we can do about soft support for the Lib Dems, we’ll have to find a rational and reasonable way of dealing with it rather than just saying Lib Dems tend to overstate. We will all be looking at certainty of vote, voting history – the surge was partly younger people – and late switching, things like that. The Lib Dems were most likely to say they would vote tactically. So the support was there but it didn’t actually manifest itself in votes on the day – Lib Dem support was slowly deflating after initial Clegstacy and on the day fell further.”
Martin said “There are some sizeable average errors out there and we all do need to take a look at our methods. Clearly all polling companies have overstated the Lib Dems, so there has to be something consistent going on. It would be a little bit premature to consider the reasons for this but it’s up to the opinion pollsters to see why it might have been the case. We’re always testing our methods and this is the best time to be looking at methodologies, assumptions and techniques in order to improve them in the future.”
The first part of the post mortem really needs to be for pollsters to re-contact people they interviewed in their final polls. Did people who said they’d vote Lib Dem change their minds at the last minute (in which case it’s late swing)? Did they not vote at all (in which case, perhaps pollsters need to work on more sensitive methods of predicting likelihood to vote)? Or will they claim they did go ahead and vote Lib Dem, meaning there was a sampling problem – if so, perhaps it’s down to the Lib Dem support being the least well correlated with past vote or party ID. Did don’t knows split disproportionately against the Liberal Democrats? One possibility that strikes me is whether it could essentially be the opposite of the spiral of silence – a spiral of enthusiasm, perhaps! Political pollsters are used to worrying about people being embarrassed to admit voting for unpopular parties, and have come up with ways of dealing with it, but having it suddenly become hugely fashionable to support a party is a new problem. At least it’s one that is unlikely to recur too often ;)
Once I get a nice spreadsheet of results, it will also be interesting to see how accurate the marginals polling was. Right now, however, I’m going to catch up on some sleep. I expect there will be some polls in the Sunday papers asking exactly who the public think should be the new Prime Minister – so until then…