If I were TNS or Opinium I would be rather annoyed today. Looking through social media, Twitter and so forth, there are lots of comments about the polls all being wrong and it being a terrible night for the pollsters, etc, etc. Both TNS and Opinium had final call figures of REMAIN 49%, LEAVE 51% – within a point of the actual result. Far from being a terrible night, they got it pretty much spot on, and should be congratulated.
The last general election was a disaster for all the pollsters. Last night wasn’t the same at all: it was a very bad result for some pollsters, but other companies did very well. Below is a chart of the Leave lead in the final results of all the pollsters who did a poll in the last week or so.
The polls in blue were conducted online, the polls in orange by telephone. Note that ORB’s and Survation’s fieldwork both finished a few days before the referendum, so one cannot rule out a change in support in the days between their fieldwork and the vote itself. Disappointingly for me personally, YouGov’s final poll had Remain ahead, albeit only by two points. Unlike in May 2015, though, I’ve a good idea of what went wrong (the turnout model we used for the poll weighted down people who didn’t vote at the last general election, when in reality turnout ended up being higher than at the last general election), which is something that can be worked on.
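To make the turnout-model point concrete, here is a minimal sketch – with invented figures, not YouGov’s actual model or data – of how down-weighting respondents who didn’t vote at the last general election pulls the headline figure towards Remain when those non-voters lean Leave:

```python
# Hypothetical illustration of turnout weighting; all numbers are invented.

def weighted_share(respondents, turnout_weights):
    """Return the weighted Leave share, where each respondent's weight
    depends on whether they voted at the 2015 general election."""
    leave = total = 0.0
    for vote, voted_2015 in respondents:
        w = turnout_weights[voted_2015]
        total += w
        if vote == "Leave":
            leave += w
    return leave / total

# Toy sample of 100: past voters split 44 Leave / 46 Remain,
# while 2015 non-voters lean Leave (7 Leave / 3 Remain).
sample = ([("Leave", True)] * 44 + [("Remain", True)] * 46 +
          [("Leave", False)] * 7 + [("Remain", False)] * 3)

# Halving the weight of 2015 non-voters trims the Leave share to 50%...
low_turnout = weighted_share(sample, {True: 1.0, False: 0.5})

# ...whereas counting them in full, reflecting the higher actual turnout,
# nudges the weighted result to 51% Leave.
high_turnout = weighted_share(sample, {True: 1.0, False: 1.0})
```

With these toy numbers the difference is a single point, but it is enough to flip a 50–50 reading into a narrow Leave lead – the same direction of error described above.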
During much of the campaign, discussion of the polls focused on the gap between telephone and online polls. The division is, as ever, really not as simple as that – Populus showed the largest Remain lead and it was conducted online, and until they stopped polling a few weeks before the referendum, ICM’s telephone polls were showing figures as favourable to Leave as their online polls. However, the general trend was clear: online polls tended to show a closer race than telephone polls; online polls tended to show it neck-and-neck, telephone polls tended to show Remain clearly ahead.
Many media commentators bought into the view that phone polls were “better” in some way, and should carry more weight than online polls (a debate I sought to avoid as much as possible, as there really wasn’t good evidence either way). I suspect this has played into the perception that the polls as a whole were wrong. If you’ve spent the last few months focusing on the polls showing a solid Remain lead, and playing down the polls showing a neck-and-neck race, then you’d have been very surprised by last night.
The gap between online and phone polls narrowed during the campaign, and that was largely due to changes in online polls. The debate about the gap has focused largely on potential differences in sampling – studies like those of Matt Singh and Populus found that people gave different answers to questions on things like immigration and national identity in online and telephone polls, and that people in online samples seemed to be less socially liberal than people in telephone samples. In response, several online pollsters adopted things like attitudinal weights to make their samples more like phone samples… perhaps, in hindsight, it should have been the other way around.
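The attitudinal weighting described above works on the same principle as any quota or demographic weight: respondents are scaled so that the weighted answers to an attitude question match a chosen target distribution. Here is a minimal sketch with an invented question and invented targets, not any pollster’s actual scheme:

```python
# Illustrative attitudinal weighting; the question and target proportions
# are invented for the example.
from collections import Counter

def attitudinal_weights(answers, targets):
    """Return one weight per respondent so that the weighted distribution
    of `answers` matches the `targets` proportions."""
    counts = Counter(answers)
    n = len(answers)
    # Each respondent's weight = target share / observed share of their answer.
    return [targets[a] / (counts[a] / n) for a in answers]

# An online sample looking less socially liberal than the chosen target:
answers = ["liberal"] * 30 + ["conservative"] * 70
weights = attitudinal_weights(answers, {"liberal": 0.4, "conservative": 0.6})

# The weighted proportion of "liberal" respondents now matches the 40% target.
liberal_share = (sum(w for w, a in zip(weights, answers) if a == "liberal")
                 / sum(weights))
```

The choice of target is the crux: weighting an online sample towards the attitudes seen in phone samples assumes the phone samples were right – which, as the referendum result suggests, may have been the wrong way around.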
Since the polling errors of 2015 I’ve said that the problems won’t be solved overnight. Pollsters are experimenting with different methods; some of those things will work, some will not – it is a learning process. The record of polls conducted online is increasingly promising: the performance of the mostly online polls at the May elections was largely good, and most of the online polls for the EU referendum were either accurate or only a few points out. While the problems of 2015 are probably not entirely cured yet, online companies are showing clear progress; for some phone polls there is clearly still work to be done.