ORB have a new poll out tonight for the Independent showing a ten point lead for Leave: REMAIN 45%(-4), LEAVE 55%(+4). Changes are since their last comparable poll, all the way back in April. Unlike the weekly ORB telephone polls for the Telegraph, their less frequent polls for the Indy are done online – hence results that are far more pro-Brexit than their telephone poll earlier in the week. Even accounting for that, it shows the shift towards Leave that we've seen in many recent polls.

The ten point lead is large, but as ever, it is only one poll. Don’t read too much into it unless we see it echoed in other polling. As things stand most other online polls are still tending to show a relatively close race between Remain and Leave.

Also out today was a statement on some methodology changes from Ipsos MORI. As well as following their normal pre-election practice of filtering out people who aren't registered to vote, now that the deadline for registration has passed, from their poll next week they are also going to start quotaing and weighting by education, aimed at reducing the over-representation of graduates. MORI suggest that in their last poll the change would have reduced the Remain lead by 3 or 4 points.
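To illustrate the mechanics (not MORI's actual method or figures – the sample, targets and vote splits below are all invented), a post-stratification weight for education is simply the target share divided by the observed share:

```python
# Post-stratification weighting by education: a toy sketch.
# All figures are invented for illustration; they are not MORI's data.

def poststratify(respondents, target_shares):
    """Give each respondent a weight so the weighted education
    profile matches the target shares."""
    counts = {}
    for r in respondents:
        counts[r["education"]] = counts.get(r["education"], 0) + 1
    n = len(respondents)
    weights = {group: target_shares[group] / (counts[group] / n)
               for group in counts}
    for r in respondents:
        r["weight"] = weights[r["education"]]
    return respondents

def weighted_share(respondents, choice):
    """Weighted percentage of respondents choosing `choice`."""
    total = sum(r["weight"] for r in respondents)
    chose = sum(r["weight"] for r in respondents if r["vote"] == choice)
    return 100 * chose / total

# Toy sample: 50% graduates (versus a 30% population target),
# with graduates leaning Remain
sample = (
    [{"education": "degree", "vote": "Remain"} for _ in range(35)]
    + [{"education": "degree", "vote": "Leave"} for _ in range(15)]
    + [{"education": "no degree", "vote": "Remain"} for _ in range(22)]
    + [{"education": "no degree", "vote": "Leave"} for _ in range(28)]
)

poststratify(sample, {"degree": 0.30, "no degree": 0.70})
remain = weighted_share(sample, "Remain")  # down from 57% unweighted to ~51.8%
```

Down-weighting the surplus graduates pulls the Remain share in – the same direction as the effect MORI report, though the size here is purely a product of the invented numbers.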

While they haven’t yet decided how they’ll do it, in their article they also discuss possible approaches they might take on turnout. MORI include examples of modelling turnout based on people who say they are certain to vote and voted last time, or who say the referendum is important, or who say they usually vote, and so on. Exactly which one MORI end up opting for probably doesn’t make much difference: they all have a very similar impact, reducing the Remain share by a couple of points and increasing the Leave share by a couple of points.
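That sort of comparison can be sketched like this – the filters and figures below are invented, not MORI's, but they show how different turnout definitions can land on near-identical toplines:

```python
# Comparing turnout filters on an invented sample.
# None of these figures are from MORI's actual data.

def topline_remain(sample, include):
    """Remain share (as a %) among respondents passing the filter."""
    voters = [r for r in sample if include(r)]
    remain = sum(1 for r in voters if r["vote"] == "Remain")
    return round(100 * remain / len(voters), 1)

# Toy sample: Leave voters slightly likelier to say 10/10 certain
sample = (
    [{"vote": "Remain", "likelihood": 10, "voted_2015": True} for _ in range(38)]
    + [{"vote": "Remain", "likelihood": 10, "voted_2015": False} for _ in range(2)]
    + [{"vote": "Remain", "likelihood": 6, "voted_2015": False} for _ in range(10)]
    + [{"vote": "Leave", "likelihood": 10, "voted_2015": True} for _ in range(44)]
    + [{"vote": "Leave", "likelihood": 10, "voted_2015": False} for _ in range(1)]
    + [{"vote": "Leave", "likelihood": 8, "voted_2015": True} for _ in range(5)]
)

all_adults = topline_remain(sample, lambda r: True)              # 50.0
certain = topline_remain(sample, lambda r: r["likelihood"] == 10)
certain_and_voted = topline_remain(
    sample, lambda r: r["likelihood"] == 10 and r["voted_2015"]
)
```

In this toy data the two filters give Remain 47.1% and 46.3% respectively against 50.0% unfiltered – different definitions, very similar effect, mirroring what MORI describe.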

The combined effect of these changes is that the MORI poll this week is going to be better for Leave for methodological reasons alone. If it does show another shift towards Leave, take care to work out how much of that is because of the methodology change and how much is due to actual movement before getting too excited/distraught.


Opinium have a new EU referendum poll in the Observer. The topline figures are REMAIN 43%, LEAVE 41%, Don’t know 14%… if you get the data from Opinium’s own site (the full tabs are here). If you read the reports of the poll on the Observer website however the topline figures have Leave three points ahead. What gives?

I’m not quite sure how the Observer ended up reporting the poll as it did, but the Opinium website is clear. Opinium have introduced a methodology change (incorporating some attitudinal weights) but have also included what the figures would have been on their old methodology, to allow people to see the change over the last fortnight. So their proper headline figures show a two point lead for Remain. However, the methodology change improved Remain’s relative position by five points, so the poll actually reflects a significant move to Leave since their poll a fortnight ago, which showed a four point lead for Remain. If the method had remained unchanged we’d be talking about a move from a four point Remain lead to a three point Leave lead; on top of the ICM and ORB polls last week, that’s starting to look as if something may be afoot.

Looking in more detail at the methodology change, Opinium have added weights by people’s attitudes towards race and by whether people identify as English, British or neither. Both of these correlate with how people will vote in the referendum, and they clearly do make a difference to the result. The difficulty comes in knowing what to weight them to – while there is reliable data from the British Election Study face-to-face poll, race in particular is an area where there is almost certain to be an interviewer effect (i.e. if there is a difference between answers in an online poll and a poll with an interviewer, you can’t be at all confident how much of the difference is sample and how much is interviewer effect). That doesn’t mean you cannot or should not weight by it – most potential weights face obstacles of one sort or another – but it will be interesting to see how Opinium have dealt with the issue when they write more about it on Monday.
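Attitudinal weights like these are typically applied by raking (iterative proportional fitting) alongside the other weighting variables. A minimal sketch, with invented categories and targets rather than Opinium's actual ones:

```python
# Raking (iterative proportional fitting) over two attitudinal variables.
# Categories and target shares are invented for illustration.

def rake(respondents, targets, iterations=50):
    """Adjust weights until each variable's weighted shares match its targets."""
    for r in respondents:
        r["weight"] = 1.0
    for _ in range(iterations):
        for var, shares in targets.items():
            total = sum(r["weight"] for r in respondents)
            for category, share in shares.items():
                observed = sum(r["weight"] for r in respondents
                               if r[var] == category)
                factor = share * total / observed
                for r in respondents:
                    if r[var] == category:
                        r["weight"] *= factor
    return respondents

def weighted_margin(respondents, var, category):
    """Weighted proportion of respondents in a category."""
    total = sum(r["weight"] for r in respondents)
    return sum(r["weight"] for r in respondents if r[var] == category) / total

# Toy sample: too few "English" identifiers versus an (invented) 50% target
sample = (
    [{"identity": "English", "attitude": "A"} for _ in range(20)]
    + [{"identity": "English", "attitude": "B"} for _ in range(20)]
    + [{"identity": "British", "attitude": "A"} for _ in range(40)]
    + [{"identity": "British", "attitude": "B"} for _ in range(20)]
)
targets = {"identity": {"English": 0.5, "British": 0.5},
           "attitude": {"A": 0.5, "B": 0.5}}

rake(sample, targets)
```

The weighting mechanics are straightforward; as the paragraph above says, the hard part is choosing the target shares when the best available benchmark may itself carry an interviewer effect.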

It also leaves us with an ever more varied picture in terms of polling. In the longer term this will be to the benefit of the industry – hopefully some polls of both modes will end up getting things about right, and other companies can learn from and adapt whatever works. Different companies will pioneer different innovations, the ones that fail will be abandoned and the ones that work copied. That said, in the shorter term it doesn’t really help us work out what the true picture is. That is, alas, the almost inevitable result of getting it wrong last year. The alternative (all the polls showing the same thing) would only be giving us false clarity, the picture would appear to be “clearer”… but that wouldn’t mean it wasn’t wrong.



ComRes had a new EU telephone poll in this morning’s Daily Mail. Topline figures are REMAIN 52%(-1), LEAVE 41%(+3), Don’t know 7%(-2). Tabs are here.

Note that this poll is now adjusted for likelihood to vote, using ComRes’s turnout model based on socio-economic factors like age and class (the changes quoted take this into account). Adjusting turnout using ComRes’s model has marginally increased support for Remain – before the adjustment the figures would have been 51% and 41%.

There’s a broad assumption that differential turnout is more likely to favour Leave in the EU referendum campaign, largely based on the fact that polls normally show Leave voters claiming they are more likely to be 10/10 certain to vote, and that Leave voters are older. I’m not so sure. Self-reported likelihood is a blunt tool (people who say they are 10/10 certain to vote are not actually much more likely to vote than those who say 8/10 or 9/10), and the age skew that should favour Leave in terms of turnout (older people are more likely to vote, and more Leave) will to some degree be cancelled out by the social class and educational skews that favour Remain (middle class people and graduates are more likely to vote, and more Remain).
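A toy model shows how the two skews can largely offset each other. All the turnout probabilities and vote shares below are invented for illustration:

```python
# Demographic turnout model on invented data: older groups lean Leave and
# vote more; middle-class (ABC1) groups lean Remain and also vote more.

turnout = {
    ("65+", "ABC1"): 0.85, ("65+", "C2DE"): 0.75,
    ("18-34", "ABC1"): 0.65, ("18-34", "C2DE"): 0.45,
}

groups = [
    {"age": "65+", "class": "ABC1", "size": 25, "remain": 0.45},
    {"age": "65+", "class": "C2DE", "size": 25, "remain": 0.35},
    {"age": "18-34", "class": "ABC1", "size": 25, "remain": 0.65},
    {"age": "18-34", "class": "C2DE", "size": 25, "remain": 0.55},
]

def remain_share(groups, turnout_weighted):
    """Remain share (%), optionally weighting each group by its turnout rate."""
    total = remain = 0.0
    for g in groups:
        w = g["size"]
        if turnout_weighted:
            w *= turnout[(g["age"], g["class"])]
        total += w
        remain += w * g["remain"]
    return round(100 * remain / total, 1)

unadjusted = remain_share(groups, False)  # 50.0
adjusted = remain_share(groups, True)     # 48.7
```

The age skew on its own would cut Remain harder, but the class skew claws most of it back: the net effect here is just over a point, not the large pro-Leave shift the broad assumption implies.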

On the subject of education, YouGov also had an interesting article up today. Like Populus and ICM they have carried out parallel telephone and online surveys, but unlike other such tests which have found a big gulf between phone and online results YouGov found results that were very similar to each other: both phone and online polls found a small lead for Leave.

This wasn’t just down to the weighting (even before weighting, the raw sample was a lot more “leave” than the raw samples from other phone polls), suggesting it is something to do with the sampling. Obviously we can’t tell for certain what the reason is – the most obvious difference is that the YouGov poll was conducted over a fortnight, so it was slower than most telephone polls and there was more opportunity to ring back people who were unavailable on the first call – but there could be other differences to do with quotas or the proportion of mobile calls (the YouGov poll was about one-third mobile, two-thirds landline; my understanding is that most phone polls are about 50/50 now, though MORI is about 20/80).

Looking at the actual demographics of the sample, YouGov highlight the difference between their landline sample and the samples for the Populus paper on phone/online differences – specifically on education. In the Populus telephone samples between 44% and 46% of people had degrees, whereas the actual figure in the Census and the Annual Population Survey is around 30%. The YouGov phone sample had a lower proportion of people with degrees to begin with, and YouGov weighted it down to the national figure.

There is a clear correlation between education and attitudes to the EU referendum (in the YouGov polls there was a Leave lead of about 30 points among people who left school at 16, and a Remain lead of 33 points among those who were educated beyond the age of twenty – this is partly to do with age, but it remains true even among people of the same age), so if samples are too educated or not educated enough it could easily make a difference. As it is, we’ve only got education data for the Populus polling – we don’t know if there’s the same skew in other phone polls, or how much of a difference it would make if corrected – but different levels of education within achieved samples is a further hypothesis that could explain the ongoing difference between phone and online samples for the EU referendum.


Last year the election polls got it wrong. Since then most pollsters have made only minor interim changes – ComRes, BMG and YouGov have conducted the biggest overhauls, many others have made only tweaks, and all the companies have said they are continuing to look at further potential changes in the light of the polling review. In light of that I’ve seen many people assume that until changes are complete many polls probably still overestimate Labour support. While on the face of it that makes sense, I’m not sure it’s true.

The reason the polls were wrong in 2015 seems to be the samples were wrong. That’s sometimes crudely described as samples including too many Labour voters and too few Conservative voters. This is correct in one sense, but is perhaps describing the symptom rather than the cause. The truth is, as ever, rather more complicated. Since the polls got it wrong back in 1992 almost all the pollsters have weighted their samples politically (using how people voted at the last election) to try and ensure they don’t contain too many Labour people or too few Conservative people. Up until 2015 this broadly worked.

The pre-election polls were weighted to contain the correct number of people who voted Labour in 2010 and who voted Conservative in 2010. The 2015 polls accurately reflected the political make-up of Britain in terms of how people voted at the previous election; what they got wrong was how people voted at the forthcoming election. Logically, therefore, what the polls got wrong was not the people who stuck with the same party, but the proportions of people who changed their vote between the 2010 and 2015 elections. There were too many people who said they’d vote Labour in 2015 but hadn’t done so in 2010, too many people who voted Tory in 2010 but said they wouldn’t in 2015, and so on.
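Mechanically, past-vote weighting only pins down the previous election's profile – it constrains nothing about the switchers. A sketch with invented figures (treating non-voters as a group alongside the parties):

```python
# Political (past vote) weighting on an invented sample.
# The target shares below are made up for illustration (they include
# non-voters, "DNV"); they are not the real 2010 figures.

targets_2010 = {"Con": 0.26, "Lab": 0.21, "LD": 0.17, "Other": 0.08, "DNV": 0.28}

# Raw sample: slightly too many 2010 Labour voters, too few 2010 Tories
raw_counts = {"Con": 240, "Lab": 260, "LD": 160, "Other": 90, "DNV": 250}
n = sum(raw_counts.values())  # 1000

# One weight per past-vote group: target share / observed share
weights = {party: targets_2010[party] / (raw_counts[party] / n)
           for party in raw_counts}

weighted_counts = {party: raw_counts[party] * weights[party]
                   for party in raw_counts}
```

The weighted sample now matches the 2010 profile exactly – 260 past Tories, 210 past Labour voters, and so on. But nothing in the weighting touches how those people say they will vote next time, which is exactly the switching behaviour the 2015 polls got wrong.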

The reason for this is up for debate. My view is that it’s due to poll samples containing people who are too interested in politics, other evidence has suggested it is people who are too easy to reach (these two explanations could easily be the same thing!). The point of this post isn’t to have that debate, it’s to ask what it tells us about how accurate the polls are now.

The day after an election how you voted at the previous election is an extremely strong predictor of how you’d vote in an election the next day. If you voted Conservative on Thursday, you’d probably do so again on Friday given the chance. Over time events happen and people change their minds and their voting intention; how you voted last time becomes a weaker and weaker predictor. You also get five years of deaths and five years of new voters entering the electorate, who may or may not vote.

Political weighting is the reason why the polls in summer 2015 all suddenly showed solid Conservative leads when the same polls had shown the parties neck-and-neck a few months earlier – it was just the switch to weighting by May 2015 recalled vote**. In the last Parliament, polls were probably also pretty much right early on, when people’s 2010 vote correlated well with their current support, but as the Lib Dems collapsed and UKIP rose, scattering and taking support from different parties in different proportions, polls must have gradually become less accurate, ending with the faulty polls of May 2015.

What does it tell us about the polls now? Well, it means while many polling companies haven’t made huge changes since the election yet, current polls are probably pretty accurate in terms of party support, simply because it is early in the Parliament and party support does not appear to have changed vastly since the election. At this point in time, weighting samples by how people voted in 2015 will probably be enough to produce samples that are pretty representative of the British public.

Equally, it doesn’t automatically follow that we will see the Conservative party surge into a bigger lead as polling companies do make changes, though it does largely depend on the approach different pollsters take (methodology changes to sampling may not make much difference until there are changes in party support, methodology changes to turnout filters or weighting may make a more immediate change).

Hopefully it means that polls will be broadly accurate for the party political elections in May, the Scottish Parliament, Welsh Assembly and London Mayoral elections (people obviously can and do vote differently in those elections to Westminster elections, but there will be a strong correlation to how they voted just a year before). The EU referendum is more of a challenge given it doesn’t correlate so closely to general election voting and will rely upon how well pollsters’ samples represent the British electorate. As the Parliament rolls on, we will obviously have to hope that the changes the pollsters do end up making keep polls accurate all the way through.

(**The only company that doesn’t weight politically is Ipsos MORI. Quite how MORI’s polls shifted from neck-and-neck in May 2015 to Tory leads afterwards I do not know; they have made only a relatively minor methodological change to their turnout filter. Looking at the data tables, it appears to be something to do with the sampling – ICM, ComRes and MORI all sample by dialling random telephone numbers, but the raw data they get before weighting is strikingly different. Averaging across the last six surveys, the raw samples that ComRes and ICM get before weighting have equal numbers of people saying they voted Labour in 2015 and saying they voted Tory in 2015. MORI’s raw data has four percent more people saying they voted Conservative than saying they voted Labour – a much less skewed raw sample. Perhaps MORI have done something clever with their quotas or their script, but it’s clearly working.)


There are two new polls on the EU referendum out tonight – YouGov for the Times and ComRes for the Mail. YouGov have topline figures of REMAIN 37%, LEAVE 38%, DK/WNV 25%; ComRes have topline figures of REMAIN 51%, LEAVE 39%, DK 10%. ComRes’s poll was conducted Friday to Monday (so started before Cameron’s deal was finalised); YouGov’s was conducted Sunday to Tuesday, so came after Cameron’s renegotiation but straddled Boris Johnson’s endorsement of the Leave campaign.

As we’ve come to expect, there’s a sharp difference between the online YouGov poll and the telephone ComRes poll. Online polls on the referendum have tended to show a neck-and-neck race, telephone ones a lead for Remain. The level of support for leaving is actually pretty much the same regardless of mode – the difference seems to be in the proportion who say stay and the proportion who say don’t know (I speculated about that a little last month, here).

Anyway, while the different modes produce different shares, just as interesting is the direction of travel. YouGov’s previous poll was conducted just after the draft renegotiation had been published and showed a significant shift towards leave, giving them a nine point lead. My suspicion then was that it could just be a short-term reflection of the extremely bad press that the deal received in the papers, and that does appear to be the case – the race has tightened right back up again. A fortnight ago YouGov found 22% thought the draft renegotiation was a good deal, 46% a bad deal. That’s now closed to 26% good, and 35% bad. After a blip from the initial bad publicity over the draft deal, the effect according to YouGov seems broadly neutral.

ComRes’s last poll found a similar trend to YouGov – it was conducted after the draft deal had been published, and found a sharp shift towards Leave, with the remain lead dropping by ten points, from eighteen to eight. Today’s poll finds that negative reaction to the draft deal fading a bit now the final deal is done, with the remain lead creeping back up to twelve points. The net effect is still negative, but not by nearly as much as the early polls suggested. ComRes’s specific question on the renegotiation provides a more positive verdict than YouGov’s – among the three-quarters of the sample asked after the deal was struck 46% say it was a success, 39% a failure.

Note that this poll also represents the first outing for some methodology changes from YouGov. Most significantly, they’ve started sampling and weighting by the attention respondents say they pay to politics, have added educational qualifications as a sampling/weighting variable and have shifted up the top age bracket from 60 and over to 65 and over. Also, at the risk of getting very technical, past vote and grouped region are now interlocked (to explain – in the past YouGov weighted everyone’s past vote to match the overall shares of the vote in Great Britain, now they are weighting respondents in London’s past vote to match the shares of the vote in London, respondents in the Midlands’ past vote to match the shares in the Midlands and so on). There isn’t actually much impact on today’s results; the old sampling and weighting would also have shown the race tightening to neck-and-neck. The main difference is that a lot of questions have a higher number of don’t knows, reflecting the higher proportion of respondents who don’t follow politics closely.
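To illustrate what interlocking changes (the regions, targets and counts below are invented, and only two regions and two past votes are shown for brevity), each (region, past vote) cell gets its own weight, so a shortfall of past Conservative voters in London can no longer be papered over by a surplus elsewhere:

```python
# Interlocked past-vote-by-region weighting on invented data.
# Targets are each cell's (made up) share of the whole GB sample.

def interlocked_weights(sample, targets):
    """One weight per (region, past_vote) cell so each cell hits
    its target share of the whole sample."""
    n = len(sample)
    counts = {}
    for r in sample:
        cell = (r["region"], r["past_vote"])
        counts[cell] = counts.get(cell, 0) + 1
    return {cell: targets[cell] / (counts[cell] / n) for cell in counts}

targets = {
    ("London", "Con"): 0.20, ("London", "Lab"): 0.25,
    ("Midlands", "Con"): 0.30, ("Midlands", "Lab"): 0.25,
}

# Toy sample: London is short of past Tories (15 vs a target of 20),
# while the Midlands is spot on (30 vs 30)
sample = (
    [{"region": "London", "past_vote": "Con"} for _ in range(15)]
    + [{"region": "London", "past_vote": "Lab"} for _ in range(30)]
    + [{"region": "Midlands", "past_vote": "Con"} for _ in range(30)]
    + [{"region": "Midlands", "past_vote": "Lab"} for _ in range(25)]
)

weights = interlocked_weights(sample, targets)
```

With national-only past-vote weighting, the London shortfall of past Tories would be corrected by up-weighting past Tories everywhere, including the Midlands where none is needed; interlocking up-weights only the London cell.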

Full tables for ComRes are here, for YouGov here.