On Tuesday the BPC/MRS’s inquiry into why the polls went wrong publishes its first findings. Here’s what you need to know in advance.

The main thing the inquiry is looking at is why the polls were wrong. There are, essentially, three broad categories of problem. First, there could have been a late swing – the polls could actually have been perfectly accurate at the time, but people changed their minds. Secondly, respondents could have given inaccurate answers – people could have said they’d vote and not done so, said they’d vote Labour but actually voted Tory, and so on. Thirdly, the samples themselves could have been wrong – people responding to polls were honest and didn’t change their minds, but the pollsters were interviewing the wrong mix of people to begin with.

Some potential problems can straddle those groups. For example, polls could be wrong because of turnout, but that could be because pollsters incorrectly identified which people would vote or because polls interviewed people who are too likely to vote (or a combination of the two). You end up with the same result, but the root causes are different and the solutions would be different.

Last year the BPC held a meeting at which the pollsters gave their initial thoughts on what went wrong. I wrote about it here, and the actual presentations from the pollsters are online here. Since then YouGov have also published a report (writeup, report), the BES team have published their thoughts based on the BES data (write up, report) and last week John Curtice also published his thoughts.

The most common theme through all these reports so far is that sampling is to blame. Late swing has been dismissed as a major cause by most of those who’ve looked at the data. Respondents giving inaccurate answers doesn’t look like it will be a major factor in terms of who people voted for (it’s hard to prove anyway, unless people suddenly start being honest after the event, but what evidence there is doesn’t seem to back it up), though it could potentially be a contributory factor in how accurately people reported whether they would vote. The major factor, though, looks likely to be sampling – pollsters interviewing people who are too easy to reach, too interested in politics and engaged with the political process, and – consequently – getting the differential turnout between young and old wrong.

Because of the very different approaches pollsters use I doubt the inquiry will be overly prescriptive in terms of recommended solutions. I doubt they’ll say pollsters should all use one method, and the solutions for online polls may not be the same as the solutions for telephone polls. Assuming the report comes down to the polls getting it wrong because their samples were made up of people who were too easily contactable, too politically engaged and too likely to vote, I see two broad approaches to getting it right. One is to change the sampling and weighting in a way that gets more unengaged people – perhaps ringing people back more in phone polls, or putting some measure of political attention or engagement into sampling and weighting schemes. The other is to use post-collection filters, weights or models to get to a more realistic pattern of turnout. We shall see what the inquiry comes up with as the cause, and how far they go in recommending specific solutions.
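As a rough illustration of the second sort of approach, here is a toy sketch of post-collection cell weighting on a political-attention variable. The respondent data and target shares are invented for illustration – real pollsters use far more elaborate weighting schemes:

```python
from collections import Counter

# Toy sketch: a sample that over-represents politically engaged respondents.
# Target shares are assumed population figures, invented for this example.
sample = [
    {"attention": "high", "vote": "Con"},
    {"attention": "high", "vote": "Lab"},
    {"attention": "high", "vote": "Con"},
    {"attention": "low",  "vote": "Lab"},
]
target = {"high": 0.5, "low": 0.5}  # assumed real-world split

# Each group's weight scales its sample share up or down to the target share.
counts = Counter(r["attention"] for r in sample)
weights = {g: target[g] * len(sample) / counts[g] for g in counts}

def weighted_share(party):
    """Vote share for a party after applying the attention weights."""
    total = sum(weights[r["attention"]] for r in sample)
    won = sum(weights[r["attention"]] for r in sample if r["vote"] == party)
    return won / total
```

Unweighted, the Conservatives are on 50% of this toy sample; after the under-represented low-attention group is weighted up, their share falls to a third – the same mechanism by which a real weighting scheme would correct for samples that are too politically engaged.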

While the central plank of the inquiry will presumably be what went wrong, there were other tasks within the inquiry’s terms of reference. They were also asked to look at the issue of “herding” – that is, pollsters artificially producing figures that are too close to one another. To some degree a certain amount of convergence is natural in the run-up to an election, given that some of the differences between pollsters come from different ways of treating things like don’t knows. As the public make their minds up, these will cause less of a difference (e.g. if one difference between two pollsters is how they deal with don’t knows, it will make more of a difference when 20% of people say don’t know than when 10% do). I think there may also be a certain sort of ratchet effect – pollsters are only human, and perhaps we scrutinise our methods more if we’re showing something different from everyone else. The question for the inquiry is whether there was anything more than that – any deliberate fingers on the scales to make their polls match.

Finally the inquiry have been asked about how polls were communicated to the commentariat and the public – what sort of information is provided, and what guidance is given as to how they should be understood and reported. Depending on what the inquiry find and recommend, this area could actually be quite important for how polls are released and reported in the future. Again, we shall see what they come up with.

Survation had a new poll of Scottish voting intentions in the Holyrood election this week. As usual in the present Scottish political scene they show a towering SNP lead, with Labour second and the Conservatives in third. Constituency voting intentions are SNP 52%, LAB 21%, CON 16%, LDEM 7%; Regional list intentions are SNP 42%, LAB 20%, CON 16%, GRN 9%, LDEM 8%, UKIP 5%. Tabs are here.

Meanwhile the weekly ICM EU referendum tracker has started up again after the Christmas break. Their final poll of 2015 had been an unusual 50-50 split, but the latest poll has reverted to the norm – REMAIN 44%, LEAVE 38% (the equivalent, after don’t knows are excluded, of REMAIN 54%, LEAVE 46%). Full tabs are here.
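Excluding don’t knows is just a renormalisation: the decided shares are rescaled so they sum to 100%. A quick sketch (the function name is my own, not anything from ICM’s methodology):

```python
def exclude_dont_knows(remain, leave):
    """Rescale headline shares after dropping don't knows and refusals."""
    decided = remain + leave
    return round(100 * remain / decided), round(100 * leave / decided)

# ICM's headline REMAIN 44, LEAVE 38 rescales to roughly 54/46.
print(exclude_dont_knows(44, 38))  # (54, 46)
```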


Support or opposition to strike action is often largely influenced by people’s attitudes to the people going on strike and the inconvenience it causes them. If it’s a profession that people admire and think is generally hard done by they’ll sympathise, if it’s a profession that people don’t think much of they won’t. If the inconvenience it causes people is relatively minor, people will understand; if it really puts out large numbers of people, like school or tube closures, then sympathy is less forthcoming. The specific ins-and-outs of the dispute are often impenetrable or irrelevant. It’s who we trust, who is the good guy.

The public hold doctors in extremely high regard and, unless they happen to have had a hospital appointment today, it’s unlikely to cause most people any direct noticeable inconvenience, so you’d expect fairly high support. That’s what the polls show. Ipsos MORI had a new poll for yesterday’s Newsnight which found the public supported strike action emphatically (66% to 16%) when junior doctors would still provide emergency care, and much more narrowly (44% to 39%) if junior doctors would not provide emergency care either. Full tabs are here.

Late last year, before the initial round of strikes was postponed, YouGov found a similar pattern – people clearly supported strike action by 51% to 32% when junior doctors would still cover emergency treatment; when strike action would also cover emergency care, people were more evenly divided (45% to 37%). Tabs are here.

At present this breaks the way you would expect in an argument between politicians on one side, and trustworthy and overworked people who come to your rescue when you’re ill on the other. If strike action that also involves emergency care goes ahead though public opinion may become more finely balanced.

I’m just catching up on the YouGov London poll earlier in the week for LBC – full tabs are here. Last May Labour enjoyed a solid swing in their favour in London and ended up nine points ahead of the Tories, and they’ve largely maintained that support – YouGov’s London voting intention figures with changes from the general election are CON 37%(+2), LAB 44%(nc), LDEM 4%(-4), UKIP 11%(+3), GRN 2%(-3).
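For reference, the conventional two-party (Butler) swing is the average of one party’s gain and the other’s loss; on the changes above it works out at a one-point swing to the Conservatives:

```python
def butler_swing(con_change, lab_change):
    """Butler swing in points; positive values favour the Conservatives."""
    return (con_change - lab_change) / 2

# YouGov London changes since the general election: CON +2, LAB nc
print(butler_swing(2, 0))  # 1.0
```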

London mayoral voting intentions are KHAN 45%, GOLDSMITH 35%, WHITTLE 6%, BERRY 5%, PIDGEON 4%, GALLOWAY 2%. Sadiq Khan’s lead over Zac Goldsmith is slightly larger than the Labour lead, but not by very much. There are very few Tories saying they’d vote Khan or Labour voters saying they’d vote Goldsmith – essentially it looks like an electorate splitting along their normal partisan loyalties and in a city that tends to vote Labour that’s a good sign for Sadiq Khan.

In the last two mayoral elections Boris Johnson managed to reach out beyond the usual Conservative vote, but he is a rather unusual politician and it remains to be seen if Zac Goldsmith can do the same. It may be that current polls are just picking up people’s default partisan loyalties, and that as we get closer to the election people’s votes will become more influenced by their attitudes towards Goldsmith and Khan. If that doesn’t happen, Khan will have an obvious advantage in a city where Labour romped home in 2015 and where the direction of political movement is towards Labour.

Over the New Year the Times had an end-of-year YouGov poll, conducted in mid-December. The tables went up on the YouGov website today here. Topline figures were CON 39%, LAB 29%, LDEM 6%, UKIP 17%, GRN 3%. The rest of the poll, covering a lot of the trackers that YouGov used to ask on the regular daily polls, illustrates some of the real problems facing Labour as well as a couple of opportunities.

The net doing well/doing badly figures for the party leaders are minus 6 for David Cameron, minus 13 for Tim Farron, minus 18 for Nigel Farage and minus 32 for Jeremy Corbyn. Not long into the job Corbyn already has pretty dire figures (to be fair, they are up since YouGov last asked when it was minus 41 – albeit at the time of the Syria vote). On who would make the best Prime Minister David Cameron has a solid twenty-six point lead over Corbyn, on 49% to Corbyn’s 23%.

However, on any “best PM” question we need to keep in mind that Cameron probably won’t be there. Corbyn still trails behind the likely replacements for Cameron, but not by quite as much: head-to-head against Boris he would be fourteen points behind (Boris 43%, Corbyn 29%); head-to-head against Osborne he would be twelve points behind (Osborne 39%, Corbyn 27%).

Asked about which party they’d trust to handle the big issues of the day Labour are ahead on their reliable banker of the NHS (though by only seven points) and on housing (by five points). The two parties are essentially neck-and-neck on education (Con 28%, Lab 27%) and on immigration UKIP lead (29%, to the Tories on 24% and Labour on 15%). On law and order and on economic issues the Tories lead – on tax by 13 points, the economy in general by 23 points, on unemployment by 12 points.

Unemployment is an interesting one here. As I’ve written many times before, “best party on issues” questions tend to move in tandem: if the Conservatives improve on education, they also improve on tax, on housing, on transport and so on. Each party has strong and weak issues (so Labour will always do better on the NHS, the Conservatives will always do better on crime) but a lot of the change in the figures seems to reflect underlying perceptions of a party’s general competence, rather than its specific statements or policies on that subject. What is really interesting on these questions, therefore, is when a measure moves relative to the others – and over time unemployment appears to have done so. If you go back to old Gallup or MORI questions from the 1980s and 90s, under Thatcher and Major unemployment became an issue “owned” by the Labour party, up there with the NHS. Whatever their other failings, people trusted Labour with the issue of unemployment. A decade ago YouGov were giving the Labour party a lead of 13 points on the NHS, and 25 points on unemployment. Now it has switched over: the NHS is still a safe issue for Labour, but unemployment is an issue where people trust the Conservatives.

Responses to the other economic questions in the survey were also relatively optimistic (or, in the case of personal economic expectations, not as pessimistic as in the past). By 35% to 26% people now think the economy is in a good state, and 21% of people expect to be financially better off in the coming year, compared to 25% who expect to be worse off.

Improving economic expectations are a double-edged sword for the government, of course. On one level, George Osborne still has a lot of cuts to make to hit his targets, and if people think the economic problems of the country are solved it will be harder for him to sell those cuts to the public. Equally, as perceptions of the economy improve, people stop worrying about it. On the question of which issues are most important to the country, the economy was practically nailed on to the top spot for years after the financial crisis began in 2007-2008. For a while seventy to eighty percent of people regularly named it as a major issue facing the country. Then, as the economy improved, it started to fall. By 2014 it began to dip below immigration; now it’s down in third place behind health, with just a third naming it as an important issue. Despite the Conservative party’s current strong position in voting intention, UKIP and Labour are the parties people trust most on what they see as the two big issues facing the country, immigration and health. Then again, perhaps that just illustrates that it’s not really issues that drive voting intentions.

Finally YouGov are still finding a very close EU referendum race – 41% would vote to stay, 42% would vote to leave.