In Defence of Polling

2015 is unlikely to be remembered as a high point in opinion polling.

In the months since the election I’ve spoken at various events and appeared on various panels, and at almost every one there’s been a question from the audience along the lines of “Why should I ever believe what a poll says again?”. It’s normally from a middle-aged man, who looks very pleased with himself afterwards and folds his arms with his hands sort of tucked in his armpits. It may be the same man, stalking me. I fear he may be out in my front garden right now, waiting to pounce upon me with his question when I go to take the bins out.

Anyway, this is my answer.

Following the general election we pollsters took a lot of criticism, and that’s fair enough. If you get things wrong, you get criticised. The commentariat all spent months writing about hung Parliaments and SNP-Labour deals and so on, and they did it because the polls were wrong. The one bit of criticism that particularly grated with me, though, was a tweet (I can’t find it now, but I think it was from Michael Crick) saying that journalists should have talked to more people, rather than looking at polls.

And you know what, I thought to myself: maybe if you had talked to more people – if you had really ramped it up and talked not just to a handful of people in vox pops, but to thousands of people a day – maybe then you could have been as wrong as we were, because that’s exactly what we were doing.

Polling is seen as being all about prediction – however often we echo Bob Worcester’s old maxim that a poll is a snapshot, not a prediction, the public and the media treat polls as predictors, and we pollsters as some sort of modern-day augur. It isn’t: polling isn’t about prediction, it’s about measurement. Obviously there is a skill in interpreting what you have measured, and in measuring the right things in the first place, but ultimately the irreducible core of a poll is just asking people questions, and doing it in a controlled, quantifiable, representative way.

Polling about voting intention relies on a simple belief that the best way to find out how people are going to vote at a general election is to actually go and ask those people how they will vote at the general election. In that sense we are at one with whoever it was who wrote that tweet. You want to know how people will vote? Then talk to them.

The difference is that if you want to make that meaningful in any sense, you need to do it in an organised and sensible fashion. There is no point talking to 1000 people at, say, the Durham Miners’ Gala, or at Henley Regatta. There is no point only talking to people you can find within five minutes of the news studio. You are not going to get a proper picture if everyone you talk to is under 40 and white, or the sort of people walking round a shopping centre on a mid-week afternoon.

If you want to actually predict a general election based on talking to people, you’re going to have to make sure that the thousand people you talk to are properly reflective of all the people in Britain – that you’ve got the right number of people of different ages, genders, races and incomes, from every part of the country. And if you find that despite your best efforts you have too many men and too few women, or too many old people and too few young people, you need to put it right by giving more weight to the answers from the women or young people you do have.
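For the statistically curious, here is a minimal sketch of what that kind of corrective weighting looks like – a toy Python example with invented respondents and invented population targets, not any pollster’s actual method:

```python
# Toy example of demographic weighting: each respondent is weighted by
# their group's population share divided by its share of the achieved
# sample. All figures here are invented for illustration.
from collections import Counter

# (gender, vote) pairs for an imaginary sample of ten respondents
sample = [
    ("M", "Con"), ("M", "Lab"), ("M", "Con"), ("M", "Lab"),
    ("M", "Con"), ("M", "Con"), ("F", "Lab"), ("F", "Con"),
    ("F", "Lab"), ("F", "Lab"),
]

population_share = {"M": 0.49, "F": 0.51}  # assumed census-style targets

# The achieved sample is 60% men and 40% women...
counts = Counter(gender for gender, _ in sample)
sample_share = {g: n / len(sample) for g, n in counts.items()}

# ...so men are weighted down and women weighted up to match the targets
weights = {g: population_share[g] / sample_share[g] for g in counts}

# Weighted voting intention
tally = Counter()
for gender, vote in sample:
    tally[vote] += weights[gender]
total = sum(tally.values())
for party, weighted_votes in sorted(tally.items()):
    print(f"{party}: {100 * weighted_votes / total:.1f}%")
```

Real polls weight on several variables at once (often iteratively, so-called rim weighting), but the principle is exactly this: an under-represented respondent counts for a bit more than one, an over-represented one for a bit less.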

And that’s it. Ultimately all pollsters do is ask people questions, and try and do it in the fairest, most controlled and representative way we can. Anyone saying all polls are wrong or they’ll never believe a poll again is essentially saying it is impossible to find out how people will vote by asking them. It may be harder than you’d think, it may face challenges from declining response rates, but there’s no obviously better way of finding the information out.

No pollster worth his or her salt has ever claimed that polls are infallible or perfect. Self-evidently they aren’t, as the industry had already got it wrong in 1970 and 1992. People can and do change their minds between a poll being conducted and the election (sometimes even between the final polls and the election). Lots of people don’t know how they’ll vote yet. Some people do lie, and sometimes people can’t answer a question because they don’t really know themselves, or don’t really have an opinion and are just trying to be helpful.

Pollsters know the limitations of what we do. We know a poll years out can’t predict an election. We know there are things that you really can’t find out by asking people straight (don’t get me started on “will policy X make you more likely to vote for Y?”). Our day-to-day job is often telling people the limitations of our product, saying what we can’t do and what we can’t tell. If anything, the pendulum had swung too far before the election – some people put too much faith in polls when we know they are less than perfect. I obviously hope the industry will sort out and solve the problems of this May (more on that here, for those who missed it). It’s a good thing that in 2020 journalists and pundits will caveat their analysis of the polling and the election with an “if the polls are right” and mean it, rather than just assuming they will be. For all its limitations though, opinion polling – that is, asking people what they think in a controlled and representative way – is probably the best measure we have, and a useful thing to have.

Public opinion matters because we are a democracy, because people’s opinions drive how they vote and how they vote determines who governs us. Because it matters, it’s worth measuring.

What would it be like without polling? Well, I’m not going to pretend the world would come to a shuddering halt. There are people who would like to see less polling because they think it would lead to a better press and better political reporting. The argument goes that political reporting concentrates too much on the horse race and the polling figures and not enough on policies. I don’t think a lack of polls would change that – if anything it would give more room for speculation. The recent Oldham by-election gave us an example of what our elections would be like without polls: still a focus on who was going to win and what the outcome might be, except informed only by what campaign insiders were saying, what people picked up on the doorstep and what “private polls” were supposedly saying. Dare I whisper it after all the opprobrium, but perhaps if Lord Ashcroft had done one of his constituency polls early on and found UKIP were, in fact, a country mile behind Labour, the reporting of the whole by-election might have been a tad better.

There was a time (up to the point that the Times commissioned YouGov to do a public poll) in the Labour leadership election when it looked like it might be the same – that journalists would be reporting the campaign solely on what campaigns claimed their figures were showing, on constituency nomination figures and on a couple of private polls that Stephen Bush had glimpsed. While it may have been amusing if the commentariat had covered the race as if it were between Burnham and Cooper, only to find Corbyn a shock winner, I doubt it would have served democracy or the Labour party well. Knowing Corbyn could win meant he was scrutinised rather than being treated as the traditional left-wing also-ran; it meant Labour members could cast their votes in the knowledge of the effect they might have, and vote tactically for or against a candidate if they wished.

And these are just election polls. To some extent there are other ways of measuring party support – claimed canvassing returns, models based upon local by-election results, betting markets. But what about other issues – should we legalise euthanasia? Should we bomb Syria? I might make my living measuring public opinion, but for what it’s worth I’m a grumpy old Burkean who is quite happy for politicians to ignore public opinion and employ their own good judgement. Some people (and some politicians themselves) do think politicians should reflect what their voters want, though, and that means they need some half-decent way of measuring it. That is, unless you’d like them to measure it by who can fill in the most prewritten “write to your MP” letters, turn out the largest number of usual suspects on a march, or manipulate the most clicks on a voodoo poll.

In 2016 we will have the results of the BPC inquiry, we’ll see what different methods the pollsters adopt to address whatever problems are identified, and we’ll have at least three elections (London, Scotland and Wales) and possibly a European referendum to see if they actually work. Strictly speaking those contests won’t tell us whether the polls have solved their problems or not (the polling in Scotland and Wales in May was actually fine anyway, and referendum polling presents its own unique problems), but we will be judged upon them nevertheless. We shall see how it pans out. In the meantime, despite the many difficulties in getting a representative sample of the British public, I still think those difficulties are surmountable, and that ultimately it’s still worth trying to find out and quantify what the public think.


ICM released their final monthly voting intention poll of 2015 yesterday, with topline figures of CON 39%, LAB 34%, LDEM 7%, UKIP 10%, GRN 3%. I assume it’s the last voting intention poll we will see before Christmas. The full tables are here, where ICM also make an intriguing comment on methodology. They write,

For our part, it is clear that phone polls steadfastly continue to collect too many Labour voters in the raw sample, and the challenge for phone polling is to find a way to overcome the systematic reasons for doing so. The methodological tweaks that we have introduced since the election in part help mitigate this phenomenon by proxy, but have not overcome the core challenge. In our view, attempting to fully solve sampling bias via post-survey adjustment methods is a step too far and lures the unsuspecting pollster into (further) blasé confidence. We will have more to say on our methods in the coming months.



Opinium have a new poll in today’s Observer – topline figures are CON 38%, LAB 30%, LDEM 5%, UKIP 16%, GRN 5%. Tabs are here. The rest of the poll largely concentrated on leadership questions. Cameron’s approval rating stands at minus 6, Corbyn at minus 25, Farage at minus 18 and Farron at minus 22 (though over half of respondents said “don’t know” on Farron). Net favourable-versus-unfavourable ratings were similar to job approval – Cameron -5, Corbyn -28, Farage -21, Farron -19.

Asked about the specific qualities of the leaders, David Cameron’s strongest ratings were on being decisive (+5), having the nation’s interests at heart (+3), being a strong leader (+8), getting things done (+11) and standing up for Britain abroad (+4). His biggest weakness, as you will almost certainly have guessed, was being in touch with ordinary people (-34). After five years as Prime Minister and a decade as Tory leader, we know how Cameron is perceived by the public: an effective national leader, but posh and out of touch.

When respondents rated Jeremy Corbyn on the same measures, his best scores came on sticking to his principles (+32) and being in touch with ordinary people (-2). His ratings elsewhere are worse, particularly on being a strong leader, getting things done and standing up for Britain abroad (though the last two are a little unfortunately worded – one could have answered them in the context of Corbyn not being able to get things done because he’s not in government).

On who would make the best Prime Minister, David Cameron leads Corbyn by 41% to 20%. With Cameron stepping down before the next general election this match-up is never going to happen, though – when Opinium asked the same question with David Cameron’s potential successors the figures were far closer: Osborne 27%, Corbyn 24%; May 29%, Corbyn 23%; Boris 34%, Corbyn 23%. The Tory party don’t love David Cameron, but electorally they may miss him when he’s gone.

Earlier in the week there was also the monthly ComRes telephone poll for the Daily Mail. Topline figures there were CON 37%, LAB 33%, LDEM 7%, UKIP 11%, GRN 5%. These are good figures for Labour by the standards of ComRes, who since introducing their new socio-economic turnout model have shown the largest Conservative leads, typically around eleven points. Of course, it is just one poll, so all the usual caveats apply… it may herald a narrowing of the polls, or it may just be random sample variation, with a return to more typical figures next month. Full tabs are here.


Yesterday there were two EU referendum polls showing the race essentially neck-and-neck. Today there are two more EU referendum polls, but both of these have REMAIN with a solid-looking lead of twenty points or more. ComRes for OpenEurope have topline EU voting intention figures of REMAIN 56%, LEAVE 35% (tabs are here). Ipsos MORI for the Standard have topline figures of REMAIN 58%, LEAVE 32% (full tabs are here).

Note that MORI asked the referendum question as a split sample. Half the sample were asked how they would vote in a referendum – stay in or get out (MORI’s long-term tracker question); the other half were asked the actual referendum question. The stay-in-or-get-out question had a split of 53%-36%, while the actual referendum question produced a bigger lead for staying in, 58%-32%. Wherever possible I am using questions that use the actual referendum wording, so those are the figures that have gone into my EU referendum tracking data here.

The difference between EU referendum polls appears to be a gap between online and telephone polling. It’s always difficult to be certain of course – there are many differences between different companies’ approaches and there haven’t been that many telephone polls – but the phone polls from ComRes and MORI are averaging around REMAIN 55%, LEAVE 35%, DON’T KNOW 10%, while the online polls from ICM, YouGov, ComRes and Survation are averaging around REMAIN 43%, LEAVE 40%, DON’T KNOW 18%. The telephone polls have “remain” substantially higher and, intriguingly, “don’t know” substantially lower. As ever, it’s difficult to be confident what the reasons are – it could be a difference in sampling (if for some reason online or telephone samples reach respondents who are substantially more or less pro-European) or it could be an interviewer effect (if people are less willing to tell a human interviewer that they would vote to leave, or that they haven’t yet decided).
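To make the comparison concrete, this little sketch shows how such a mode-by-mode average might be computed – the individual poll figures below are illustrative placeholders, not the actual ComRes, MORI, ICM, YouGov or Survation numbers:

```python
# Average REMAIN / LEAVE / DON'T KNOW by fieldwork mode for a set of
# polls. The numbers are illustrative, not real poll results.
from statistics import mean

polls = [             # (mode, remain %, leave %)
    ("phone", 56, 35),
    ("phone", 58, 32),
    ("online", 42, 41),
    ("online", 43, 39),
    ("online", 44, 40),
]

for mode in ("phone", "online"):
    remain = [r for m, r, _ in polls if m == mode]
    leave = [l for m, _, l in polls if m == mode]
    dk = [100 - r - l for m, r, l in polls if m == mode]
    print(f"{mode:>6}: REMAIN {mean(remain):.0f}%, "
          f"LEAVE {mean(leave):.0f}%, DK {mean(dk):.0f}%")
```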

Meanwhile the monthly MORI voting intention figures were CON 38%, LAB 31%, LDEM 9%, UKIP 9%.


A quick update on EU referendum polling. The regular weekly ICM poll today has topline figures of REMAIN 42%, LEAVE 41%. This is closer than ICM have been showing of late – typically they’ve had REMAIN with a lead of around six points – but as ever, it’s nothing that could not be explained by normal sample variation. Wait and see whether their poll next week backs it up (tabs are here).
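As a rough illustration of how much “normal sample variation” can cover, here is a back-of-the-envelope Python sketch using the textbook simple-random-sampling formula – the sample size of 2,000 is my assumption for illustration, and real polls have design effects that make the true uncertainty somewhat larger:

```python
# 95% margin of error on a single estimated proportion, assuming simple
# random sampling. Real polls are weighted samples, so this understates
# the true uncertainty a little.
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the 95% confidence interval for proportion p on n interviews."""
    return z * math.sqrt(p * (1 - p) / n)

p_remain, n = 0.42, 2000  # assumed sample size, for illustration only
moe = margin_of_error(p_remain, n)
print(f"REMAIN 42% has a margin of error of +/- {100 * moe:.1f} points on n={n:,}")
```

On those assumptions a single share can move by a couple of points through chance alone, and a lead (the gap between two shares) bounces around rather more, so a slip from a six-point REMAIN lead to one point is well within the realm of sampling noise.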

The second poll is by Survation for the Europe of Freedom and Direct Democracy group (i.e. UKIP’s group in the European Parliament). While this is newly released, the fieldwork was actually conducted a fortnight ago (30th Nov – 3rd Dec). Topline figures there are REMAIN 40%, LEAVE 42%. After their previous poll showed Remain ahead, Survation are back to showing Leave in the lead. (Full tabs are here).