
In Defence of Polling
2015 is unlikely to be remembered as a high point in opinion polling.
In the months since the election I’ve spoken at various events and appeared on various panels, and at almost every one, at some point, there’s been a question from the audience along the lines of “Why should I ever believe what a poll says ever again?”. It’s normally from a middle-aged man, who looks very pleased with himself afterwards and folds his arms with his hands sort of tucked in his armpits. It may be the same man, stalking me. I fear he may be out in my front garden right now, waiting to pounce upon me with his question when I go to take the bins out.
Anyway, this is my answer.
Following the general election we pollsters took a lot of criticism, and that’s fair enough. If you get things wrong, you get criticised. The commentariat all spent months writing about hung Parliaments and SNP-Labour deals and so on, and they did it because the polls were wrong. The one bit of criticism that particularly grated with me, though, was a tweet (I can’t find it now, but I think it was from Michael Crick) saying that journalists should have talked to more people, rather than looking at polls.
And you know what, I thought to myself: maybe if you had talked to more people, maybe if you had really ramped it up and talked not just to a handful of people in vox pops but to thousands of people a day, then you could have been as wrong as we were. Because that’s exactly what we were doing.
Polling is seen as being all about prediction – however often we echo Bob Worcester’s old maxim of a poll being a snapshot, the public and the media treat them as predictors, and we pollsters as some sort of modern-day augurs. It isn’t: polling isn’t about prediction, it’s about measurement. Obviously there is a skill in interpreting what you have measured, and in measuring the right things in the first place, but ultimately the irreducible core of a poll is just asking people questions, and doing it in a controlled, quantifiable, representative way.
Polling about voting intention relies on a simple belief that the best way to find out how people are going to vote at a general election is to actually go and ask those people how they will vote at the general election. In that sense we are at one with whoever it was who wrote that tweet. You want to know how people will vote? Then talk to them.
The difference is that if you want to make that meaningful in any sense, you need to do it in an organised and sensible fashion. There is no point talking to 1,000 people at, say, the Durham Miners’ Gala, or at Henley Regatta. There is no point only talking to people you can find within five minutes of the news studio. You are not going to get a proper picture if everyone you talk to is under 40 and white, or the sort of person you find walking round a shopping centre on a midweek afternoon.
If you want to actually predict a general election based on talking to people, you’re going to have to make sure that the thousand people you talk to are properly reflective of all the people in Britain: that you’ve got the right number of people of different ages, genders, races and incomes, from every part of the country. And if you find that despite your best efforts you have too many men and too few women, or too few young people and too many old people, you need to put it right by giving more weight to the answers from the women or young people you do have.
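For the curious, here is a minimal sketch in Python of what that weighting looks like, using made-up numbers rather than real poll or census figures (real pollsters weight on many variables at once, usually with something like rim weighting, but the principle is the same): each respondent simply counts for their group’s population share divided by its sample share.

```python
from collections import Counter

# Hypothetical raw sample of 1,000 respondents: (gender, vote) pairs.
# The numbers are invented for illustration, not real poll data.
sample = (
    [("man", "blue")] * 360 + [("man", "red")] * 240 +      # 600 men (60%)
    [("woman", "blue")] * 180 + [("woman", "red")] * 220    # 400 women (40%)
)

# Assumed target shares for the population being represented.
population_share = {"man": 0.49, "woman": 0.51}

n = len(sample)
group_counts = Counter(gender for gender, _ in sample)

# Each respondent's weight is population share / sample share for their group,
# so over-represented groups count for less and under-represented ones for more.
weights = {g: population_share[g] / (group_counts[g] / n) for g in population_share}

# Unweighted vote shares
raw = Counter(vote for _, vote in sample)

# Weighted vote shares
weighted = Counter()
for gender, vote in sample:
    weighted[vote] += weights[gender]

for vote in sorted(raw):
    print(f"{vote}: raw {raw[vote] / n:.1%}, "
          f"weighted {weighted[vote] / sum(weighted.values()):.1%}")
```

In this invented sample men are over-represented and lean one way, so the weighted figures shift a couple of points from the raw count, which is exactly the correction the weighting is meant to make.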
And that’s it. Ultimately all pollsters do is ask people questions, and try and do it in the fairest, most controlled and representative way we can. Anyone saying all polls are wrong or they’ll never believe a poll again is essentially saying it is impossible to find out how people will vote by asking them. It may be harder than you’d think, it may face challenges from declining response rates, but there’s no obviously better way of finding the information out.
No pollster worth his or her salt has ever claimed that polls are infallible or perfect. Self-evidently they aren’t: the polls had already got it wrong in 1970 and 1992. People can and do change their minds between a poll being conducted and the election (sometimes even between the final polls and the election). Lots of people don’t yet know how they’ll vote. Some people do lie; sometimes people can’t answer a question because they don’t really know themselves, or don’t really have an opinion and are just trying to be helpful.
Pollsters know the limitations of what we do. We know a poll years out can’t predict an election. We know there are things you really can’t find out by asking people straight (don’t get me started on “will policy X make you more likely to vote for Y?”). Our day-to-day job is often telling people the limitations of our product, saying what we can’t do and what we can’t tell. If anything, the pendulum had swung too far before the election – some people did put too much faith in polls when we know they are less than perfect. I obviously hope the industry will sort out and solve the problems of this May (more on that here, for those who missed it). It’s a good thing that in 2020 journalists and pundits will caveat their analysis of the polling and the election with an “if the polls are right” and mean it, rather than just assuming they will be. For all its limitations though, opinion polling – that is, asking people what they think in a controlled and representative way – is probably the best we have, and it’s worth having.
Public opinion matters because we are a democracy, because people’s opinions drive how they vote and how they vote determines who governs us. Because it matters, it’s worth measuring.
What would it be like without polling? Well, I’m not going to pretend the world would come to a shuddering halt. There are people who would like to see less polling because they think it would lead to a better press and better political reporting. The argument goes that political reporting concentrates too much on the horse race and the polling figures and not enough on policies. I don’t think a lack of polls would change that; if anything it would give more room for speculation. The recent Oldham by-election gave us an example of what our elections would be like without polls: still a focus on who was going to win and what the outcome might be, except informed only by what campaign insiders were saying, what people picked up on the doorstep and what “private polls” were supposedly saying. Dare I whisper it after all the opprobrium, but perhaps if Lord Ashcroft had done one of his constituency polls early on and found UKIP were, in fact, a country mile behind Labour, the reporting of the whole by-election might have been a tad better.
There was a time in the Labour leadership election (up to the point that the Times commissioned YouGov to do a public poll) when it looked like it might be the same: that journalists would be reporting the campaign solely on what the campaigns claimed their figures were showing, on constituency nomination figures and on a couple of private polls that Stephen Bush had glimpsed. While it might have been amusing if the commentariat had covered the race as if it were between Burnham and Cooper, only to find Corbyn a shock winner, I doubt it would have served democracy or the Labour party well. Knowing Corbyn could win meant he was scrutinised rather than treated as the traditional left-wing also-ran, and meant Labour members could cast their votes in the knowledge of the effect they might have, and vote tactically for or against a candidate if they wished.
And these are just election polls. To some extent there are other ways of measuring party support, like claimed canvassing returns, or models based upon local by-election results or betting markets. But what about other issues – should we legalise euthanasia? Should we bomb Syria? I might make my living measuring public opinion, but for what it’s worth I’m a grumpy old Burkean who is quite happy for politicians to ignore public opinion and employ their own good judgement. Some people (and some politicians themselves) do think politicians should reflect what their voters want, though, and that means they need some half-decent way of measuring it. That is, unless you’d like them to measure it by who can fill in the most prewritten “write to your MP” letters, turn out the largest number of usual suspects on a march, or manipulate the most clicks on a voodoo poll.
In 2016 we will have the results of the BPC inquiry, we’ll see what different methods the pollsters adopt to address whatever problems are identified, and we’ll have at least three elections (London, Scotland and Wales) and possibly a European referendum to see if they actually work. Strictly speaking those contests won’t tell us whether the polls have solved their problems or not (the polling in Scotland and Wales in May was fine anyway, and referendum polling presents its own unique problems), but we will be judged upon them nevertheless. We shall see how it pans out. In the meantime, despite the many difficulties in getting a representative sample of the British public, I still think those difficulties are surmountable, and that ultimately it’s still worth trying to find out and quantify what the public think.