Over on the British Polling Council’s website there is a guide for journalists writing about opinion polls, written by YouGov’s Peter Kellner. The key point is number 13, advising journalists on what to look for in a poll – was it conducted by a reputable agency? What was the sample size? Was it properly sampled? Who was it conducted FOR? Peter writes:
In either event, watch out for loaded questions and selective findings, designed to bolster the view of the client, rather than report public opinion fully and objectively.
Now, a common criticism of polls is that they get the answers the company commissioning them wants. I consistently argue against this accusation – it is the duty of a professional pollster to measure and report public opinion as it actually is, not as we would like it to be, and to craft questions that are fair, unbiased and accurately reflect public opinion.
Of course, most people could come up with examples of questions that could have been better – I certainly could. I would contend these are cock-up rather than conspiracy, simply because of pollsters’ professional dignity. A Bob Worcester, a Peter Kellner, an Andrew Cooper or a Martin Boon simply wouldn’t sign off a question they thought was biased (and, I hasten to add, neither would an Anthony Wells).
So why does the person who commissioned the poll still matter? Well, because they choose what the questions are asked about. The pollster should ensure whatever questions are asked are unbiased, but it’s the client who chooses what areas to ask about, and few clients commission polls they expect to damage their case. Hence, a client campaigning for tougher sentencing might commission a poll asking if people want to see longer prison sentences (since they do). A client who supports sentencing reform, however, might commission a poll asking if people think prison is effective at reforming criminals (as they don’t). The broader picture is that the public support long sentences, despite not thinking prison is particularly good at reforming or rehabilitating criminals – they like it for retributive reasons. However, if you saw only the questions commissioned by the imaginary pro-prison client, or only the questions commissioned by the imaginary anti-prison client, you wouldn’t know that; you’d only get one side of the story.
There are two lessons to take away. One, look at polls in the round, not in isolation. Don’t cherry-pick those that tell you what you want to hear and ignore the others; the differences between them tell a story. Two, be careful with polls commissioned by partisan campaigns – if the pollsters are doing their job properly the questions will be fair and balanced, but they won’t necessarily look at the issue from all sides. Ask yourself what questions were not asked – if a poll is on the subject of a policy, proposal or suggestion and doesn’t ask whether or not people actually support that policy, ask yourself why they didn’t commission that question. Sometimes it might just be because there are a million existing polls on the subject. Sometimes it might be because they didn’t think they’d like the answer…