I thought it a good opportunity to provide a round up of the available evidence we have about what the public think of airstrikes against Islamic State in Syria… and to give a reminder to people of what is NOT good evidence. First, here’s the recent polling evidence:
- Survation/Mirror polled after Cameron’s statement this week, and found 48% agreed with Britain beginning airstrikes against Islamic State alongside France and the US, 30% of people disagreed (tabs here)
- YouGov earlier this week asked if people would approve or disapprove of the RAF taking part in airstrikes against ISIS in Syria. 59% would approve, 20% would disapprove (tabs here).
- ComRes for the Indy on Sunday asked a question on whether people would support Britain taking part in airstrikes against ISIS without UN approval (there wasn’t a parallel version with UN approval). 46% would support airstrikes without UN approval, 32% would not. Tabs are here. A slightly earlier ComRes poll for the Daily Mail asked if people would back British military air strikes against Syria – 60% would, 24% would oppose (tabs here)
- BMG for the Standard asked a question on whether Britain should extend its current airstrikes against ISIS in Iraq to cover Syria as well. This found an even split – 50% thought they should, 50% thought they should not (tabs here)
- I don’t think ICM have asked a direct support/oppose bombing question, but last week they asked a question about Parliamentary consent. It found 46% supported airstrikes if Parliament agreed, 23% supported airstrikes without Parliamentary consent, 12% opposed airstrikes regardless, 19% didn’t know (tabs here)
The precise levels of support differ from poll to poll as they are all asking slightly different questions using slightly different wordings. However, the overall picture is pretty clearly one where the balance of public support is in favour of airstrikes – even the most sceptical finding, from BMG, has people evenly divided. That’s not to say the British public is gung-ho with enthusiasm for conflict: if you look through the rest of those surveys there is plenty of doubt (for example, several polls have found that people think intervening will increase the risk of terrorism here in Britain). On balance, however, public opinion seems to be in favour.
On Twitter and other social media there is lots of sharing of a “poll” by ITV that apparently shows a large majority against Britain taking part. The reason this “poll” gives such sharply different answers is that it is not representative and has no controls upon it. I have written about this many, many times (and, for many decades before I was writing, the great Bob Worcester dutifully fought that same long fight). The sort of open-access polls that used to run on Ceefax and in newspaper phone-ins, and these days pop up at the bottom of newspaper stories and in the sidebars of websites, are completely useless as a way of accurately measuring public opinion.
Opinion polls are meaningful for one reason and one reason alone: because the sample is representative. It has the same proportion of young people and old people as Britain as a whole, the same proportion of rich people and poor people, the same proportion of left-wing and right-wing people… and therefore, it should have the same proportion of people who are anti-bombing and pro-bombing as there are in Britain as a whole. An open-access poll on a website has no such controls.
When a poll is properly done the researcher will use some sort of sampling method that produces a sample that is demographically representative of the country as a whole. Then, when it’s finished, they’ll fine-tune it using weighting to make sure it is representative (e.g. if the proportion of women in the sample is lower than 51% they’ll weight the female respondents up). The people answering the poll will be invited and contacted by the researcher, preventing people or organised campaigns from skewing a poll by deliberately directing lots of people who share their views to fill it in.
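To make the weighting step concrete, here is a minimal sketch of the idea on a single demographic (gender). It is my own illustration with made-up numbers, not any pollster’s actual method – real polls weight on several variables at once – but it shows the mechanics: each respondent in an under-represented group is weighted up so the sample’s mix matches the population’s.

```python
def weighted_support(responses, target_shares):
    """responses: list of (group, supports) tuples, e.g. ("F", True).
    target_shares: population share of each group, e.g. {"F": 0.51, "M": 0.49}.
    Returns the weighted proportion answering 'support'."""
    n = len(responses)
    sample_counts = {}
    for group, _ in responses:
        sample_counts[group] = sample_counts.get(group, 0) + 1
    # Weight = population share / sample share, so each group counts
    # for its true proportion of the population
    weights = {g: target_shares[g] / (sample_counts[g] / n) for g in sample_counts}
    total_weight = sum(weights[g] for g, _ in responses)
    support_weight = sum(weights[g] for g, s in responses if s)
    return support_weight / total_weight

# Hypothetical sample where women are under-represented (40% vs 51% in the population)
sample = ([("F", True)] * 30 + [("F", False)] * 10 +
          [("M", True)] * 24 + [("M", False)] * 36)
print(f"unweighted: {54 / 100:.1%}")
print(f"weighted:   {weighted_support(sample, {'F': 0.51, 'M': 0.49}):.1%}")
```

Because women in this made-up sample are more supportive and under-represented, weighting them up moves the headline figure: the unweighted 54% becomes roughly 58% once the sample is corrected to the population’s gender mix.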
Numbers alone do not make a poll representative. A common error is to see a professionally conducted poll of 1,000 people and a bigger open-access “poll” of 10,000 people and think that the latter is therefore more meaningful. This is wrong – it’s how representative the sample is that matters, not how big it is. The classic example of this is the 1936 US Presidential race, the one that made the reputation of George Gallup. Back then the Literary Digest conducted a mail-in poll with a sample size of 2.3 million people; Gallup conducted a normal-sized professional poll. The Digest’s poll predicted that Alf Landon would easily win the election; Gallup correctly predicted that Roosevelt would win a crushing landslide. The problem was that while the Literary Digest’s poll had a vast sample (probably the biggest sample of any opinion poll, ever) it wasn’t representative, it was skewed towards richer people who were more likely to vote Republican. Gallup’s sample was tiny compared to his competitor’s, but it had proper controls and was properly representative.
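The Literary Digest lesson can be shown with a toy simulation. The numbers below are assumptions for illustration, not the real 1936 data: suppose 40% of the population is well-off (of whom 60% back Landon) and 60% is not (of whom 25% do), so Landon’s true share is 39%. A huge poll that over-samples the well-off gets the answer badly wrong; a small poll that matches the population mix gets it roughly right.

```python
import random

random.seed(1936)

def draw_sample(n, share_well_off):
    """Simulate n respondents; each is well-off with the given probability.
    Well-off respondents back Landon 60% of the time, others 25% (assumed figures).
    Returns the Landon share in the sample."""
    landon_votes = 0
    for _ in range(n):
        well_off = random.random() < share_well_off
        p_landon = 0.60 if well_off else 0.25
        landon_votes += random.random() < p_landon
    return landon_votes / n

huge_biased = draw_sample(200_000, 0.90)  # vast sample, skewed towards the well-off
small_fair = draw_sample(1_000, 0.40)     # small sample, matches the population mix

print(f"huge biased poll:  Landon {huge_biased:.1%}")  # wrongly predicts a Landon win
print(f"small fair poll:   Landon {small_fair:.1%}")   # close to the true 39%
```

The biased sample of 200,000 lands around 57% for Landon; the representative sample of 1,000 lands near the true 39%. Two hundred times the sample size, and a far worse answer.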
Unlike the polls by ComRes, ICM, Survation and YouGov, the ITV “poll” won’t have controls to make sure the sample is representative of the British public – indeed, they don’t even collect any demographics to see whether it is or not. There is nothing stopping organised campaigns from seeking to influence an open poll – for example, StoptheWar could’ve sent an email out to their mailing list encouraging them all to fill it in. There is nothing stopping anyone with the wherewithal to delete a cookie from their computer from voting many, many times. It is, in short, meaningless.
Following May’s general election, the properly conducted polls got something wrong too, of course – but that’s a reason to be wary of even properly conducted polls, not a reason to suddenly put trust in “polls” that don’t even attempt to do things properly.
“Polls” like this, which Bob Worcester christened “voodoo polls”, are a blight on journalism and opinion research. Because to the casual observer they can’t easily be distinguished from a properly conducted poll, they mislead readers into thinking they are meaningful. I assume newspaper websites use them because they drive traffic, but serving up “news” that any journalist with a basic grasp of stats should know is meaningless – or in this case, actively wrong – is doing a disservice to their readers. At least when the BBC do things like this, caveats are normally added saying the “poll” was self-selecting and isn’t representative. ITV don’t caveat it at all, so who can blame all the excited anti-airstrikes people on social media for thinking it means something and jumping upon it? I’m sorry, but it doesn’t – properly conducted polling suggests the public split broadly in favour.
Of course, none of this means that it is necessarily correct for Britain to take part in airstrikes. Polls are not a magic 8 ball where you ask the public and they spit out the “correct” answer. Public opinion can be wrong, and often is. The evidence is that public opinion favours bombing ISIS in Syria, it doesn’t necessarily follow that the government or opposition should do so.