I thought it a good opportunity to provide a round up of the available evidence we have about what the public think of airstrikes against Islamic State in Syria… and to give a reminder to people of what is NOT good evidence. First, here’s the recent polling evidence:

  • Survation/Mirror polled after Cameron’s statement this week, and found 48% agreed with Britain beginning airstrikes against Islamic State alongside France and the US, 30% of people disagreed (tabs here)
  • YouGov earlier this week asked if people would approve or disapprove of the RAF taking part in airstrikes against ISIS in Syria. 59% would approve, 20% would disapprove (tabs here).
  • ComRes for the Indy on Sunday asked a question on whether people would support Britain taking part in airstrikes against ISIS without UN approval (there wasn’t a parallel version with UN approval). 46% would support airstrikes without UN approval, 32% would not. Tabs are here. A slightly earlier ComRes poll for the Daily Mail asked if people would back British military air strikes against Syria – 60% would, 24% would oppose (tabs here)
  • BMG for the Standard asked a question on whether Britain should extend its current airstrikes against ISIS in Iraq to cover Syria as well. This found an even split – 50% thought they should, 50% thought they should not (tabs here)
  • I don’t think ICM have asked a direct support/oppose bombing question, but last week they asked a question about Parliamentary consent. It found 46% supported airstrikes if Parliament agreed, 23% supported airstrikes without Parliamentary consent, 12% opposed airstrikes regardless, 19% didn’t know (tabs here)

The precise levels of support differ from poll to poll because they are all asking slightly different questions with slightly different wordings. However, the overall picture is pretty clearly one where the balance of public support is in favour of airstrikes – even the most sceptical finding, from BMG, has people evenly divided. That’s not to say British public opinion is gung-ho for conflict: look through the rest of those surveys and there is plenty of doubt (for example, several polls have found that people think intervening will increase the risk of terrorism here in Britain). On balance, however, public opinion seems to be in favour.

On Twitter and other social media there has been lots of sharing of a “poll” by ITV that apparently shows a large majority against Britain taking part. The reason this “poll” gives such sharply different answers is that it is not representative and has no controls upon it. I have written about this many, many times (and for many decades before I was writing, the great Bob Worcester dutifully fought the same long fight). The sort of open-access polls that used to appear on Ceefax and in newspaper phone-ins, and these days pop up at the bottom of newspaper stories and in the sidebars of websites, are completely useless as a way of accurately measuring public opinion.

Opinion polls are meaningful for one reason and one reason alone: the sample is representative. It has the same proportion of young people and old people as Britain as a whole, the same proportion of rich people and poor people as Britain as a whole, the same proportion of left-wing and right-wing people… and therefore it should also have the same proportion of anti-bombing and pro-bombing people as Britain as a whole. An open-access poll on a website has no such controls.

When a poll is properly done the researcher will use some sort of sampling method that produces a sample that is demographically representative of the country as a whole. Then, when fieldwork is finished, they will fine-tune it using weighting to make sure it is representative (e.g. if the proportion of women in the sample is lower than 51%, they will weight the female respondents up). The people answering the poll are invited and contacted by the researcher, which stops individuals or organised campaigns skewing a poll by deliberately directing lots of people who share their views to fill it in.
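For anyone curious about the mechanics, here is a minimal sketch of how that kind of demographic weighting works, using made-up figures (a 45% female sample weighted up to a 51% population share) rather than any pollster’s actual scheme:

```python
# Minimal sketch of demographic weighting with made-up numbers - not any
# pollster's actual weighting scheme.

population_share = {"female": 0.51, "male": 0.49}  # known population profile
sample_share     = {"female": 0.45, "male": 0.55}  # what the sample contains

# Each respondent gets the ratio of their group's population share to its
# sample share, so under-represented groups count for slightly more.
weights = {group: population_share[group] / sample_share[group]
           for group in population_share}

def weighted_support(respondents):
    """respondents: list of (group, supports_airstrikes) tuples."""
    total = sum(weights[group] for group, _ in respondents)
    in_favour = sum(weights[group] for group, supports in respondents if supports)
    return in_favour / total

print(weights)  # {'female': ~1.13, 'male': ~0.89}
print(weighted_support([("female", True), ("male", False), ("male", True)]))
```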

Numbers alone do not make a poll representative. A common error is to see a professionally conducted poll of 1,000 people and a bigger open-access “poll” of 10,000 people and assume the latter is more meaningful. This is wrong – it is how representative the sample is that matters, not how big it is. The classic example is the 1936 US Presidential race, the one that made the reputation of George Gallup. Back then the Literary Digest conducted a mail-in poll with a sample of 2.4 million people, while Gallup conducted a normal-sized professional poll. The Digest’s poll predicted that Alf Landon would easily win the election; Gallup correctly predicted that Roosevelt would win a crushing landslide. The problem was that while the Literary Digest’s poll had a vast sample (probably the biggest sample of any opinion poll, ever) it wasn’t representative – it was skewed towards richer people, who were more likely to vote Republican. Gallup’s sample was tiny compared to his competitor’s, but it had proper controls and was properly representative.
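To make the same point in miniature, here is a toy simulation (invented numbers, nothing to do with the Digest’s actual data): a huge sample drawn only from a skewed subgroup misses the true figure badly, while a small random sample lands close to it:

```python
import random

random.seed(0)

# Toy population of 1,000,000: 55% support candidate A overall, but the
# richest 30% (the only people a skewed mail-out reaches) back A far less.
def person():
    rich = random.random() < 0.30
    supports_a = random.random() < (0.35 if rich else 0.636)  # ~55% overall
    return rich, supports_a

population = [person() for _ in range(1_000_000)]

# Huge but skewed "sample": every rich person, nobody else (~300,000 people).
skewed = [supports for rich, supports in population if rich]
# Small but properly random sample of 1,000 people.
random_sample = [supports for _, supports in random.sample(population, 1_000)]

print(f"Skewed sample of {len(skewed):,}: {sum(skewed) / len(skewed):.1%} for A")
print(f"Random sample of {len(random_sample):,}: "
      f"{sum(random_sample) / len(random_sample):.1%} for A")
# The skewed sample reports ~35% for A despite its size; the random
# sample of 1,000 comes out close to the true 55%.
```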

Unlike the polls by ComRes, ICM, Survation and YouGov, the ITV “poll” has no controls to make sure the sample is representative of the British public – indeed, it doesn’t even collect any demographics to check whether it is or not. There is nothing stopping organised campaigns seeking to influence an open poll – for example, Stop the War could have sent an email out to their mailing list encouraging them all to fill it in. There is nothing stopping anyone with the wherewithal to delete a cookie from their computer voting many, many times. It is, in short, meaningless.

Back in May the properly conducted polls got something wrong too, of course – but that’s a reason to be wary of even properly conducted polls, not a reason to suddenly put trust in “polls” that don’t even attempt to do things properly.

“Polls” like this, which Bob Worcester christened “Voodoo polls”, are a blight on journalism and opinion research. Because to the casual observer they cannot easily be distinguished from a properly conducted poll, they mislead readers into thinking they are meaningful. I assume newspaper websites use them because they drive traffic, but serving up “news” that any journalist with a basic grasp of stats should know is meaningless – or, in this case, actively wrong – is doing a disservice to their readers. At least when the BBC do things like this, caveats are normally added saying the “poll” was self-selecting and isn’t representative. ITV don’t caveat it at all, so who can blame all the excited anti-airstrikes people on social media for thinking it means something and jumping upon it? I’m sorry, but it doesn’t – properly conducted polling suggests the public split broadly in favour.

Of course, none of this means that it is necessarily correct for Britain to take part in airstrikes. Polls are not a magic 8 ball where you ask the public and they spit out the “correct” answer. Public opinion can be wrong, and often is. The evidence is that public opinion favours bombing ISIS in Syria; it doesn’t necessarily follow that the government or opposition should do so.

Voodoo polling corner

Back in 2012 I wrote about the Observer reporting an open-access poll on a website campaigning against the government’s health bill as if it was representative of members of the Royal College of Physicians. I also wrote to the Observer’s readers’ editor, Stephen Pritchard, who wrote this article about it.

The Guardian today is making the same error – they have an article claiming that seven out of ten junior doctors will leave the profession if the new junior doctors’ contract goes through. The headline presents it as representative of all junior doctors, and it is referred to as a poll and a survey in the first two paragraphs. Only in the final, seventeenth paragraph is it revealed that it wasn’t conducted by any reputable market research organisation, but was a self-conducted survey of members of a Facebook group, the Junior Doctors Contract Forum, which is campaigning against the new contract (the Telegraph had a similar article earlier this month that appears to be based on the same data).

We cannot tell whether efforts were made to limit the poll to actual doctors or to make it representative of junior doctors in terms of career stage, age, region and so on – it doesn’t really matter, as it is fatally undermined by being conducted in a forum campaigning against the contract. It would be like conducting a poll on fox hunting in the Countryside Alliance’s Facebook group and presenting it as representative of the countryside’s views on fox hunting. The flaw should be screamingly obvious.

Questions along the lines of “If the thing you oppose happens, will you do X?” are extremely dubious anyway. The problem is that respondents to opinion polls are not lab rats; they are human beings who use polls to express their opinion, even when that is not exactly what the question asks. From a respondent’s point of view, if you are filling in a survey about something you oppose, you are likely to give the answers that most effectively express your opposition. Faced with a question like this, it is far more effective to say you might leave your job if your contract is changed than to say you would meekly accept it and carry on as usual.

We see this again and again in polls seeking to measure the impact of policies. For example, before tuition fees were increased there were lots of polls claiming to show how many young people would be put off going to university by increased fees (such as here and here). After the rise, they miraculously continued to apply anyway. Nobody wants to tell a pollster that they would just swallow the thing they oppose.

I don’t doubt that many or most junior doctors are unhappy with the new contract, but you can’t get a representative poll by surveying campaigning groups, and you shouldn’t necessarily believe people telling pollsters about the awful consequences that will follow if something they don’t like happens. It’s a lot easier to threaten a pollster that you’ll resign from your job than it is to actually do it.

UPDATE: While I’m here in voodoo polling corner, I should also highlight this cracking example of a voodoo poll in the Daily Mail. It claims “One in three women admit they watch porn at least once a week”… but it appears to be an open-access poll of Marie Claire readers, and it is certainly in no way representative of all women in terms of things like age. It contains the delightful line that “Out of the more than 3,000 women surveyed, 91 per cent of the survey’s respondents identify as female, eight per cent identify as men and one per cent is transgender.” I don’t know how to break it to them, but you probably can’t include the 8% who are men in a survey of 3,000 women.


I hope most of my regular readers would assume a Daily Express headline about a “poll” showing 80% of people want to leave the EU was nonsense anyway, but it’s a new year and a new election campaign, so it’s probably worth writing again about why these things are worthless and misleading as measures of public opinion. If nothing else, it will provide an explanation to point rather overexcited people on Twitter towards.

The Express headline is “80% want to quit the EU, Biggest poll in 40 years boosts Daily Express crusade”. This doesn’t actually refer to a sampled and weighted opinion poll, but to a campaign run by two Tory MPs (Peter Bone and Philip Hollobone) and a Tory candidate (Thomas Pursglove) consisting of them delivering their own ballot papers to houses in their constituencies. They apparently got about 14,000 responses, which is impressive as a campaigning exercise, but doesn’t suddenly make it a meaningful measure of public opinion.

Polls are meaningful only to the extent that they are representative of the wider public – if they contain the same proportions of people of different ages, of men and women, of different social classes and incomes, and from different parts of the country as the population as a whole, then we can hope they will also hold the same views as the population as a whole. Just getting a lot of people to take part does not in any way guarantee that the people who end up taking the poll will be representative.

I expect lots of people who aren’t familiar with how polling works will see a claim like this, see that 14,000 people took part, and think it must therefore be meaningful (in the same way, a naive criticism of polls is often that they interview only 1,000 people). The best illustration of why this doesn’t work is the polling for the 1936 Presidential election in the USA, which heralded modern polling and tested big sample sizes to destruction. Back then the best-known poll was the one done by a magazine, the Literary Digest. The Literary Digest too sent out ballot papers to as many people as it could – to its subscribers, to other subscription lists, to everyone in the phone directory, to everyone with a car, and so on. In 1936 it sent out 10 million ballot papers and received 2.4 million responses. Based on these replies, it confidently predicted that the Republican candidate Alf Landon would win the election. Meanwhile the then little-known George Gallup interviewed just a few thousand people, but used proper demographic quotas to get a sample that was representative of the American public. Gallup’s data predicted a landslide win for the Democratic candidate Franklin D Roosevelt. Gallup was of course right, and the Literary Digest embarrassingly wrong. The reason was that the Literary Digest’s huge sample of 2.4 million was drawn from the sort of people who had telephones, cars and magazine subscriptions, and in Depression-era America those people voted Republican.

Coming back to the Express’s “poll”, a campaign about leaving Europe run by three Tory election candidates in the East Midlands is likely to be answered largely by Conservative sympathisers with strong views about Europe, hence the result. Luckily we have lots of properly conducted polls that are sampled and weighted to be representative of the whole British public, and they consistently show a different picture. There are some differences between companies – YouGov ask the question a couple of times a month and find support for leaving the EU varying between 37% and 44%, Survation asked a couple of months ago and found support for leaving at 47%, and Opinium have shown it as high as 48%. For those still entranced by large sample sizes, Lord Ashcroft did a poll of 20,000 people on the subject of Europe last year (strangely, larger than the Express’s “biggest poll in 40 years”!) and found people splitting down the middle: 41% stay, 41% leave.

And that’s about where we are – there is some difference between pollsters, but the broad picture is that the British public are NOT overwhelmingly in favour of leaving the EU; they are pretty evenly divided over whether or not to stay in the European Union.

Compare and contrast

A couple of weeks ago someone sent me a link to a “poll” in the Tab (which, one understands, is some form of newspaper for students) that claimed to show Conservatives were in the lead amongst students. Nonsense of course, it was an open access voodoo poll with no attempt to get a meaningful or representative sample (hell, 10% of the sample were Cambridge students!). Of course, it was only a poll in a campus newspaper so I didn’t bother writing rude things about it, the only other media I found foolish enough to cite it were Vice and Breitbart.

Just for the record though, today’s Independent has a properly conducted poll of students by YouthSight (we’ve met them here before, under the name Opinionpanel). This was a panel-based survey of full-time undergraduate students, recruited via UCAS and validated through an ac.uk email address, weighted by type of university (Russell Group, pre-1992, post-1992, specialist), year of study and gender. In contrast to the voodoo poll above, it shows Labour with a solid lead amongst students who say they are likely to vote – Labour 43%, Conservatives 24%, Lib Dems 6%, Greens 14%, UKIP 5%. Compare and contrast.

A couple of years ago I wrote a piece giving advice on how to report opinion polls, or rather, how not to. Look specifically at the third point on being careful of extremely small sample sizes in cross-breaks.

There was a cracking example of the media failing on this front on BBC Look East this week, which has done the rounds on Twitter. The clip is here; the offending portion starts at the three-minute mark. It claims to show the results of a poll of the Eastern region that put UKIP on 44% of the vote.

The figures come from page 36 of this ComRes poll. It wasn’t a bespoke, properly weighted poll of the Eastern region – it’s a cross-break from a normal national poll. The figures are based upon only 58 respondents, giving a margin of error of plus or minus 13 points. The figures are not even accurately quoted: the Lib Dems are actually on 7%. There were no caveats about sample size offered (the YouTube clip from UKIP cuts out suddenly, but at the moment the full programme is on iPlayer). This is truly appalling reporting of polls – there is no way that such a tiny cross-break should be reported out of context as if it were a representative poll.
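For anyone who wants to check the arithmetic, the quoted margin of error is roughly what the standard formula for a proportion gives – a back-of-the-envelope sketch, not ComRes’s own calculation:

```python
import math

# Approximate 95% margin of error for a proportion: 1.96 * sqrt(p * (1 - p) / n).
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(0.44, 58):.1%}")    # ~12.8%, i.e. roughly +/- 13 points
print(f"{margin_of_error(0.44, 1000):.1%}")  # ~3.1% for a typical full national sample
```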