The Guardian’s front page story on the NHS reports the findings of a voodoo poll:

“More than 90% of those who voted in a British Medical Journal poll believed the planned health reforms should be scrapped. Of 2,947 votes cast on bmj.com over the last week, 2,706 said the reforms should be dropped while 241 said they should stay”

The story does at least stop short of claiming this is specifically representative of anything, but the very fact it is reported carries the implication that it is in some way meaningful or representative of BMJ readers or people involved in the medical profession (in that sense the Guardian’s report is less bad than the PA copy, which presented the figures as being representative of BMJ readers). This was not, however, a poll in any meaningful sense, but an open-access click-button question on their website.

As ever, such open access polls are not properly weighted or sampled and are very easily fixed by people distributing the link to others to encourage them to vote… such as, erm, Guardian star-columnist Polly Toynbee here.

If you are a journalist reading this I again implore you to read this guidance from the British Polling Council on how journalists should report polls, particularly Q.13 on how to tell whether a poll is worth taking seriously or not.

In this particular circumstance the finding isn’t grossly misleading as there is good evidence to suggest NHS employees do indeed oppose the reforms (see, for example, this YouGov poll of NHS employees for 38 Degrees), but in a way that makes it even worse – reporting worthless findings when there are properly conducted ones out there.


Voodoo polling corner

The Press Association are reporting that “The majority of people from across the political spectrum believe Scotland should be responsible for raising most of the money it spends, according to research from an independent think-tank.”

Because it is on the Press Association feed, this is then repeated verbatim by various other newspaper websites here, here, here, etc, etc, all labouring under the misapprehension that because the Press Association reports a poll it is meaningful. They are wrong.

The “poll” was conducted by Reform Scotland, a think tank that published proposals for devolution plus earlier this year. The “sample” was drawn from people on Reform Scotland’s mailing list or following them on Twitter. Needless to say, this is not a method likely to provide a representative sample of the Scottish public as a whole.

I hate to write as if addressing morons, but sadly it sometimes appears as if it is necessary. People who have signed up to follow a think tank that has proposed a devolution plus plan are, firstly, far more likely to be interested in politics (the vast majority of normal people are not on the distribution lists for think tanks!) and secondly, likely to be predisposed towards further devolution of power to Scotland (for what it’s worth, the sample is also three-quarters male, only 10% aged 65 or over, and has more Tory identifiers than Labour ones).

To give them some credit, Reform Scotland themselves haven’t claimed it is a representative poll, saying “We do not claim that this poll is totally scientific as it was self selecting. However, the responses, particularly those broken down by party affiliation, are very interesting, in particular”. Alas, the reality is that these caveats never get picked up by journalists, and such surveys inevitably end up being misreported as representative meaningful polls. For the record, the party breakdowns are not of any meaning either, since in the same way the poll overall will be grossly biased towards people with an interest in Scottish politics and a predisposition towards greater devolution, so will each of the party crossbreaks (i.e. the Labour voters in the sample will be more political and more in favour of further devolution than the average Labour voter, ditto other parties. They are also grossly demographically skewed towards younger men, and apart from the SNP have sample sizes under 100).

Over on the British Polling Council’s website there is an article written by Peter Kellner several years ago titled “A Journalist’s Guide To Opinion Polls”. Amongst other things, it gives guidance to journalists on when to take a poll seriously, and when to bin it. It is still flawless advice today:

“If the poll purports to be of the public as a whole (or a significant group of the public), has the polling company employed one of the methods outlined in points 2,3 and 4 above [quasi-random or quota sampling]? If the poll was self-selecting — such as readers of a newspaper or magazine, or television viewers writing, telephoning, emailing or texting in — then it should NEVER be presented as a representative survey.”



Voodoo poll update

Sometimes voodoo polls are so blatantly idiotic it feels almost superfluous to point out they are worthless. Surely no one, no one at all, could mistake them as legitimate measures of public opinion. On one level that’s probably right, but on the other hand, staying silent just encourages them.

Once upon a time lots of newspapers did silly voodoo polls, but over the years they have faded in prominence – probably because for a while they were truly ubiquitous, with every TV channel and newspaper website having silly “press the red button” or “ring this number” surveys which eventually bored everyone into submission. More positively, when the media does use them these days they do at least normally refer to them in a responsible manner – putting in appropriate caveats about them not being representative or referring to them showing x number of their readers think, rather than projecting actual results from it. I’d still rather they didn’t exist at all – since many people don’t realise the difference between properly conducted polls and voodoo polls they damage the whole reputation of the market research industry – but publishing them with caveats is better than without.

Nevertheless, when they turn up in massive font on the front page of a national newspaper claiming to be meaningful they demand appropriate mockery. Today the Express reports that “An exclusive poll conducted on the first day of our crusade showed an astonishing 99 per cent of people agree we should quit the European Union.”

It would be astonishing were it a proper measure of public opinion but, of course, it wasn’t. It is a result of inviting Express readers to phone one of two premium rate phone numbers to say whether or not they think Britain should leave Europe, advertised in the middle of a two page spread about how awful Europe is.

Obviously the context of the question is extremely skewed, and the sample will be made up exclusively of people who read stories about Europe in the Daily Express (or people to whom the phone number was subsequently passed on) and who care enough about the issue to waste their money phoning up to vote. There is unlikely to be any attempt to properly sample or weight the data, nor any protection against multiple voting, nor against pressure groups organising people to ring up en masse. Yes, in this case it’s blindingly obvious that the poll is bunkum – but do remember the same caveats apply to all other polls that don’t take appropriate sampling or weighting measures to obtain a representative sample.

Properly conducted opinion polls on the subject of Europe show varying levels of support for leaving the EU – if you give people a straight option of saying whether or not they think Britain should withdraw though, support and opposition tend to be pretty even.

I suspect the Express didn’t find the results that astonishing anyway, since almost all of their own phone “polls” find 99% or so of respondents agree with the more reactionary option. There’s a fantastic archive of Daily Express “polls” on their website here, including such astonishing findings as 98% of respondents agreeing that “This Labour government wrecked the NHS”, 99% thinking “Labour’s treatment of the elderly a disgrace”, 98% thinking it is time to “ban immigrants” and 99% saying they are fed up with ditching British traditions. Funny that.


Many years ago Teletext (those of you over 25 will remember it) used to have phone-in polls on issues of the day. On occasion they would ask voting intention, and it would invariably show the Conservatives on about 80% of the vote even in the midst of Blair’s greatest popularity – presumably because only elderly Tory voters bothered to ring into Teletext polls.

I was rather reminded of it by this from Sky News. Conducted on their own panel it has voting intentions of Conservative 43%, Labour 24%, Liberal Democrats 8% – repercentaged to exclude don’t knows and wouldn’t votes, it works out at CON 50%, LAB 28%, LDEM 10%, Others 13% – so while reputable pollsters are showing a Conservative lead of between 2 and 6 points, Sky’s panel are showing a lead of 22 points. That rings alarm bells to say the least.
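The repercentaging step is just arithmetic: drop the don’t knows and wouldn’t-votes, then rescale the remaining shares so they sum to 100. A minimal sketch in Python (the 11% for Others and the 14% excluded group are assumptions for illustration, since only the headline shares are quoted, so the rounded figures come out very close to, but not exactly, those above):

```python
def repercentage(shares):
    """Rescale vote shares so they sum to 100 after dropping non-voters."""
    total = sum(shares.values())
    return {party: round(100 * share / total) for party, share in shares.items()}

# Sky's headline figures, with an assumed 11% for Others so that voters
# sum to 86% (the remaining 14% being don't knows / wouldn't votes)
raw = {"CON": 43, "LAB": 24, "LDEM": 8, "OTH": 11}
print(repercentage(raw))  # → {'CON': 50, 'LAB': 28, 'LDEM': 9, 'OTH': 13}
```

Small differences from the published figures (e.g. LDEM 9 vs 10) would come from rounding in the raw shares quoted by Sky.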

This isn’t actually a voodoo poll in the purest sense, it was conducted using a panel, rather than an open access “red button” poll (although there is no indication of whether there was an attempt to draw a representative sample from within the wider panel) – but the sample looks very ropey and there is no apparent attempt at proper political weighting. There are sparse demographic details in the results, but the 2010 recalled vote break shows 44% of the sample voted Tory, compared to 17% Labour and 18% Lib Dem. For context, established polling companies like ICM weight their polls so that 25% of the sample is people who voted Tory in 2010, 21% Labour and 16% Lib Dem.

You sometimes get fun little red button polls on media websites, but they normally come with disclaimers that they are not properly representative polls. In contrast, Sky have it as the headline on their website, liberally sprinkled with quotes from their Chief Political Correspondent Jon Craig about what it would mean if repeated at a general election. Sigh.

Ignore (and for journalists out there, this summary by Peter Kellner from the BPC website about when to pay attention to a poll is always worth revisiting).

UPDATE: Jon Craig’s blog here at least starts by acknowledging “Now I know the sniffy ones among you – yes, you know who you are – will say it’s not a wholly scientific, weighted opinion poll and all that.” On one hand, I’m pleased he’s added the caveat. On the other hand, one is rather tempted to reply that you shouldn’t bloody publish it then. Wanting polls to be scientifically weighted is not some odd personal fetish or the pedantry of pollsters and statisticians in ivory towers; all that makes a poll meaningful is that it is representative of the wider population, through proper weighting and/or sampling. A poll that doesn’t do that is just the views of an arbitrary 1500 people, who do not necessarily represent anyone but themselves.


A lot of comments here citing various voodoo polls from newspaper websites. Ignore them – they reflect the party allegiances of that website’s audience (the Guardian’s web poll has Brown second – shock! The Daily Mail has Cameron first – wow!), do not attempt to be politically balanced or representative and are easy for interested parties to manipulate.

There were two properly conducted instant polls following the debate, carried out by YouGov and ComRes. Both show Nick Clegg winning, Cameron second and Brown last (YouGov has NC 51, DC 29, GB 19. ComRes has NC 46, DC 26, GB 20). Angus Reid are also doing some polling, but it seems to be live overnight with final figures tomorrow.

The questions now are who people think is the winner once it has been filtered through the media tomorrow and, more importantly, what impact that has on voting intentions – how much of a boost will the Lib Dems get? Our first chance of getting a really good idea of that will probably be the polls on Saturday or Sunday.

UPDATE:
There was also an instant Populus poll for the Times. Once again, it showed Clegg winning (though by an even larger margin than the other polls!), followed by Cameron then Brown – figures were Clegg 61%, Cameron 22%, Brown 17%.