I’m writing a longer piece rounding up public opinion towards Brexit, but rather than let that get dominated by a big rant, I thought I’d write a separate piece about self-selecting newspaper polls apparently showing changing attitudes to Brexit. Several newspapers in the North East and the Black Country have run open-access “polls” on their websites that have had more responses supporting Remain than Leave. These have been widely picked up on social media by people who support Britain remaining in the EU, who have given them rather more weight than they probably would give to most open-access polls on newspaper websites.
I have spent a decade making the same point: self-selecting polls – phone-in polls in papers, press-the-red-button polls on TV, click-here-to-vote boxes at the bottom of newspaper articles – are bunkum. That venerable old sage Bob Worcester was doing it for decades before me, coining the term “voodoo polls” to describe them.
However, they keep on coming back. This time it’s a little different because of the people doing it. I keep seeing otherwise intelligent and learned people on social media quoting them. Even people who recognise that they are not a robust way of measuring public opinion sometimes suggest they must mean “something”.
They really don’t.
To roll over the old arguments again…
1) Polls are only meaningful if they are representative, and self-selecting polls are not. To be meaningful, a sample needs to mirror whatever population it is trying to measure. If it has the same gender balance, age balance, class balance, etc, then it should have the same balance of opinion. Properly conducted polls ensure this is the case through sampling (normally using randomisation and/or demographic quotas) and weighting (adjusting the sample after fieldwork to ensure it does indeed have the right number of men and women, old and young, rich and poor). Self-selecting website and newspaper polls do not do this.
More specifically, if you look at the professionally conducted polls asking how people would vote in a referendum now, they are almost all weighted to ensure they are representative in terms of how people voted last time. For example, look at the ComRes/CNN poll, the most recent poll to ask how people would vote in a referendum today. The first question ComRes actually asked was how people voted back in June; this ensured the sample was representative – that it matched the actual result and was not skewed towards people who voted Remain in June or people who voted Leave in June. This means we know ComRes’s sample accurately reflects the British population, and if it had shown people would vote differently now, we would have known there had been a genuine change in opinion. Without controls on this, without an attempt to make sure the sample is representative, it could just be the case that a sample is full of people who voted Remain in June to begin with.
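To illustrate the kind of past-vote weighting described above, here is a minimal sketch with made-up numbers (this is not ComRes’s actual weighting scheme, which is more elaborate – it is just the basic idea of scaling a skewed sample back to the real referendum result):

```python
# Hypothetical raw sample of 1,000 that over-represents Remain voters,
# weighted so recalled 2016 vote matches the actual result (Leave 52%,
# Remain 48%). Illustrative numbers only, not any pollster's real data.

raw_sample = {"Leave": 440, "Remain": 560}   # hypothetical skewed sample
targets = {"Leave": 0.52, "Remain": 0.48}    # actual June 2016 result

n = sum(raw_sample.values())

# Each respondent in a group gets the same weight, chosen so the
# weighted group totals match the target shares.
weights = {group: targets[group] * n / count
           for group, count in raw_sample.items()}

weighted_totals = {group: raw_sample[group] * weights[group]
                   for group in raw_sample}

print(weights)          # Leave voters weighted up, Remain voters down
print(weighted_totals)  # weighted sample now splits 520 / 480
```

Any opinion question asked of this sample would then be tabulated using those weights, so the answers reflect a politically representative electorate rather than whoever happened to be sampled.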
2) Getting a large number of responses does not make a poll meaningful. Self-selecting “polls” often trumpet a large sample size as a way of suggesting their data have some validity. This is false. If you conducted a poll entirely of, say, Guardian readers, then sheer numbers would never make it representative of the whole country. This was famously tested to destruction in 1936. Back then the best known “pollster” was the Literary Digest magazine, which conducted mail-in surveys. They sent out tens of millions of surveys based on subscription directories, phone directories and so on, and received literally millions of responses. Based on those they confidently predicted that Alf Landon would win the 1936 Presidential election. George Gallup, the pioneer of modern polling methods, had a far smaller sample, but made it properly representative – as a result he accurately predicted Roosevelt’s landslide victory. It was the birth of modern polling but, alas, not the death of newspapers conducting voodoo polls.
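The Literary Digest lesson can be shown with a toy simulation (all numbers here are invented for illustration): when one side is more likely to respond, a huge sample converges confidently on the wrong answer, while a far smaller representative sample lands near the truth.

```python
import random

random.seed(1)

# Invented population: 55% support candidate A.
TRUE_SHARE = 0.55

def person_supports_a():
    return random.random() < TRUE_SHARE

def biased_poll(n_invited):
    """Self-selecting poll: B supporters are twice as likely to respond."""
    responses = []
    for _ in range(n_invited):
        supports = person_supports_a()
        respond_prob = 0.1 if supports else 0.2  # non-response bias
        if random.random() < respond_prob:
            responses.append(supports)
    return sum(responses) / len(responses)

def representative_poll(n):
    """Everyone is equally likely to be sampled."""
    return sum(person_supports_a() for _ in range(n)) / n

biased_est = biased_poll(200_000)       # tens of thousands of responses
rep_est = representative_poll(2_000)    # small but representative

print(f"Biased mega-poll estimate:    {biased_est:.1%}")
print(f"Small representative sample:  {rep_est:.1%}")
```

The biased poll ends up far below the true 55% no matter how many invitations go out, because extra responses just pile up the same skew; the representative sample is noisier but centred on the right answer.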
3) Self-selecting polls reflect the views of those who are interested and have an axe to grind. Any poll where people can choose to take part, rather than the polling company controlling who is invited, will attract those people with strong views on the subject and under-represent those with only a limited interest (indeed, this is a problem that even proper polls are not immune to, given low response rates). It probably explains some of the apparent shift from the pre-referendum newspaper “polls” to the current ones – before the referendum it was angry Leave supporters with the motivation to take part in “polls” on obscure newspaper websites; now the Leave supporters have what they want, and it is the frustrated Remainers hammering away on website “polls”, grasping for evidence that public opinion has moved their way. To measure public opinion properly, you need to represent all the public, not just those who are the most fired up.
4) Self-selecting polls can be orchestrated and fixed. Open-access polls don’t usually have any limits on who can take part (do you actually live in the Black Country?) or any method of preventing multiple votes. They can never stop people sharing or distributing the link to like-minded people to encourage them to vote (indeed, given they often exist purely as clickbait, they are very much intended to be distributed in that way). It doesn’t have to be an organised attempt – it can just be people sharing links on social media with a “vote here and show people that not everyone wants Brexit”. Again, properly controlled polls have measures preventing such manipulation or skews.
Almost no new information is entirely useless. If we lived in an information vacuum then these sorts of things would perhaps be an interesting straw in the wind, a pointer to something that might be worth proper investigation. We do not live in an information vacuum though – there have been numerous properly conducted opinion polls using properly representative samples over the last six months (I will write something on them later, but in the meantime John Curtice has collected them at the beginning of this article), and these have painted a different picture. Properly conducted polls in recent months have consistently shown very little net movement in whether the British public want to leave or remain in the EU.
There are only three obvious ways of resolving the conflict between the picture painted by professionally conducted national polls and the self-selecting website polls. One, that there has been a vast shift of opinion in favour of Remain in the North East and the Black Country, but it has been balanced out by a big shift in the other direction elsewhere in the country, so at a national level it evens out. Two, that professionally conducted polls with representative samples (polls that even on their very worst days still end up within a handful of percentage points of the actual result) have somehow completely missed a 45% swing in public attitudes to Brexit. Or three, that open-access polls on newspaper websites that let any old bugger vote multiple times, without any attempt to get a sample that is politically or demographically representative, are an utterly useless way of measuring opinion.
I know it’s the third explanation, and deep in their hearts, I think most of those people sharing them know it’s the third explanation too. The kindest advice I can give to those who would like Britain to remain in the EU is that they need to change public opinion, not grasp at voodoo polls to kid themselves that it already agrees with them.