This is an updated version of a post from way back before the 2010 election, which I felt needed another airing. Thankfully comments along these lines don’t turn up very often here, but I see them with depressing frequency on Twitter when poll results are released…
1) The polls are ALL wrong, the real position is obviously X
Er… based on what? The reality is that opinion polling is pretty much the only way of measuring public opinion. We have some straws in the wind from mid-term elections, but those tend to be low-turnout protest votes, are poor predictors of general election results and are in any case quite a long time ago now. Equally, a few people point to local government by-elections, but when compared to general election results these normally grossly overestimate Liberal Democrat support. If you think the polls are wrong just because they “feel” wrong to you, it probably says more about what you would like the result to be than anything about the polls.
2) I speak to lots of people and none of them will vote for X!
Actually, so do pollsters, and unless you regularly travel around the whole country and talk to an exceptionally representative demographic spread of people, they do it better than you do. We all have a tendency to be friends with people with similar beliefs and backgrounds, so it is no surprise that many people have a social circle with largely homogeneous political views. Even if you talk to a lot of strangers about politics, you yourself are probably exerting an interviewer effect in the way you ask.
3) How come I’ve never been invited to take part?
There are about 40 million adults in the UK. Each opinion poll involves about 1,000 people. If you are talking about political voting intention polls, then probably under 100 are conducted by phone each year. You can do the sums – if there are 40,000,000 adults in the UK and about 100,000 of them are interviewed for a political opinion poll each year, then on average you will be interviewed once every 400 years. It may be a long wait.
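For anyone who wants to check the arithmetic, here is a minimal sketch using the rough figures above (the population and poll counts are the post’s ballpark estimates, not official statistics):

```python
adults = 40_000_000          # approximate UK adult population (the post's figure)
polls_per_year = 100         # rough number of phone voting-intention polls a year
sample_size = 1_000          # respondents per poll

interviews_per_year = polls_per_year * sample_size   # 100,000
years_between_interviews = adults / interviews_per_year

print(f"Expected wait: about {years_between_interviews:.0f} years")  # roughly 400
```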
4) They only interview 1000 people, you’d need to interview millions of people to make it accurate!
George Gallup used to use a marvellous analogy when people raised this point: you don’t need to eat a whole bowl of soup to tell if it is too salty; provided it is sufficiently stirred, a single spoonful will suffice. The same applies to polls: provided an opinion poll accurately reflects the whole electorate (e.g. it has the right balance of male and female, the right age distribution, the right income distribution, people from the different regions of Britain in the correct proportions and so on) it will also accurately reflect their opinion.
In the 1930s in the USA the Literary Digest used to do mail-in polls that really did survey millions of people, literally millions. In 1936 they sent surveys to a quarter of the entire electorate and received 2 million replies. They confidently predicted that Alf Landon would win the imminent US Presidential election with 57% of the popular vote and 370 electoral votes. George Gallup meanwhile used quota sampling to interview just a few thousand people and predicted that Landon would lose miserably to Roosevelt. In reality, Roosevelt beat Landon in a landslide, winning 61% of the vote and 523 electoral votes. Gallup was right, the Digest was wrong.
As long as the sample is large enough to dampen down sampling error, it isn’t the number of people interviewed that matters, it is how representative of the population they are. The Literary Digest interviewed millions, but they were mainly affluent people, so their poll wasn’t representative. Gallup interviewed only a few thousand, but his small sample was representative, so he got it right.
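To see why a representative sample of around 1,000 is enough, here is a rough sketch of the standard margin-of-error calculation under simple random sampling (a simplifying assumption – real polls use quotas and weighting, which change the sums a little):

```python
# The margin of error shrinks with the square root of the sample size, so going
# from 1,000 to 2,000,000 respondents buys very little extra precision - while a
# biased sample, like the Literary Digest's, stays biased however big it is.
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p with n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (1_000, 10_000, 2_000_000):
    print(f"n = {n:>9,}: +/- {margin_of_error(n) * 100:.1f} points")
# n =     1,000: +/- 3.1 points
# n =    10,000: +/- 1.0 points
# n = 2,000,000: +/- 0.1 points
```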
5) Polls give the answer the people paying for it want
Most clients are interested in the truth – polls are very expensive, and if you just wanted someone to tell you what you wanted to hear there are far cheaper sources of sycophancy. The overwhelming majority of polling is private commercial polling, not stuff for newspapers, and there clients want the truth, warts and all. Polling companies do political polling for the publicity; there is comparatively little money in it. They want to show off their accuracy to impress big-money clients, so it would be downright foolish for them to sacrifice their chances with the clients from whom they make the real money to satisfy the whims of clients who don’t really pay much (not to mention that most pollsters value their own professional integrity too much!).
6) Pollsters only ask the people who they know will give them the answer they want
Responses to polls on newspaper websites and forums sometimes contain bizarre statements to the effect that all the interviews must have been done in London, the Guardian’s newsroom, Conservative Central Office etc. They aren’t: polls are sampled so that they contain the correct proportion of people from each region of Britain. You don’t have to trust the pollsters on this – the full tables of the polls will normally have breakdowns by demographics including region, so you can see just how many people in Scotland, Wales, the South West, etc answered the poll. You can also see from the tables that the polls contain the right proportions of young people, old people and so on.
7) There is a 3% margin of error, so if the two parties are within 3% of each other they are statistically in a dead heat
No. If a poll shows one party on 46% and another on 45%, you cannot be 95% confident (the confidence level the 3% margin of error is based upon) that the first party is genuinely ahead – for all we know it could really be on 43% – but it is still more likely than not that the party shown on 46% is in front. Nor does the 3% margin of error mean that every figure within that plus-or-minus-3-point range is equally likely: around 50% of the time the “real” figure will be within 1 point of the reported figure.
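As a rough illustration, here is a sketch of those sums under the simplifying assumptions of a simple random sample of 1,000 and no design effects (real polls are weighted, which widens the intervals somewhat):

```python
# With one party on 46% and another on 45%, the 1-point lead is well inside the
# margin of error, yet the party on 46% is still more likely than not to be ahead.
import math

n = 1_000
p1, p2 = 0.46, 0.45
lead = p1 - p2

# Standard error of the difference between two shares from the same poll
# (the two shares are negatively correlated, hence the +2*p1*p2 term).
se_lead = math.sqrt((p1 * (1 - p1) + p2 * (1 - p2) + 2 * p1 * p2) / n)

# Normal approximation to the chance the lead is real.
z = lead / se_lead
prob_ahead = 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(f"Margin of error on each share: +/- {1.96 * math.sqrt(p1 * (1 - p1) / n) * 100:.1f} points")
print(f"Chance the party on 46% really is ahead: about {prob_ahead * 100:.0f}%")  # ~63%
```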
8) Polls always get it wrong
In 1992 the pollsters did get it wrong, and most of them didn’t cover themselves in glory in 1997. However, lessons have been learnt and the companies themselves have changed. Most of the companies polling today did not even exist in 1992, and the methods they use are almost unrecognisable – in 1992 everyone used face-to-face polling and there was no political weighting or reallocation of don’t knows. Today polling is done either by phone or using internet panels, and there are various methods of political weighting, likelihood-to-vote filtering and reallocation of don’t knows. In 2001 most of the pollsters performed well; in 2005 they were all within a couple of points of the actual result; in 2010 they overestimated Lib Dem support, but were very accurate on the gap between Conservative and Labour.
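As an illustration of the weighting idea, here is a minimal sketch of simple demographic weighting – the age bands, raw counts and target shares are invented for the example, and real pollsters weight on several variables at once:

```python
# Each respondent is given a weight so that the weighted sample matches known
# population targets: under-represented groups count for more than one person,
# over-represented groups for less than one.
raw_sample = {"18-34": 180, "35-54": 350, "55+": 470}     # respondents per age band
targets    = {"18-34": 0.28, "35-54": 0.34, "55+": 0.38}  # share each band should have

total = sum(raw_sample.values())
weights = {band: targets[band] / (count / total) for band, count in raw_sample.items()}

for band, w in weights.items():
    print(f"{band}: each respondent counts as {w:.2f} people")
```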
9) Polls never ask about don’t knows or won’t votes
Actually they always do. The newspapers publishing them may not report the figures, but they will always be available on the pollsters’ own websites. Many companies (such as ICM and Populus) not only include don’t knows in their tables, but estimate how they would actually vote if there were an election tomorrow and include a proportion of them in their topline figures.
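As a rough illustration of what reallocating don’t knows involves, here is a sketch in the spirit of that sort of adjustment – the 50% fraction and all the figures are assumptions for the example, not any company’s published methodology:

```python
# A fraction of respondents who say "don't know" is added back to the party they
# recall voting for last time, before the topline percentages are calculated.
stated = {"Con": 340, "Lab": 330, "LD": 130}                # gave a voting intention
dont_knows_by_past_vote = {"Con": 60, "Lab": 50, "LD": 40}  # don't knows, by recalled past vote
reallocate_fraction = 0.5

adjusted = {
    party: stated[party] + reallocate_fraction * dont_knows_by_past_vote[party]
    for party in stated
}
total = sum(adjusted.values())
for party, n in adjusted.items():
    print(f"{party}: {100 * n / total:.1f}%")
```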