Polling myths

Whenever a poll goes up that shows bad news for someone you get the same sort of comments on social media. As I write this piece in May 2017 comments like these generally come from Jeremy Corbyn supporters, but that’s just the political weather at this moment in time. When the polls show Labour ahead you get almost exactly the same comments from Conservative supporters; when UKIP are doing badly you get them from UKIP supporters; when the Lib Dems are trailing you get them from Lib Dem supporters.

There are elements of opinion polling that are counter-intuitive and many of these myths will sound perfectly convincing to people who aren’t versed in how polls work. This post isn’t aimed at the hardcore conspiracists who are beyond persuasion – if you are truly convinced that polls are all a malevolent plot of some sort there is nothing I’ll be able to do to convince you. Neither is it really aimed at those who already know such arguments are nonsense: this is aimed at those people who don’t really want to believe what the polls are saying, see lots of people on social media offering comforting sounding reasons why you can ignore them, but are thinking, “Is that really true, or is it rather too convenient an excuse for waving away an uncomfortable truth…”

1) They only asked 1000 people out of 40 million. That’s not enough

This question has been about for as long as polling has. George Gallup, the trailblazer of modern polling, used to answer it by saying that it wasn’t necessary to eat a whole bowl of soup to know whether or not it was too salty – providing it had been stirred, a single spoonful was enough. The mention of stirring wasn’t just Gallup being poetic, it’s vital. Taking a single spoonful from the top of a bowl of soup might not work (that could be the spot where someone just salted it), but stirring the soup means that spoonful is representative of the whole bowl.

What makes a poll meaningful is not the size of the sample, it is its representativeness. You could have a huge sample that was completely misleading. Imagine, for example, that you did a poll of 1,000,000 over-65s. It would indeed be a huge sample, but it would be heavily skewed towards the Tories and Brexit. What matters is whether the sample is representative of the country. Does it have the correct proportions of men and women? Old and young? Middle class and working class? Graduates and non-graduates? If the sample reflects British society as a whole in all these ways, then it should reflect it in terms of political opinion too. A poll of 1,000 people is quite enough to get a representative sample.
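If you want a feel for the statistical side of this, here is a minimal sketch of why a properly “stirred” random sample of 1,000 is enough – the 42% support figure, and every other number in it, is invented purely for illustration. The key point is that the margin of error depends on the sample size, not the population size, and at n = 1,000 it comes out at roughly plus or minus three points.

```python
import random

random.seed(1)

# Hypothetical population in which 42% back party A.
# (The 42% figure is invented purely for illustration.)
POPULATION_SHARE = 0.42

def poll(sample_size):
    """Simulate a properly 'stirred' (uniform random) sample."""
    hits = sum(random.random() < POPULATION_SHARE for _ in range(sample_size))
    return hits / sample_size

# A random sample of 1,000 lands close to the true figure...
estimate = poll(1000)

# ...because the statistical margin of error shrinks with the square
# root of the sample size: roughly +/- 3 points at n = 1,000 (95% CI).
margin = 1.96 * (POPULATION_SHARE * (1 - POPULATION_SHARE) / 1000) ** 0.5

print(f"estimate: {estimate:.1%}, margin of error: +/-{margin:.1%}")
```

Note that nothing in the margin-of-error calculation refers to the 40 million people in the population – which is exactly why 1,000 respondents can be enough, provided they are representative.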

The classic example of this came at the very birth of modern polling. In the 1936 US Presidential election a magazine called the Literary Digest did a survey of over two million people, drawn from magazine subscribers, telephone directories and so forth, and concluded that Alf Landon would win. The then newcomer George Gallup did a far, far smaller poll, properly sampled by state, age, gender and so on, and correctly showed a landslide for Roosevelt. A poll with a sample skewed towards people wealthy enough to have phones and magazines in depression-era America was worthless, despite having two million respondents.

2) Who do they ask? I’ve never been asked to take part in a poll!

Sometimes this is worked up to “…and neither has anyone I’ve met”, which does raise the question of whether the first thing these people do upon being introduced to a new person is to ask if MORI have ever rung them. That aside, it’s a reasonable question. If you’ve never been polled and the polls seem to disagree with your experience, where do all these answers come from?

The simple answer is that pollsters obtain their samples either by dialling randomly generated telephone numbers or by contacting people who are members of internet panels. Back when polls were mostly conducted by telephone the reason you had never been polled was simple maths – there were about forty million adults in Britain, there were about fifty or so polls of voting intention of a thousand people conducted each year. Therefore in any given year you had about a 0.1% chance of being invited to take part in a poll.
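The arithmetic behind that can be sketched in a few lines, using the article’s own round numbers:

```python
# Back-of-the-envelope odds of being phone-polled in a given year,
# using the article's round numbers.
adults = 40_000_000
polls_per_year = 50
sample_size = 1_000

invitations = polls_per_year * sample_size  # 50,000 interviews a year
chance = invitations / adults               # per-adult probability

print(f"{chance:.3%} chance per year")
print(f"one poll roughly every {1 / chance:.0f} years on average")
```

On those figures you would expect to wait centuries, on average, between invitations – which is why hardly anyone you meet has ever been polled by phone.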

These days most opinion polls are conducted using online panels, but even if you are a member of a panel, your chances of being invited to a political poll are still relatively low. Most panels have tens of thousands of people (or for the better known companies, hundreds of thousands of people) and 95% of surveys are about commercial stuff like brands, pensions, grocery shopping and so on. You could still be waiting some time to be invited to a political one.

3) But nobody I know is voting for X!

We tend to know and socialise with people who are quite like ourselves. Our social circles will tend to be people who live in the same sort of area as us, probably people who have a similar sort of social status, a similar age. You probably have a fair amount in common with your friends or they wouldn’t be your friends. Hence people we know are more likely than the average person to agree with us (and even when they don’t, they won’t necessarily tell us; not everyone relishes a political argument). On social media it’s even worse – a large number of studies have shown that we tend to follow more people we agree with, producing self-reinforcing bubbles of opinion.

During the Labour leadership contest almost every one of my friends who is a member of the Labour party was voting for Liz Kendall. Yet the reality was that they were all from a tiny minority of 4.5% – it’s just that the Labour party members I knew all happened to be Blairite professionals working in politics in central London. Luckily I had proper polling data that was genuinely reflective of the whole of the Labour party, so I knew that Jeremy Corbyn was in fact in the lead.

In contrast to the typical friendship group, opinion poll samples are designed to reflect the whole population and avoid those traps. They will have the correct balance of people from across the country, the correct age range, the correct balance of social class, past vote and so on. Perhaps there are people out there who, by some freak coincidence, have a circle of acquaintances who form a perfectly representative sample of the whole British public, but I doubt there are very many.

4) Pollsters deliberately don’t ask Labour/Conservative supporters

In so far as there is any rationale behind the belief, it’s normally based upon the perception that someone said they were going to vote for X in a poll, and weren’t asked again. As we’ve seen above, it’s a lot more likely that the reason for this is simply that it’s relatively rare to be invited to a political poll anyway. If you’ve been asked once, the chances are you’re not going to be asked again soon, whatever answers you gave.

Under the British Polling Council rules polling companies are required to publish the details of their samples – who was interviewed, what the sample was weighted by and so on. These days almost every company uses some form of political sampling or weighting to ensure that the samples are politically representative. Hence in reality pollsters deliberately include a specific proportion of 2015 Labour supporters in their polls, generally the proportion who did actually vote Labour in 2015. Pollsters are required to report these figures in their tables, or to provide them on request. Hence, if you look at last weekend’s Opinium poll you’ll find that 31% of the people in the poll who voted in 2015 voted Labour – the proportion that actually did. If you look at the ICM poll you’ll find that 31% of the people who voted at the last election say they voted Labour – again, the proportion that actually did – and so on with every other company.
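As a rough sketch of how past-vote weighting of this sort works – the raw counts and target shares below are invented for illustration, real targets would come from the actual 2015 result and each company’s own methodology – each past-vote group is given a weight that scales its share of the sample up or down to the target share:

```python
# Hypothetical raw sample: respondent counts by recalled 2015 vote.
# Both the counts and the target shares are invented for illustration.
raw = {"Con": 360, "Lab": 350, "LD": 70, "UKIP": 130, "Other": 90}
target = {"Con": 0.38, "Lab": 0.31, "LD": 0.08, "UKIP": 0.13, "Other": 0.10}

total = sum(raw.values())  # 1,000 respondents

# Weight each past-vote group so its weighted share hits the target:
# a group making up 35% of the sample with a 31% target gets ~0.89.
weights = {p: target[p] / (raw[p] / total) for p in raw}

weighted_total = sum(raw[p] * weights[p] for p in raw)
weighted_share = {p: raw[p] * weights[p] / weighted_total for p in raw}

print(f"Lab weight {weights['Lab']:.2f} -> "
      f"weighted Lab share {weighted_share['Lab']:.0%}")
```

Far from excluding supporters of one party, the weighting guarantees that each party’s 2015 voters appear in exactly the right proportion.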

5) Pollsters are biased, and fix their figures

Again, this is an accusation as old as polling – if you don’t like the message, say the person making it is biased. It’s made easier by the fact that a lot of people working in political polling do have a background in politics, so if you want someone to build a conspiracy theory upon, you don’t need to look far. Over the years I think we’ve been accused of being biased towards and against every party at one time or another – when Labour were usually ahead in the polls YouGov used to be accused of bias because Peter Kellner was President; when the Conservatives were ahead different people accused us of being biased because Stephan Shakespeare was the CEO. The reality is, of course, that polling companies are made up of lots of people with diverse political views (which is, in fact, a great benefit when writing questions – you can get the opinion of colleagues with different views to your own when making sure things are fair and balanced).

The idea that polling companies would bias their results towards a particular party doesn’t really chime with the economics of the business or the self-interest of companies and those who run them. Because political polls are by far the most visible output of a market research company there is a common misapprehension that they bring in lots of money. They do not. Political polling brings in very little money and is often done as a loss-leader, a way for companies to advertise their wares to the commercial clients that spend serious money on research into brand perceptions, buying decisions and other consumer surveys. Voting intention polls are one of the very few measures of opinion that get checked against reality – they are done almost entirely as a way of a company (a) getting its name known and (b) demonstrating that its samples can accurately measure public opinion and predict behaviour. Getting elections wrong therefore risks huge reputational damage to market research companies and, with it, huge financial cost to those running them. It would be downright perverse to deliberately get those polls wrong.

6) Polls always get it wrong

If the idea that polling companies would ruin themselves by deliberately getting things wrong is absurd, the idea that polls can get it wrong through poor design is sadly true: polls obviously can get it wrong. Famously they did so at the 2015 general election. Some polls also got Brexit wrong, though the picture is more mixed than some seem to think (most of the campaign polls on Brexit actually showed Leave ahead). Polls tend to get it right a lot more often than not though – even in recent years, when their record is supposed to have been so bad, the polls were broadly accurate on the London mayoral election, the Scottish Parliamentary election, the Welsh Assembly election and both of the Labour party leadership elections.

Nevertheless, it is obviously true to say that polls can be wrong. So what’s the likelihood that this election will be one of those occasions? Following the errors of the 2015 general election the British Polling Council and Market Research Society set up an independent inquiry into the polling error and what caused it, under the leadership of Professor Pat Sturgis at Southampton University. The full report is here, and if you want to understand how polling works and what can go wrong, it is worth putting aside some time to read. The extremely short version, however, is that the polls in 2015 weren’t getting samples that were representative enough of the general public – people who agreed to take part in a phone poll or join an internet panel weren’t quite normal: they were too interested in politics, too engaged, too likely to vote.

Since then polling companies have made changes to try and address that problem. Different companies have taken different approaches, but the most significant are a mix of new controls on samples by education and interest in politics, and changes to turnout models. We obviously won’t know until the election has finished whether these have worked or not.
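As an illustration of what a turnout model can look like, here is one simple form that some companies have used in various guises – weighting each response by the respondent’s self-rated likelihood to vote on a 0–10 scale, rather than counting everyone equally. The respondents below are invented, and real models are considerably more sophisticated:

```python
# Invented respondents: current preference plus self-rated 0-10
# likelihood to vote.
respondents = [
    {"vote": "Con", "likelihood": 10},
    {"vote": "Lab", "likelihood": 6},
    {"vote": "Con", "likelihood": 9},
    {"vote": "Lab", "likelihood": 10},
    {"vote": "LD",  "likelihood": 7},
]

def weighted_shares(people):
    """Share of each party, counting a 10/10 respondent as a full
    vote and (say) a 5/10 respondent as half a vote."""
    total = sum(p["likelihood"] for p in people)
    shares = {}
    for p in people:
        shares[p["vote"]] = shares.get(p["vote"], 0) + p["likelihood"] / total
    return shares

print(weighted_shares(respondents))
```

Small differences in how certain each party’s supporters say they are to vote can therefore move the headline figures, which is why turnout modelling is one of the main things companies changed after 2015.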

So in that context, how does one judge current polls? Well, there are two things worth noting. The first is that while polls have sometimes been wrong in the past, their error has not been evenly distributed. They have not been just as likely to underestimate Labour as they have been to overestimate Labour: polling error has almost always overstated Labour support. If the polls don’t get it right, then all previous experience suggests it will be because they have shown Labour support as too *high*. Theoretically polls could have tried too hard to correct the problems of 2015 and be overstating Conservative support, but given the scale of the error in 2015 and the fact that some companies have made fairly modest adjustments, that seems unlikely to be the case across the board.

The second is the degree of error. When polls are wrong, they are only so wrong. Even in the elections where the polls got it most wrong, like 1992 and 2015, their errors were nowhere near the size of the Conservative party’s current lead.

The short version is: yes, the polls could be wrong, but even the very worst polls have not been wrong by enough to cancel out the size of lead that the Tories currently have, and when the polls have been that wrong, it has always been by putting Labour too high.

So, if you aren’t the sort to go in for conspiracy theories, what comfort can I offer if the polls aren’t currently showing the results you’d like them to? Well, first the polls are only ever a snapshot of current opinion. They do not predict what will happen next week or next month, so there is usually plenty of time for them to change. Secondly, for political parties polls generally contain the seeds of their salvation, dismissing them misses the chance to find out why people aren’t voting for you, what you need to change in order to win. And finally, if all else fails, remember that public opinion and polls will eventually change, they always do. Exactly twenty years ago the polls were showing an utterly dominant Labour party almost annihilating a moribund Tory party – the pendulum will likely swing given enough time, the wheel will turn, another party will be on the up, and you’ll see Conservative party supporters on social media trying to dismiss their awful polling figures using exactly the same myths.


117 Responses to “Polling myths”

  1. Like the line: “people who agreed to take part in a phone poll, or join an internet panel weren’t quite normal”.

  2. Beautifully explained… thank you

  3. So why, then, have polls generally tended to overstate Labour support?

  4. Anthony, any of us who have made a career in statistics would acknowledge this as a great article. Explaining statistical methodology and making yourself look clever but incomprehensible is easy. Tailoring your message in a way that the man in the street understands without sounding patronising is far more difficult. Well done and thanks

  5. Anthony,

    Ah but your a Polster so you would say that!!!!!

    Peter.

  6. Do pollsters keep the raw data in an unaggregated form?

    I ask because one of the issues pollsters seem to have is with getting the right turnout figures, as there is disagreement within the community as to how to weight by turnout.

    I think I recall that Yougov does a recontact style of poll at elections so that out of the people contacted, you know who actually voted and who didn’t.

    What sort of accuracy do you get in terms of predicting who would vote and who wouldn’t? I suspect that the answer to the question “Will you vote?” will be only one of many factors which are significant with regard to whether someone votes or not.

  7. Clive – a good question. Personally I am convinced by the explanation that the BPC/MRS inquiry came up with (and which chimed with YouGov’s own internal analysis here): that it was down to sampling, and specifically that it was down to samples being too engaged. That, in turn, messed up differential turnout and ended up overstating Labour.

    Going back into history, you get different reasons as well – so when there were human interviewers “shy Tories” may genuinely have been a problem. The 1970 election finishing fieldwork too early was a problem. My guess is that the current problems of over-engaged respondents may have been an issue at some earlier elections too, but has become more serious as response rates fell.

  8. Alan – Yes, we do.

    There are two different things in terms of re-contact. There is asking people beforehand and then asking them again afterwards, to see how your model’s prediction correlates with people’s self-reported turnout. That’s good, but the real gold standard is to go and look the individual people up on the marked electoral register afterwards to see if they *really* voted. That’s backbreakingly cumbersome and time-consuming (political parties are entitled to copies of the marked register; pollsters are not – if you want to do it, you need to do it using pencil and paper in town hall electoral services departments).

    In our experience, just asking people wasn’t actually a bad predictor (there are some graphs illustrating it on p.43 of the BPC inquiry report), but it didn’t work so well for other companies.

  9. Thank you Anthony. I posted a question on an earlier thread that John Chanin was kind enough to answer and this answers it even more fully, is detailed and very interesting. I wonder, though, can any internet poll be representative when there is a certain demographic that doesn’t use the internet much (or at all) and doesn’t ever visit polling websites? Do you have an idea who these people are? Are they older? Less educated? Are they more likely to switch their votes? How do you factor them into your figures?

  10. @Anthony

    Actually, I think in general terms the pollsters should have a relatively tranquil election, free from the kind of nonsense you refer to in your blog.

    No-one seriously doubts that the Tories will win by a big margin. The only question is how big?

  11. I thought Chuka was pretty effective for Labour on Channel 4 News. But it was pretty clear that he had circled the wagons around his constituency and he is not coming out.

    Looks like the Tories are back up to 48 in the latest YouGov. And this before today! Could break 50 on the morrow!

    Why has no-one from our highly paid state broadcaster asked Barnier what he meant by saying that if you walk the hills (i.e. TM) you must watch out for illness? On the basis that he uses words carefully, was this a reference to TM’s diabetes? He should be asked. Our brave BBC would have been all over Nuttall if he had said it.

  12. @tintin, you’ve answered your own question. Both sample design and sample METHODS can skew results. This explains why telephone, face to face and internet polls for example are not representative. For a survey to be accurate it has to represent the population from which it is drawn which is not always the case when a survey precludes some or favours others.

  13. It’s very frustrating how the media refuse to represent survey data accurately and to distinguish clearly between representative polls and unrepresentative ones.

    Earlier tonight the Guardian claimed: “A poll this week suggested that 65% of those who backed Mélenchon will not vote for Macron.” ( https://www.theguardian.com/world/live/2017/may/03/emmanuel-macron-marine-le-pen-final-french-presidential-election-2017-debate-live-coverage )

    The so-called “poll” consisted of Mélenchon’s online consultation of those who had signed up as supporters on his website. First of all, the responses to the consultation are unweighted. Secondly, this is a completely different group of people from “those who backed Mélenchon” and therefore tells us nothing about what proportion of Mélenchon voters will vote for Macron. From actual weighted opinion polls, we know that about half of Mélenchon voters are likely to vote for Macron – not one-third as the Guardian wrongly suggested.

    The headline and first two paragraphs of the original story are equally badly misleading ( https://www.theguardian.com/world/2017/may/02/majority-of-melenchon-supporters-will-not-back-emmanuel-macron-poll-finds ).

  14. Excellent, Anthony, thank you very much.

  15. …I actually think Anthony’s article explains this nicely as in the example of George Gallup’s smaller sample being more representative of the population as a whole.

  16. tintinhaddock – using the internet less is no problem at all, so long as *some* people in every group use the internet. So for example, elderly people are less likely to use the internet, but enough elderly people do that you can deliberately recruit them and invite them to surveys in the correct proportion. Visiting polling websites isn’t necessarily an issue either – most recruitment to panels isn’t passively waiting for people to come along, it’s targeted advertising and recruitment of people you are short of (e.g. if you were really short of elderly women, you’d advertise on sites that were popular with elderly women).

    Of course, there are probably some groups it’s almost impossible to reach – the same with any other form of sampling. Ex-pat voters, for example, are wholly absent from all polls. You just need to hope that those groups are small enough not to matter.

  17. Anthony,

    Great piece, thank you.

    What is your view on whether opinion polls should be allowed to be published during campaigns? I know turkeys don’t vote for Christmas, but….
    Surely propensity to vote and destination of vote will be influenced by expectation of overall outcome?

  18. In a shocked response to the comments of TM today she has been accused of ” trying to influence the election” and of “playing politics” during…..er….. an election campaign.

    Perhaps her opponents need reminding that the object of an election is to win it. I would ,after all, be very concerned if the leader of a party was NOT trying to influence the election.

  19. In terms of sampling the “shy tories” we shouldn’t forget those other sources of Tory voters – older voters in residential homes, who get bussed to the polls in their droves on the day, plus those who simply are too wealthy to waste their time taking part in “surveys” or “panels” online. And in my political life it has always been the case (especially when canvassing) that when someone says “no thank you” and shuts the door in your face, they are going to vote Tory. If they were Labour or UKIP they would tell you. I imagine a similar attitude prevails with pollsters.

  20. @S THOMAS

    Looks like the Tories are back up to 48 in the latest YouGov. And this before today! Could break 50 on the morrow

    Where’s this from?

  21. HertsAndy – I suppose turkeys do vote for Christmas, since…

    (a) it would be brilliant financially for pollsters (currently voting intention polls during election periods are so common they have hardly any financial worth at all, since they are there for free in all the papers). If, however, it was illegal to publish them, we could sell them privately to financial institutions, hedge funds, currency speculators, etc, and make a fortune. If polls were banned during election campaigns, I could buy a yacht!

    but

    (b) I still think it would be a bloody silly thing to do. Propensity to vote and destination of vote would probably still be influenced without polls, but they’d be influenced by dodgy rumours of private polls, by the groupthink and speculation of political journalists, by silly voodoo polls and so on. There was a lovely example in the Oldham by-election, when in the absence of any polling evidence all the political commentators and newspapers convinced themselves that UKIP were in the running… only for them to come nowhere. People’s vote would presumably still have been influenced by the perception it was close between Labour and UKIP… the only difference was that they were being influenced by less accurate information.

    Some people are always going to be influenced by what they think the tactical political situation is and by what they think other people are doing. That is their right, to use their vote as they see fit… but I see nothing wrong with them having the most accurate information possible to base it upon, rather than rumour and leaks.

  22. Morfsky – Sam Coates from the Times tweeted it earlier (and it is, indeed, correct. I’ll probably post on it tomorrow since there are some really good questions in there that Sam hasn’t tweeted out yet)

  23. Another great article Anthony.

    S Thomas
    “Perhaps her opponents need reminding that the object of an election is to win it.”

    I’m not sure that JC thinks that. :-)

  24. Yougov and tns have conservatives and labour on 48/29 and 48/24 respectively

  25. @Mossy

    Could be a house effect for TNS as most other pollsters have Lab now around the 28-30% mark.

  26. Last 5 YouGov polls:

    Con: 48, 48, 45, 44, 48.
    Lab: 24, 25, 29, 31, 29.
    LD: 12, 12, 10, 11, 11
    Ukip: 7, 5, 7, 6, 5

    Selection – YouGov polls published 19th April to 3rd May.

    Looking remarkably steady. Labour up a few points otherwise no real change in 2 weeks.

  27. @RAF

    True but to compare like with like You Gov represented CON +4 LAB -2 and TNS report CON +2 LAB nc

  28. But surely the only things telling us that a poll of 1m over 65’s would be skewed towards Tories and Brexit are the polls, and they almost always get it wrong…

  29. Excellent piece – the biggest frustration about your final point 6 (‘polls always get it wrong’) is that it is used so often in exactly those terms as a way of killing sensible discussion in situations where polling is useful and necessary and often the only way of having some idea of what might happen. There seems little understanding generally of the point that the Brexit polls were pretty accurate within the margins of error, in a situation where the outcome between two choices was clearly so tight down to the wire.

  30. Another very good post, Anthony. Thanks

  31. @RAF – On the EU not caring if we Remain.

    That’s clearly not the case from the quotes and actions from both EU bigwigs to their placemen in the UK.

    Brexit has even been referred to as a crime.

    It’s like “Hotel California.” Nobody was ever meant to leave. In fact, there was no legal mechanism to actually leave until the British insisted on Article 50 of the Lisbon Treaty – which Lord Kerr primarily drafted.

    Art 50 is incredibly light on detail. Kerr admits that nobody in Europe thought it was remotely likely that anyone would activate it.

  32. I don’t agree that Opinion Polls are particularly inaccurate.

    Pollsters can never aspire to be oracles. They can only measure what people SAY their intentions are, at the time of survey.

    And don’t forget that some people actually decide how they’re going to vote based upon what they’ve seen in the polls.

    Thus in 2015 people could see from the polls that a likely outcome of the election was a Labour Government dependent upon the support of the SNP. So rather than risk that happening they voted Tory whereas without the polling information they might not have voted at all, or even voted Labour.

    Polls can’t easily measure effects like that, because the effect is dependent upon the people surveyed knowing the result of the poll before they’ve actually been asked.

    If we want to predict the result of an election we need, once we know what the raw polling data says, to apply our own judgement as to what’s going to happen on the day, not assume that the poll itself can predict the result.

    But even at their least accurate, e.g. 2015, opinion polls taken in the final days are remarkably close to the actual results. If I were launching a product and a survey gave me a forecast to within 3% of the number of people in the population who would buy it, I’d be delighted.

    The real accuracy of Opinion polls however, is apparent from the EXIT polls, which are remarkably accurate, the reason being, that people are telling the pollsters how they voted a few minutes previously, and are able to do so by putting their answer anonymously in a box.

    That way, the quiet people who tend to vote Conservative and don’t like expressing their political views in public, are more likely to be counted accurately. It also avoids the phenomenon where Left Wing people who are more interested in expressing their views publicly than in actually turning up to vote are not overstated in the figures.

    Conservative voters also tend to see Polling Day itself as their big day, whereas Left Wing people tend to see it as part of a political campaign which, for them, continues day in, day out.

    There will be significant numbers of people who joined the Labour Party to vote for Corbyn as Leader who won’t vote at all on June 8th or will vote Green or Lib Dem. They’ll never admit that in an Opinion Poll but personal experience of meeting them, and hearing what they say allows us to guess that it must be true.

    The most obvious example of this phenomenon was in the Richmond By Election where Labour received fewer votes than it has Party Members in the constituency.

  33. Great read thanks. It would be nice to have a 2017 poll done with the 2015 methods and compare the difference

  34. French polling seems to be far more accurate (and less varied) than the uk polls. Why is that?

  35. AW
    VG and beautifully communicated.
    What I miss (you won’t be surprised to hear) is how do people who are politically engaged, who may have a hand in policy formulation within the party system, whether or not specifically directed to influencing VI – say on energy policy, NHS and care, or migration – should read the polls as indicators of public response?
    I can’t help feeling that there is some sense in which this is thought to be not the proper ground on which to judge reliability in the polls – yes in the sense that the polls have to obey the technical principles you so admirably describe; but no, in the sense that this is how and why the polls are read, and sometimes manipulated.
    Because what the politically engaged avidly do – to assist them in policy making, or in its communication.

  36. “Whenever a poll goes up that shows bad news for someone you get the same sort of comments on social media.”

    That’s what it’s for. :D

    Any response is better than none.

  37. Thank you Anthony, I’m not a pollster and at times the conversation on these threads seems like the ‘black arts’!
    But this is useful for me to direct friends to, who often say ‘all of the above’ in defence of their political bias and the polls.

    I love the soup analogy…. but then, I love soup..
    Does the flavour matter?… :-))

  38. Anthony Wells
    ” Ex-pats voters, for example, are wholly absent from all polls. You just need to hope that those groups are small enough not to matter.”

    Do you know that for a fact (i.e., is there a question, ‘are you an expat?’)? Or are you assuming? I would come under the expat category and whilst I am not currently part of a polling panel, I have been a member of YouGov in the past. I got fed up of being polled on brands and never a political one, so I came off, and of course your excellent summary explains why that is.

    Whilst I am in France, my satellite ISP is based in Manchester so anyone looking at my IP address will assume that I am there. So if you are identifying people by IP address you won’t always get an accurate idea of which country people are in. And of course others use VPNs.

    (It’s also great for overcoming the nonsensical viewing restrictions that the broadcasters impose if one lives abroad.)

  39. Anthony, I understand what you are saying but can I just give an anecdote? I live in a large four-storey house in London (not Islington, but that kind of thing). Every month we have a group of five window cleaners (aged from 50-60) who do all the houses in the street in a morning. A great and friendly bunch of fellas, they always finish up with me for tea and biscuits and a “putting the world to rights” chat. All are now Tories and mock the Labour posters put up by my well-to-do neighbours (though strangely, none so far this election).
    My point is these guys ARE engaged with politics but would never respond to an internet poll. I just can’t see how you reach this ‘type’ of person. You can weight by class, age, previous vote etc, but these “no nonsense” types (who all voted Brexit and have now switched from Labour to Tory) are not on your radar. Or are they?

  40. AW

    You touch on pollsters deliberately putting a wrong poll out, but does that not highlight the pressures on pollsters? You tell us that these polls are a shop window for more lucrative work. Does this not encourage defensive interpretation/weighting changes? If one pollster shows a 5% lead for a particular party but all the others show them level, there is surely commercial pressure to return to the pack, and I would be surprised if in such circumstances senior management did not become involved. Does this happen? Is it not commercially better to be wrong with everyone else than wrong out there by yourself?

    The issue is not deliberate wrong polling but defensive interpretation/weighting for commercial reasons.

  41. Sea Change,
    I think the accusation is that TM has used the EU negotiations for electoral purposes in a way that is not in the best interests of the country. As PM, whatever the provocation from some EU bods, she should refrain from bellicose language, as Merkel does, since negotiations (and hence the country’s best interests) are better served through respectful engagement – it is what good leaders do, but maybe Trump’s style is affecting her?
    Whether that amounts to putting party before country may depend on one’s partisan position.

  42. Very nice article. One extra thing would be nice to clarify: though the statistical sampling errors are far easier to understand than the systematic problems of ‘stirring’ the sample, it would be nice at least to say how big these are for a sample of 1000. Do you have an article elsewhere on this?
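    For anyone wondering, the pure sampling error for a sample of 1,000 can be sketched under the idealised assumption of a simple random sample (real polls are quota or panel samples, so the true uncertainty is somewhat larger):

    ```python
    import math

    def margin_of_error(p, n, z=1.96):
        """95% margin of error for a proportion p estimated from a
        simple random sample of size n."""
        return z * math.sqrt(p * (1 - p) / n)

    # Worst case is p = 0.5: roughly +/- 3 points for n = 1000
    print(f"+/- {margin_of_error(0.5, 1000) * 100:.1f} points")
    ```

    That is where the familiar ‘plus or minus 3 points’ figure for a 1,000-person poll comes from.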

  43. @David Colby,
    While it may be true that French polling has been more accurate (perhaps particularly compared with UK 2015), I think we’d need to see a full analysis. When I hear journalists remark on how accurate French polling is, I don’t trust their verdict because I never trust anything that journalists say about polls. It does appear they did a very good job ( https://en.wikipedia.org/wiki/Opinion_polling_for_the_French_presidential_election,_2017#Graphical_summary_2 ) – however, they all overestimated Le Pen’s vote, in the case of the final two pre-election polls by almost 2%, and they underestimated Fillon, in the case of the final polls by about 1%.

    These are small errors (and well within the margins of error) but it’s worth noting that if Fillon’s vote had been higher by 0.7% and Le Pen’s lower by 0.7%, then the second round would have been Fillon v Macron, and everyone would now be talking about how untrustworthy the polls are (even though they would still be well within their margins of error!).

  44. When there are big swings, are the polls less accurate?

    I would think that calculation methods are less likely to be reliable when there are large changes, especially if they relate to one or two issues that are extraordinary.

  45. Fair enough, but when have you ever seen a polling run in the UK as constant as this:
    Macron round two: 60, 60, 60, 59, 59.5, 59, 60, 59, 59, 61, 60, 59, 60, 59, 60, 61, 60.5, 59, 60.5

    And that’s from nine different companies.
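    Just to illustrate how tight that run is, the mean and spread of the figures quoted above can be checked in a couple of lines (numbers taken verbatim from the comment):

    ```python
    from statistics import mean, pstdev

    # Macron round-two shares from the nineteen polls quoted above
    shares = [60, 60, 60, 59, 59.5, 59, 60, 59, 59, 61,
              60, 59, 60, 59, 60, 61, 60.5, 59, 60.5]

    print(f"mean = {mean(shares):.1f}, sd = {pstdev(shares):.2f}")
    ```

    The entire run sits within about a point and a quarter of its average – well inside a single poll’s margin of error.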

  46. Myth number 7?

    Polls Predict the Future.

    People can change their minds on the morning of 8/6/17. I think this is part of what happened in 2015: many English voters noticed, at the last minute, that they would in effect be supporting the SNP if they voted Labour.

    “Late Surges” are a very real possibility.

  47. AW

    Thank you for a very good presentation of the Polling World. My only question would be how representative the 1,000 being questioned can be when radically different battles are being fought in different parts of the electorate.

    E.g. the electoral battle in England’s cities is quite different from that being fought in the rural areas. It could be argued that the overall picture in England is balanced out by the numbers involved in polling – out of 1000, nearly 900 will be from England – and I would agree.

    The problem, in my opinion, comes when a large number of seats can be decided by a small percentage of the electorate who are engaged in a discussion which is radically different from that involving the vast majority. This, obviously, applies to the situation in Scotland, where (if memory serves) out of the 1000 asked GB wide in one of yesterday’s polls, only c.72 were from Scotland.

    Proportionately, of course, this was the correct number. Yet because the Scottish battles are quite different from those south of the Border, the margin of error becomes very great and it is difficult to know with any certainty how reliable the sub-samples are.

    I realise that pollsters are constantly searching for ways of improving their methods, but my guess is that, where there are situations such as the present variety in GB there may be limits to what can be achieved.

    Once again, many thanks for the clear and concise presentation.
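    The point about sub-samples can be made concrete. Treating the Scottish portion as if it were a simple random sample (an idealisation – cross-breaks are not separately weighted, so this if anything understates the problem), the margin of error balloons:

    ```python
    import math

    # 95% margin of error at the worst case (p = 0.5) for the full GB
    # sample versus a Scottish cross-break of c.72 respondents
    for n in (1000, 72):
        moe = 1.96 * math.sqrt(0.25 / n) * 100
        print(f"n = {n:4d}: +/- {moe:.1f} points")
    ```

    Around plus or minus 11-12 points for n = 72, which is why a single poll’s Scottish cross-break tells you very little.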

  48. A couple of observations on polling.

    1) Skewed questions – I notice this ‘all the time’. Recent examples:
    Question: 1. Do you want a referendum within 2 years? 2. A referendum in 2 years? 3. No referendum? Then trumpet the 27% that selected option 1 and ignore the 23% that selected option 2.
    Question: If May gets X, Y, Z etc., do you think this is a good deal for Britain? Of course folk do – it’s having our cake and eating it.

    These answers are then used to shape opinion.

    Another observation is the gap between the poll being published and the tables being available, allowing the newspapers to spin, as they did with the questions mentioned above and also recently with the Telegraph’s ‘55% back Brexit’ headline, whereas the question was actually whether May was doing a good job of negotiating Brexit.

    So, in short, while the polls are likely a reasonable snapshot, they are in many cases used to shape opinion, not to report opinion.

  49. I agree with those who say how helpful this analysis has been. One of the issues that arose regarding the referendum polls was that people who did not normally vote (usually in seats where the result was deemed a foregone conclusion) turned out to vote in the referendum, where each vote counted equally. Is it possible that they were not on the political radar for polling?

  50. On who knows best what’s going on – pollsters or….

    I would contend that the political organisers of the parties know best. Canvassing can be a little like exit polling, in that results from streets can be compared to previous years and extrapolated. They of course have little interest in telling anyone the truth.
