Donald Trump has won, so we have another round of stories about polling shortcomings, though thankfully it’s someone else’s country this time round (this is very much a personal take from across an ocean – the YouGov American and British teams are quite separate, so I have no insider angle on the YouGov American polls to offer).

A couple of weeks ago I wrote about whether there was potential for the US polls to suffer the same sort of polling mishap as Britain had experienced in 2015. It now looks as if they have. The US polling industry actually has a very good record of accuracy – they obviously have a lot more contests to poll, a lot more information to hand (and probably a lot more money!), but nevertheless – if you put aside the 2000 exit poll, you have to go back to 1948 to find a complete polling catastrophe in the US. That expectation of accuracy means they’ll probably face a lot of flak in the days ahead.

We in Britain have, shall I say, more recent experience of the art of being wrong, so here’s what insight I can offer. First the Brexit comparison. I fear this will be almost universal over the next few weeks, but when it comes to polling it is questionable:

  • In the case of Brexit, the polling picture was mixed. Put crudely, telephone polls showed a clear lead for Remain, online polls showed a tight race, with Leave often ahead. Our media expected Remain to win and wrongly focused only on those polls that agreed with them, leading to a false narrative of a clear Remain lead, rather than a close-run thing. Some polls were wrong, but the perception that they were all off is wrong – it was a failure of interpretation.
  • In the case of the USA, the polling picture was not really mixed. With the exception of the outlying USC Dornsife/LA Times poll, the national polls tended to show Clinton leading, backed up by state polls also showing Clinton leads consistent with the national picture. People were quite right to interpret the polls as showing Clinton heading towards victory… it was the polls themselves that were wrong.

How wrong were they? As I write, it looks as if Hillary Clinton will actually get the most votes, but lose in the Electoral College. In that sense, the national polls were not wrong when they showed Clinton ahead – she really was. It’s one of the most frustrating situations to be in as a pollster: statistically you are correct… but your figures have told the wrong narrative, so everyone thinks you are wrong. That doesn’t get the American pollsters off the hook though: the final polls were clustered around a 4 point lead for Clinton, when in reality it looks to be about 1 point. More importantly, the state polls were often way out – polls had Ohio as a tight race when Trump stomped it by 8 points. All the polls in Wisconsin had Clinton clearly ahead; Trump won. Polls in Minnesota were showing Clinton leads of 5-10 points; it ended up on a knife edge. Clearly something went deeply wrong here.

Putting aside exactly how comparable the Brexit polls and the Trump polls are, there are some potential lessons in terms of polling methodology. I am no expert in US polling, so I’ll leave it to others more knowledgeable than I to dig through the entrails of the election polls. However, based on my experiences of recent mishaps in British polling, there are a couple of places I would certainly start looking.

One is turnout modelling – US pollsters often approach turnout in a very different way from how British pollsters traditionally did it. We’ve always relied on weighting to the profile of the whole population and asking people if they are likely to vote. US pollsters have access to far more information on which people actually do vote, allowing them to weight their samples to the profile of actual voters in a state. This has helped the normally good record of US pollsters… but it carries a potential risk if the type of people who vote changes – if there is an unexpected increase in turnout among demographics who don’t usually vote. This was one of the ways British pollsters did get burnt over Brexit. After getting the 2015 election wrong, lots of British companies experimented with a more US-style approach, modelling turnout on the basis of people’s demographics. Those companies then faced problems when there was unexpectedly high turnout from more working-class, less well-educated voters at the referendum. Luckily for US pollsters, the relatively easy availability of data on who voted means they should be able to rule this in or out quite easily.
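
To make that risk concrete, here is a minimal sketch of cell weighting to a voter profile. Everything in it – the education split, the vote shares, the turnout targets – is invented purely for illustration, not drawn from any actual poll or election data:

```python
"""A minimal sketch of weighting a poll sample to the demographic profile of
past voters, and what happens when turnout shifts. All numbers (the education
split, vote shares, turnout targets) are invented for illustration."""

from collections import Counter

# Hypothetical sample of 100 respondents: (education_group, vote_intention)
sample = (
    [("graduate", "Clinton")] * 40 + [("graduate", "Trump")] * 20 +
    [("non_graduate", "Clinton")] * 15 + [("non_graduate", "Trump")] * 25
)

# Two alternative weighting targets: each group's share among people who voted
# last time, and its share in an electorate with a turnout surge among
# non-graduates. Both sets of figures are made up.
targets_last_election = {"graduate": 0.45, "non_graduate": 0.55}
targets_turnout_surge = {"graduate": 0.38, "non_graduate": 0.62}

def weighted_lead(sample, targets):
    """Clinton's lead in points after weighting each group to its target share."""
    n = len(sample)
    group_counts = Counter(group for group, _ in sample)
    lead = 0.0
    for group, vote in sample:
        weight = targets[group] / (group_counts[group] / n)  # cell weight
        lead += weight * (1 if vote == "Clinton" else -1)
    return 100 * lead / n

print(f"Weighted to last election's voters: Clinton {weighted_lead(sample, targets_last_election):+.1f}")
print(f"Weighted to the surge electorate:   Clinton {weighted_lead(sample, targets_turnout_surge):+.1f}")
```

The same raw responses give a modest Clinton lead under the first target and a Trump lead under the second; if the weighting scheme assumes the wrong electorate, the poll is wrong before anyone looks at the answers.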

The second is sampling. The inquiry into our general election polling error in 2015 found that unrepresentative samples were the core of the problem, and I can well imagine that this is a problem that risks affecting pollsters anywhere. Across the world landline penetration is falling, response rates are falling, and it seems likely that the dwindling number of people still willing to take part in polls are ever more unrepresentative. In this country our samples seemed to be skewed towards people who were too educated, who paid too much attention to politics and followed the news agenda and the political media too closely. We under-represented those with little interest in politics, and several UK pollsters have since started sampling and weighting by political attention to try and address the issue. Were the US pollsters to suffer a similar problem, one can easily imagine how it could result in polls under-representing Donald Trump’s support. If that does end up being the case, the question will be what US pollsters do to address the issue.
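
Again purely as an illustration – the response rates and vote shares below are made up – here is a toy simulation of how a response-rate gap between high- and low-attention voters skews a raw sample, and how weighting by political attention can pull it back:

```python
"""A toy simulation of differential non-response: politically engaged people
answer polls more readily, and engagement correlates with vote choice. The
response rates and vote shares below are invented."""

import random
from collections import Counter

random.seed(2016)

# Hypothetical electorate split 50/50 between high- and low-attention voters,
# with Trump doing a little better among the low-attention half (made-up shares).
STRATA = [
    {"attention": "high", "p_trump": 0.44, "p_respond": 0.30},
    {"attention": "low",  "p_trump": 0.52, "p_respond": 0.10},
]

def simulate_poll(n_contacts=40000):
    """Contact people at random; only some respond. Return the raw and the
    attention-weighted Trump share among respondents."""
    respondents = []
    for _ in range(n_contacts):
        stratum = random.choice(STRATA)  # each stratum is half the electorate
        if random.random() < stratum["p_respond"]:
            vote = "Trump" if random.random() < stratum["p_trump"] else "Clinton"
            respondents.append((stratum["attention"], vote))

    n = len(respondents)
    raw = sum(vote == "Trump" for _, vote in respondents) / n

    # Re-weight so the attention groups match their true 50/50 population split.
    counts = Counter(att for att, _ in respondents)
    weights = {att: 0.5 * n / counts[att] for att in counts}
    weighted = sum(weights[att] * (vote == "Trump") for att, vote in respondents) / n
    return raw, weighted

raw, weighted = simulate_poll()
print(f"Trump share, raw respondents:       {raw:.1%}")
print(f"Trump share, weighted by attention: {weighted:.1%}")
# The true share in this toy electorate is 0.5 * 0.44 + 0.5 * 0.52 = 48%.
```

The catch, of course, is that the correction only works if you measure political attention and know its true distribution in the electorate – which is exactly the information an unrepresentative sample struggles to supply.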


1,352 Responses to “Why were the US polls wrong?”

  1. tancred,
    “How can you expect people to back the EU after forty years of negative messages? I’m surprised the Brexit vote wasn’t higher.”

    This is of course true.

    Neil A,
    “I am not sure I agree that the reason Britain has never felt part of Europe is a lack of education about the EU”

    Rather, we have had a campaign of misinformation by government, as Tancred observed.

    Cloudspotter,
    “You may remember people really drilling down into the statistics after each poll. Now that would surely feel something like castles built on sand.”

    Yes and no. Polls have always had built-in errors, and have at times been dramatically wrong. However, they are also misinterpreted as certain when they are not and never have been. A poll is a guide, and arguably trends can be as significant as absolute numbers.

  2. This article completely ignores one feature which almost certainly does not apply in the UK: Voter Suppression.

    The (US) Voting Rights Act 1965 greatly strengthened the ability of minorities, principally African Americans, to gain access to the ballot box. Since the last US General Election (2012) the Supreme Court has greatly watered down these rights, at the instigation of Republican-controlled State administrations.

    For example, in Wisconsin, Clinton lost by some 30,000 votes. But about 300,000 potential voters were denied the right to vote.

    In North Carolina, Clinton lost by 200,000 votes and the numbers turned away were several times that.
    In Florida, Clinton lost by 120,000 and the numbers denied their votes were at least 1,500,000.

    Adding up just these three States and assuming that the predominant sympathies of those turned away were for Clinton, she would have won the entire election.

    Further, given that these three States alone account for some 2,000,000 US citizens turned away, it is quite possible that the entire apparent drop in Democratic turnout since 2012 is accounted for by those turned away.

    Can anyone point out where I am wrong? I hope someone can.
