The Evening Standard have published a new BMG poll of the Richmond Park by-election, suggesting a significantly less exciting race than some people thought (and than the Lib Dems hoped). Topline voting intention figures are:

GOLDSMITH (Ind) 56% (down 2 from the Con share in 2015)
OLNEY (Lib Dem) 29% (up 10 from the LD share in 2015)
LABOUR 10% (-2)
OTHER 5% (-5)

While there is a month to go, this suggests that Goldsmith should hold the seat relatively easily. The idea that, with both main candidates opposing Heathrow expansion, it could become a by-election about Brexit in a pro-EU seat doesn’t really seem to be working out at present. 25% of voters say that Brexit will be the most important issue in deciding their vote, but they are mostly voting Lib Dem and Labour already. Goldsmith’s voters say their most important considerations are Goldsmith’s own record and views, followed by Heathrow opposition.

BMG also asked people how they would have voted if the Conservatives had put up an official Conservative candidate against Goldsmith. Topline figures would have been GOLDSMITH 34%, LIB DEM 25%, CONSERVATIVE 20% – so the race would have been far more competitive, but with the Tories trailing in third place. It was an unusual decision not to stand, but the polling suggests it was the right one for the Tories (or at least, neither option would have produced a Tory MP, but the Conservatives presumably prefer Goldsmith winning to a Lib Dem). Full details are here.

Donald Trump has been citing Brexit as the model of how he could win the election despite expectations, and his surrogates have suggested there might be a shy Trump vote, like Brexit. So what lessons, if any, can we learn about the US election from recent polling experience in Britain?

In 2015 the British polls got the general election wrong. Every company had Labour and Conservative pretty much neck-and-neck, when in reality the Conservatives won by seven points. In contrast, the opinion polls as a whole were not wrong on Brexit, or at least, they were not all that wrong. Throughout the referendum campaign polls conducted by telephone generally showed Remain ahead, but polls conducted online generally showed a very tight race. Most of the online polls towards the end of the campaign showed Leave ahead, and polls by TNS and Opinium showed Leave ahead in their final eve-of-referendum polls.

That’s the first point where the parallel falls down – Brexit wasn’t a surprise because the polls were wrong. The polls were showing a race that was neck-and-neck. It was a surprise because people hadn’t believed or paid attention to that polling evidence. The media expected Remain to win, took polls showing Remain ahead more seriously, and a false narrative built up that the telephone polls were more accurately reflecting the race when, in the event, those online polls showing Leave ahead were right. This is not the case in the US – the media don’t think Trump will lose because they are downplaying inconvenient polling evidence; they think Trump will lose because the polling evidence consistently shows that.

In the 2015 general election however the British polls really were wrong, and while some of the polls got Brexit right, some did indeed show solid Leave victories. Do either of those have any relevance for Trump?

The first claim is the case of shy voters. Much as 1948 is the famous example of polling failure in the US, in this country 1992 was the famous mistake, and it was put down to “Shy Tories” – that is, people who intended to vote Conservative but were unwilling to admit it to pollsters. Shy voters are extremely difficult to diagnose. If people lie to pollsters about how they’ll vote before the election but tell the truth afterwards, then it is impossible to distinguish “shy voters” from people changing their minds (in recent British polls this does not appear to be the case – in both the 2015 election and the 2016 EU referendum, recontact surveys found no significant movement towards the Conservatives or towards Leave). Alternatively, if people are consistent in lying to pollsters about their intentions beforehand and lying about how they voted afterwards, it’s impossible to catch them out.

The one indirect way of diagnosing shy voters is to compare the answers given to surveys using live interviewers, and surveys conducted online (or in the US, using robocalls – something that isn’t regularly done in the UK). If people are reluctant to admit to voting a certain way, they should be less embarrassed when it isn’t an actual human being doing the interviewing. In the UK the inquiry used this approach to rule out “shy Tories” as a cause of the 2015 polling error (online polls did not have a higher level of Tory support than phone polls).
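As a toy illustration of that mode-comparison check (all the poll figures below are invented, not real polling data), the test amounts to comparing the same party’s vote share across interview modes:

```python
# Toy check for a mode effect: compare a party's share in live-interviewer
# polls against online polls. All figures here are invented for illustration.
phone_polls = [0.33, 0.34, 0.32, 0.33]    # Con share in live-interviewer polls
online_polls = [0.33, 0.32, 0.34, 0.33]   # Con share in online polls

phone_mean = sum(phone_polls) / len(phone_polls)
online_mean = sum(online_polls) / len(online_polls)

# A "shy Tory" effect would show up as online polls running consistently
# ahead of phone polls; here the two modes agree, which is the pattern
# the 2015 inquiry found when it ruled out shy voters as the cause.
gap = online_mean - phone_mean
print(f"Online minus phone Con share: {gap:+.3f}")
```

A real analysis would of course pool many polls per mode and allow for sampling variation in each, but the logic is just this comparison of means.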

In the US election there does appear to be some prima facie evidence of “Shy Trumpers”* – online polls and robopolls have tended to produce better figures for Donald Trump than polls conducted by a human interviewer. However, when this same difference was evident during the primary season the polls without a live interviewer were not consistently more accurate (and besides, even polls conducted without a human interviewer still have Clinton reliably ahead).

The more interesting issue is sample error. It is wrong to read directly across from Brexit to Trump – while there are superficial similarities, these are different countries, very different sorts of elections, in different party systems and traditions. There will be many different drivers of support. To my mind the interesting similarity though is the demographics – the type of people who vote for Trump and voted for Brexit.

Going back to the British general election of 2015, the inquiry afterwards identified sampling error as the cause of the polling error: the sort of people who were able to be contacted by phone and agreed to take part, and the sort of people who joined online panels were unrepresentative in a way that weights and quotas were not then correcting. While the inquiry didn’t specify how the samples were wrong, my own view (and one that is shared by some other pollsters) is that the root cause was that polling samples were too engaged, too political, too educated. We disproportionately got politically-aware graduates, the sort of people who follow politics in the media and understand what is going on. We don’t get enough of the poorly educated who pay little attention to politics. Since then several British companies have adopted extra weights and quotas by education level and level of interest in politics.
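To illustrate what an extra weight by education does (the sample shares, population shares and vote splits below are all invented for the example, not the inquiry’s figures), a minimal cell-weighting sketch might look like this:

```python
# Hypothetical illustration of cell weighting by education level.
# All shares and vote splits below are invented for the example.
sample = {"graduate": 0.55, "non_graduate": 0.45}      # shares in the raw poll sample
population = {"graduate": 0.35, "non_graduate": 0.65}  # shares in the target population

# Each respondent's weight is their group's population share divided by
# its share of the sample, so over-represented groups are down-weighted.
weights = {group: population[group] / sample[group] for group in sample}

# Suppose (hypothetically) graduates split 40% Leave and non-graduates 64% Leave.
leave_share = {"graduate": 0.40, "non_graduate": 0.64}

unweighted = sum(sample[g] * leave_share[g] for g in sample)
weighted = sum(sample[g] * weights[g] * leave_share[g] for g in sample)

print(f"Unweighted Leave estimate: {unweighted:.1%}")
print(f"Weighted Leave estimate:   {weighted:.1%}")
```

With a sample that over-represents graduates, the unweighted estimate understates Leave; weighting the cells back to their population shares corrects it. The harder problem the inquiry pointed to is that the people *within* each cell who agree to take part may still differ from those who don’t.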

The relevance for Brexit polling is that there was a strong correlation between educational qualification and how people voted. Even within age cohorts, graduates were more likely to vote to Remain, people with few or no educational qualifications were more likely to vote to Leave. People with a low level of interest in politics were also more likely to vote to Leave. These continuing sampling issues may well have contributed to the errors of those pollsters who got it wrong in June.

One thing that Brexit does have in common with Trump is those demographics. Trump’s support is much greater among those without a college degree. I suspect if you asked you’d find it was greater among those people who don’t normally pay much attention to politics. In the UK those are groups who we’ve had difficulty in properly representing in polling samples – if US pollsters have similar issues, then there is a potential source for error. College degree seems to be a relatively standard demographic in US polling, so I assume it is already being corrected for. How much interest people have in politics is more nebulous, and less easy to measure or control.

In Britain the root cause of polling mishaps in 2015 (and for some, but not all, companies in 2016) seems to be that the declining pool of people still willing to take part in polls under-represented certain groups, and that those groups were less likely to vote for Labour, more likely to vote for Brexit. If (and it’s a huge if – I am only reporting the British experience, not passing judgement on American polls) the sort of people who American pollsters struggle to reach in these days of declining response rates are more likely to vote for Trump, then they may experience similar problems.

Those thinking that the sort of error that affected British polls could happen in the US are indeed correct… but could happen is not the same as is happening. Saying something is possible is a long way from there being any evidence that it actually is happening. “Some of the British polls got Brexit wrong, and Trump is a little bit Brexity, therefore the polls are wrong” really doesn’t hold water.


*This has no place in a sensible article about polling methodology, but I feel I should point out to US readers that in British schoolboy slang when I was a kid – and possibly still today – to Trump is to fart. “Shy Trump” sounds like it should refer to surreptitiously breaking wind and denying it.


Ipsos MORI have published their monthly political monitor and it shows another towering lead for the Conservatives. Topline voting intentions are CON 47%(+7), LAB 29%(-5), LDEM 7%(+1), UKIP 6%(-3). The eighteen point Conservative lead is the highest they’ve managed in any poll since 2009, and the highest lead for a party in government since 2002. Usual caveats apply about any poll showing such a large shift in support over a month, but in terms of direction this does echo the ICM and YouGov polls earlier this month that also showed shifts towards the Conservatives. Full details are here.

A quick word about that UKIP score of just 6%. While it is obviously very bad, it’s not the sudden collapse one might assume. For whatever methodological reason, MORI do tend to show significantly worse scores for UKIP than polls from other companies. It is NOT a case of UKIP support being at 11% with ICM and YouGov last week, their MEPs getting into a fist fight and their support collapsing (however tempting such a narrative is!). MORI have been showing them at significantly lower levels of support for several months anyway – 9% last month, 6% in August, 8% in July. Nevertheless, it does appear as if the Tories are beginning to claw back support they’d previously lost to UKIP.

A referendum is not like an election. While the two sides of the campaign produced lots of literature supporting their view, there wasn’t anything like a manifesto as such. How could there be, given those leading the campaigns were not those who would end up actually implementing the decision? As far as the referendum was concerned, Brexit did indeed just mean Brexit – no more and no less. Nothing on the ballot paper said it was specifically this sort of Brexit or that sort of Brexit.

This has left a certain void, and one that politicians and others have sought to fill. Naturally, they have largely attempted to do so with their pre-existing prejudices rather than evidence. To listen to some it would appear that Brexit was driven by people who wanted {insert policy idea that I wanted to begin with}. A lot of this has been about how important an issue immigration was to Leave voters, and to what extent this was an anti-immigration vote. The alternative argument is often that the vote was mostly driven by concerns about sovereignty and freedom.

At the simplest level, if you want to know why people voted for something… ask them.

In YouGov’s final poll they asked people to pick which one factor was most important to people in deciding how to vote. Among Leave voters the most popular answer was allowing Britain to act independently (45%), followed by immigration (35%) and the economy (8%). Full tabs are here.

Lord Ashcroft’s poll after the referendum asked Leave voters to rank four possible reasons for the vote – sovereignty, immigration, the economy or the risk of future EU integration. 49% of Leave voters picked sovereignty as their first reason (78% as either their first or second answer), 33% of Leave voters picked immigration as their first reason (64% as either their first or second reason). These two issues dominate, but the structure of the question meant people couldn’t say “I didn’t care about this issue at all”, so it’s somewhat limited (tabs are here, the relevant questions are on page 256!)

In both of these examples sovereignty came top, followed by immigration. However, it’s possible that this was down to the particular options the pollster offered or the particular wording used in the question. One way of getting round this issue is to ask it as an open-ended question and allow people to say in their own words why they voted as they did – two other polls did this.

In Ipsos-MORI’s final poll they asked what issues would be important to people in deciding how to vote in the referendum, letting people pick more than one option. The interviewer then picked which category or categories matched their answer most closely. In this case immigration came top among Leave voters, picked by 54% (18% also said the cost of immigration on welfare and 12% said the number of refugees coming to Britain – though given people could choose more than one option these cannot be added together). The next highest option among Leave voters was 32% who said the ability of Britain to pass our own laws, followed by 19% who said the economy and 9% who said jobs.

The pre-election wave of the British Election Study did a similar thing, asking respondents to type in what the most important factor driving their vote was and coding it up later. Taking a word cloud of the responses gives one extremely prominent answer…


…but this is actually a little misleading. Once the answers are coded up individually sovereignty comes very narrowly ahead of immigration. Just over 30% of verbatim responses from Leave voters mentioned sovereignty or control in some way, just under 30% mentioned immigration in some way (the word cloud appears as it does because most people who mentioned immigration used the specific word immigration, but people who mentioned sovereignty used a variety of different terms like sovereignty, control, making laws and so on). Suffice to say, immigration and sovereignty were, between them, the main two issues driving the Leave vote.
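A minimal sketch of the kind of keyword coding involved (the keyword lists and example responses below are invented; the actual BES coding scheme was far more careful), showing how answers phrased in many different words can end up in the same category:

```python
# Toy keyword coder for open-ended "why did you vote Leave?" responses.
# Keyword lists and example responses are invented for illustration.
CATEGORIES = {
    "sovereignty": ["sovereignty", "control", "laws", "independence"],
    "immigration": ["immigration", "immigrants", "borders"],
}

def code_response(text: str) -> set[str]:
    """Return every category whose keywords appear in the response."""
    lowered = text.lower()
    return {cat for cat, keys in CATEGORIES.items()
            if any(key in lowered for key in keys)}

responses = [
    "immigration",
    "taking back control of our laws",
    "sovereignty and immigration",
    "the economy",
]

counts = {cat: 0 for cat in CATEGORIES}
for r in responses:
    for cat in code_response(r):
        counts[cat] += 1

print(counts)  # note one response can fall into several categories
```

This also illustrates why the word cloud is misleading: “sovereignty” answers are spread across many different surface words (control, laws, independence…), so no single word dominates the cloud, while “immigration” answers mostly use that one word.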

Referendums and elections are complicated things, and the human beings who vote in them are even more so. Anyone who tries to boil down the referendum to one factor and say “this explains it all” will almost always be wrong. While the order of the two issues differs between polls, all the polling evidence is clear that Leave voters were most concerned about the issues of sovereignty and immigration, and anyone claiming they were motivated by one but not the other is very likely projecting their own views onto the voters.

While they were clearly the dominant issues, there are undoubtedly others too – for example, as John Curtice explores here, there’s a very strong correlation with views on the impact of Brexit on the economy too. So while immigration and sovereignty were strong factors in favour of leaving, another important factor seems to be that most Leave voters did NOT think that Brexit would bring economic damage. I should also give my usual reminder that people are not necessarily very good judges of what makes them vote. We are not particularly rational creatures, and the way people vote at referendums and elections is not a dry comparison of policy offers or facts, but often a mixture of vague feelings, biases and heuristics – so things like a lack of trust in the traditional media and “experts”, and a perception that the Remain campaign spoke for an out-of-touch establishment rather than ordinary people, were probably also factors driving the Leave vote.

In short – the factors motivating Leave voters are many and varied and 52% of the voters will, by definition, contain people with many, many different views and priorities. However, every effort to ask Leave voters why they voted to leave found sovereignty AND immigration as the clear big issues.

The Times this morning has the latest YouGov voting intention figures – CON 42%(+3), LAB 28%(-2), LDEM 9%(+1), UKIP 11%(-2). While the size of the lead isn’t quite as large as the seventeen points ICM showed earlier in the week, it’s another very solid lead for the Conservatives following their party conference, matching the lead May had at the height of her honeymoon. Full tabs are here.

While I’m here I’ll add a quick update on two other recent YouGov polls. First some new London polling, which shows extremely positive ratings for Sadiq Khan. 58% of people think he is doing well as London mayor, only 14% think he is doing badly. Mayors of London seem to get pretty good approval ratings most of the time (both Ken Livingstone and Boris Johnson normally enjoyed positive ratings); I don’t know if that’s down to the skills of the individual politicians who have held the job so far or whether the public judge them by different standards to Westminster politicians. Nevertheless, it’s a very positive start for Khan, with net positive approval ratings among supporters of all parties except UKIP. Full tabs are here.

Finally, since the subject keeps popping up, some polling on the Royal Yacht. The public oppose replacing the Royal Yacht with a newly commissioned vessel by 51% to 25%. They would also oppose recommissioning the old Royal Yacht Britannia, but by a smaller margin (42% opposed, 31% support). The argument that the cost of the Yacht would be justified by its role in promoting British trade and interests overseas does not find favour with the general public – 26% think the cost can be justified, 57% think it cannot. Full results are here.