The monthly ComRes online poll for the Indy on Sunday and Sunday Mirror is out today and has topline figures of CON 40%(-2), LAB 29%(+2), LDEM 7%(nc), UKIP 16%(+1), GRN 3%(nc). The changes since last month are likely just a reversion to the mean – the previous poll was the one giving the Conservatives a fifteen point lead that got lots of people rather overexcited. Full tabs are here.

If that was last month’s Twitter overexcitement, this month everyone is getting too excited about the Scottish crossbreak in the ComRes poll which has the Conservative party ahead of Labour. Don’t read anything into this: regional crossbreaks in voting intention polls are best ignored. With a sample size of 156 people the margin of error of each figure is about 8 points. On top of that GB polls are generally sampled and weighted to GB targets, so the figures within crossbreaks may be a bit off.
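That "about 8 points" figure can be checked with a quick back-of-the-envelope calculation. A minimal sketch, assuming simple random sampling at the 95% confidence level (real quota samples aren't simple random samples, so this is the optimistic case):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p with sample size n,
    assuming simple random sampling. p=0.5 gives the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

# Scottish crossbreak of roughly 156 respondents
moe = margin_of_error(156)
print(round(moe * 100, 1))  # roughly 7.8 points
```

And that is before accounting for design effects from weighting, which push the effective margin wider still.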

There aren’t many recent polls of Westminster voting intention in Scotland – at the current stage in the electoral cycle most Scottish polls are asking exclusively about the Holyrood elections. Looking at those figures, some companies show Labour and the Conservatives quite close in support (the latest MORI and YouGov polls had them within two points of each other), while others (like TNS) show a bigger gap. There haven’t been any proper Scottish polls showing the Conservatives ahead of Labour in Holyrood intentions.

Of course, it’s possible that the Conservatives are doing better than Labour in Westminster voting intentions in Scotland – we haven’t had a recent poll – but I expect this one is just a case of random noise from small sample sizes. Looking at other recent GB polls YouGov had Labour ahead of the Conservatives in Scotland, so did ICM. MORI had the Conservatives ahead. I expect the true position is that they have quite similar support, meaning that in small crossbreaks random chance will spit out some with more Tories than Labour, some with more Labour than Tories.


What Went Wrong

Today YouGov have put out their diagnosis of what went wrong at the election – the paper is summarised here and the full report, co-authored by Doug Rivers and myself, can be downloaded here. As is almost inevitable with investigations like this there were lots of small issues that couldn’t be entirely ruled out, but our conclusions focus upon two issues: the age balance of the voters in the sample and the level of political interest of people in the sample. The two issues are related – the level of political interest in the people interviewed contributed to the likely voters in the sample being too young. There were also too few over seventies in the sample because YouGov’s top age band was 60+ (meaning there were too many people aged 60-70 and too few aged over 70).

I’m not going to go through the whole report here, but concentrate upon what I think is the main issue – the problem of how politically interested people who respond to polls are, and how that impacts on the age of people in samples. In my view it’s the core issue that caused the problems in May. It’s also the issue that is more likely to have impacted on the whole industry (different pollsters already have different age brackets) and the issue that is more challenging to solve (adjusting the top age bracket is easily done). It’s also rather more complicated to explain!

People who take part in opinion polls are more interested in politics than the average person. As far as we can tell that applies to online and telephone polls and as response rates have plummeted (the response rate for phone polls is somewhere around 5%) that’s become ever more of an issue. It has not necessarily been regarded as a huge issue though – in polls about the attention people pay to political events we have caveated it, but it has not previously prevented polls being accurate in measuring voting intention.

The reason it had an impact in May is that the skew towards the politically interested had a disproportionate effect on different social groups. Young people in particular are tricky to get to take part in polls, and the young people who have taken part in polls have been the politically interested. This, in turn, has skewed the demographic make up of likely voters in polling samples.

If the politically disengaged people within a social group (like an age band, or social class) are missing from a polling sample then the more politically engaged people within that same social group are weighted up to replace them. This disrupts the balance within that group – you have the right number of under twenty-fives, but you have too many politically engaged ones, and not enough with no interest. Where once polls showed a clear turnout gap between young and old, this gap has shrunk… it’s less clear whether it has shrunk in reality.

To give a concrete example from YouGov’s report, people who are under the age of twenty-five make up about 12% of the population, but they are less likely than older people to vote. Looking at the face-to-face BES survey, 12% of the sample would have been made up of under twenty-fives, but only 9.1% of those people who actually cast a vote were under twenty-five. Compare this to the YouGov sample – once again, 12% of the sample would have been under twenty-five, but they were more interested in politics, so 10.7% of YouGov respondents who actually cast a vote were under twenty-five.
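The arithmetic behind those figures is just differential turnout applied to a fixed population share. A sketch with illustrative turnout rates – the rates below are my own assumptions chosen so the outputs reproduce the report’s percentages, they are not figures from the report:

```python
def under25_share_of_voters(pop_share, turnout_young, turnout_older):
    """Share of actual voters who are under 25, given the under-25
    population share and the turnout rates of each group."""
    young = pop_share * turnout_young
    older = (1 - pop_share) * turnout_older
    return young / (young + older)

# Illustrative turnout rates (assumptions, not from the report):
# engaged panellists vote at a higher rate, inflating the young share.
bes = under25_share_of_voters(0.12, 0.514, 0.70)     # ~9.1% of voters
yougov = under25_share_of_voters(0.12, 0.615, 0.70)  # ~10.7% of voters
print(round(bes * 100, 1), round(yougov * 100, 1))
```

The point is that both samples can have exactly 12% under twenty-fives and still produce different electorates, because what matters is how likely those particular under twenty-fives are to vote.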

Political interest had other impacts too – people who paid a lot of attention to politics behaved differently to those who paid little attention. For example, during the last Parliament one of the givens was that former Liberal Democrat voters were splitting heavily in favour of Labour. Breaking down 2010 Liberal Democrat voters by how much attention they pay to politics though shows a fascinating split: 2010 Lib Dem voters who paid a lot of attention to politics were more likely to switch to Labour; people who voted Lib Dem in 2010 but who paid little attention to politics were more likely to split to the Conservatives. If polling samples had people who were too politically engaged, then we’d have too many LD=>Lab people and too few LD=>Con people.

So, how do we put this right? We’ll go into the details of YouGov’s specific changes in due course (they will largely be the inclusion of political interest as a target and updating the age bands, but as ever, we’ll test them top to bottom before actually rolling them out on published surveys). However, I wanted here to talk about the two broad approaches I can see going forward for the wider industry.

Imagine two possible ways of doing a voting intention poll:

  • Approach 1 – You get a representative sample of the whole adult population, weight it to the demographics of the whole adult population, then filter out those people who will actually vote, and ask them who they’ll vote for.
  • Approach 2 – You get a representative sample of the sort of people who are likely to vote, weight it to the demographics of people who are likely to vote, and ask them who they’ll vote for.

Either of these methods would, in theory, work perfectly. The problem is that pollsters haven’t really been doing either of them. Lots of people who don’t vote don’t take part in polls either, so actually pollsters end up with samples of the sort of people who are likely to vote, but then weight them to the demographics of all adults. This means the final samples of voters over-represent groups with low turnouts.
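A toy illustration of that mismatch, with all the numbers invented for the purpose: suppose under twenty-fives are 12% of adults with a true turnout of 51%, but the under twenty-fives who actually join panels are the engaged ones, who vote at something like 85%. Weighting to all-adult demographics restores the right 12% headcount, but too many of them then pass the turnout filter:

```python
# Invented numbers for illustration only
pop_share_young = 0.12        # under-25 share of all adults
turnout_young_all = 0.51      # true under-25 turnout
turnout_young_engaged = 0.85  # turnout among engaged under-25 panellists
turnout_older = 0.70          # turnout among everyone else

# Under-25 share of *voters* in the weighted panel sample:
voters_young = pop_share_young * turnout_young_engaged
voters_older = (1 - pop_share_young) * turnout_older
sample_share = voters_young / (voters_young + voters_older)

# Under-25 share of voters in the real electorate:
true_young = pop_share_young * turnout_young_all
true_older = (1 - pop_share_young) * turnout_older
true_share = true_young / (true_young + true_older)

print(round(sample_share * 100, 1), round(true_share * 100, 1))
```

With these made-up figures the sample of voters ends up roughly 14% under twenty-five against a true 9% – the headline demographics look fine, yet the electorate is skewed young.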

Both methods present real problems. May 2015 illustrated the problems pollsters face in getting the sort of people who don’t vote in their samples. However, approach two faces an equally challenging problem – we don’t know the demographics of the people who are likely to vote. The British exit poll doesn’t ask demographics, so we don’t have that to go on, and even if we base our targets on who voted last time, what if the type of people who vote changes? While British pollsters have always taken the first approach, many US pollsters have taken a route closer to approach two and have on occasion come unstuck on that point – assuming an electorate that is too white, or too old (or vice-versa).

The period following the polling debacle of 1992 was a period of innovation. Lots of polling companies took lots of different approaches and, ultimately, learnt from one another. I hope there will be a similar period now – to follow John McDonnell’s recent fashion of quoting Chairman Mao, we should let a hundred flowers bloom.

From the point of view of an online pollster using a panel, the ideal way forward for us seems to be to tackle samples not having enough “non-political” people. We have a lot of control over who we recruit to samples so can tackle it at source: we record how interested in politics our panellists say they are, and add it to sampling quotas and weights. We’ll also pay more attention to recruiting people with little interest in politics. We should probably look at turnout models too – we mustn’t get lots of people who are unlikely to vote in our samples and then assume they will vote!

For telephone polling there will be different challenges (assuming, of course, that they diagnose similar causes – they may find the cause of their error was something completely different). Telephone polls struggle enough as it is to fill quotas without also trying to target people who are uninterested in politics. Perhaps the solution there may end up being along the second route – recasting quotas and weights to aim at a representative sample of likely voters. While they haven’t explicitly gone down that route, ComRes’s new turnout model seems to me to be in that spirit – using past election results to create a socio-economic model of the sort of people who actually vote, and then weighting their voting intention figures along those lines.

Personally I’m confident we’ve got the cause of the error pinned down, now we have to tackle getting it right.

Oldham by-election

One day I’m going to write a generic by-election post labelled (insert constituency name here) that I can repost after every by-election. Until that day, here’s my traditional answer to what last night’s by-election tells us about the national political picture: not much.

By-elections are extremely strange beasts. They take place in a single constituency that may be completely untypical of the country as a whole, they normally have no impact at all upon who will be running the country the next day, and they have far greater campaigning intensity than any other election. After every by-election I post the same conclusion – if they show much the same as the national polls suggest they tell us nothing new; if they show something different it’s probably to do with the unique and different circumstances of the by-election. In this case the opposition party has held onto a safe seat. This is exactly what we should expect unless they are tanking in the national polls, and Labour aren’t: despite Corbyn’s poor ratings and the constant news stories of Labour infighting their level of support is still pootling along at around their general election share. There is no reason to expect UKIP surges either – in the last Parliament UKIP had soared from 3% to the mid-teens, so almost every by-election saw them surging, but now we are comparing their support to what they got in the 2015 general election, after their breakthrough. This is a good local result for Labour, but doesn’t tell us much new.

That’s not to say it’s not important. By-elections have a significant effect on the political narrative and in that sense this is a very good result for Labour (or, depending on your point of view, for Jeremy Corbyn). If this by-election had gone differently it would have been part of a different narrative – it would have been all about Labour in crisis, their traditional working class support fracturing to UKIP. It would still have been over-interpreting a by-election, but it would almost certainly have happened, and it’s been avoided. In that sense, it’s an important victory.

A final note about the polling – there wasn’t any (I don’t know whether to be amused or depressed by the handful of comments I’ve seen about it being another polling failure. Nothing to do with us mate!). By-election polling used to be very rare, then in the last Parliament we were suddenly spoilt, with Survation and Ashcroft polls for most by-elections. This time we are back to having no real evidence to go on, to relying on what commentators have been told by the campaigns, what it “feels like” on the doorstep and in vox pops and all that sort of nonsense. I suspect the collective commentariat have got carried away with what would have made an interesting narrative to report, rather than dull old “safe seat held”. It’s a reminder that without any proper polling, by-elections can be pretty hard things to call.


YouGov and the Times have some fresh Syria polling tonight, conducted on Monday evening and during the day on Tuesday. It shows a sharp drop in support for airstrikes since YouGov’s polling a week ago, but the overall balance of opinion is still in favour: 48% now support RAF airstrikes against ISIS in Syria, 31% are opposed. A week ago the figures were 59% to 20%.

Some of this may be the fading impact of the Paris attacks, some people recoiling from the reality of intervention. I suspect a lot is also partisan polarisation: there is little movement amongst Conservative voters, but there is a huge turnaround amongst Labour voters. Whereas a week ago 2015 Labour voters broke in favour of airstrikes by 52% to 26%, they have now turned against. Among 2015 Labour voters 42% are now opposed (up 16 points), only 35% now support (down 17). While Jeremy Corbyn’s stance is still at odds with wider public opinion, now both Labour voters and Labour members agree with him: it is his opponents within the PLP who are at odds with the rest of the Labour family.

But if public opinion is moving against intervention, there’s not a sign of it helping Jeremy Corbyn with the wider public, or hurting Conservative support. Corbyn’s own ratings are down – 24% of people now think he is doing well as leader, down from 30% last week; 65% think he is doing badly. Voting intention figures are CON 41%, LAB 30%, LDEM 6%, UKIP 16%.

Peter Kellner’s commentary for the Times is up here.