YouGov’s daily polling for the Sun has topline figures of CON 40%, LAB 40%, LDEM 9%, Others 10%. It’s the first time YouGov have shown the Lib Dems dropping into single figures, and the first time any pollster has shown them at such a low level since 1997.
I always urge some caution when a poll shows a sharp movement. In recent weeks YouGov have generally been showing the Lib Dems at around 11%. It was mathematically inevitable that sampling error alone would eventually produce a 9% (or indeed a 13%, like the one we saw at the weekend). That said, the lowest score for 13 years is worthy of comment, and suggests the downwards trend in Lib Dem support is continuing.
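To put a rough number on that (assuming, purely for illustration, a simple random sample of around 1,500 respondents; real polls have design effects that widen this somewhat), the sampling error on a single party share can be sketched as:

```python
import math

def sampling_se(p, n):
    """Standard error of a sample proportion under simple random sampling."""
    return math.sqrt(p * (1 - p) / n)

se = sampling_se(0.11, 1500)   # roughly 0.008, i.e. about 0.8 points
z = (0.11 - 0.09) / se         # a 9% reading sits ~2.5 SEs below a true 11%
```

A two-point move is about two and a half standard errors: unlikely in any one poll, but over weeks of daily polling the occasional 9% or 13% from a true 11% is exactly what you should expect.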
Inevitably eyebrows will once again be raised over the sheer spread of Lib Dem scores from different pollsters. ICM was showing 18% for a long time, though they have more recently dropped to 16%; YouGov have tended to have the Lib Dems at around 11% lately; the other polling companies tend to have them around 14% or 15%. Even putting aside outliers, that’s a good five-point difference. What’s causing it? (If you aren’t interested in geeky polling methodology, you may want to look away now!)
Some of it is clearly identifiable. ICM, who show the highest Lib Dem score, reallocate people who say they don’t know how they will vote, based upon the assumption that 50% will end up voting for the party they say they voted for last time. Currently a lot of former Liberal Democrat voters say they don’t know how they’ll vote, and in ICM’s last two polls this reallocation added 2 points to the Lib Dem score (it would have been 14% without it). Populus also do this reallocation, but at a lower rate for Liberal Democrats. This difference is nothing to worry about: different pollsters are simply measuring slightly different things. YouGov reports how people say they would vote tomorrow; ICM and Populus report how people say they would vote tomorrow, plus an estimate of how those who didn’t say would vote.
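A minimal sketch of this kind of don’t-know reallocation (the 50% rate is ICM’s, as described above; the respondent counts are purely illustrative, not from any real poll):

```python
def reallocate(stated, dk_recall, rate=0.5):
    """Headline shares after adding back a fraction of don't-knows to the
    party they recall voting for last time (ICM-style reallocation).

    stated:    party -> respondents giving that voting intention
    dk_recall: party -> don't-knows who recall voting for that party in 2010
    rate:      fraction of don't-knows assumed to return to their old party
    """
    adjusted = {p: stated.get(p, 0) + rate * dk_recall.get(p, 0)
                for p in set(stated) | set(dk_recall)}
    total = sum(adjusted.values())
    return {p: round(100 * adjusted[p] / total, 1) for p in adjusted}

stated = {"CON": 380, "LAB": 380, "LD": 140, "OTH": 100}   # illustrative
dk_recall = {"CON": 10, "LAB": 10, "LD": 60}               # illustrative

print(reallocate(stated, {}))         # without reallocation: LD on 14.0
print(reallocate(stated, dk_recall))  # with it: LD on 16.3, a ~2-point gain
```

Because a disproportionate share of current don’t-knows recall a Lib Dem vote, the reallocation lifts the Lib Dem figure in just the way ICM’s published tables show.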
That’s part of the reason, but it leaves at least another 3 point difference to explain. One thing we can rule out is the weighting targets. YouGov weight by party ID rather than past vote so the published figures aren’t comparable, but I checked the sample for this poll and about 26% of respondents who say they voted in 2010 claim they voted Lib Dem (which is pretty typical), so YouGov actually have more people saying they voted Lib Dem in 2010 in their samples than some other pollsters do. For some reason YouGov’s 2010 Lib Dem voters seem to be less likely to be sticking with the Liberal Democrats now.
As to the reasons why, we can only speculate. One possibility is when the weighting data is collected. Companies using online panels can ask respondents how they voted at the time of the election and retain that information for weighting future surveys; people still might not report their vote accurately, but at least you can be certain that levels of false recall won’t change. Phone pollsters naturally have to ask people to recall their past vote during the survey itself. If the drop in support for the Liberal Democrats is being matched by a drop in the proportion of people telling pollsters they voted Lib Dem in 2010, then there is a risk that pollsters are failing to account for growing false recall and are weighting those respondents upwards too much (meaning samples would include an increasingly large proportion of the more loyal Lib Dems). Looking at the data from Ipsos-MORI, who ask about recalled 2010 vote but don’t weight by it, there is indeed a clear and rapid downwards trend in the proportion of people reporting that they voted Lib Dem in 2010, which suggests this is a real possibility.
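The mechanics of that risk can be sketched with a toy past-vote weighting step (a deliberately simplified one-variable version of what pollsters actually do, and the target and sample recall figures are invented for illustration, not any pollster’s real numbers):

```python
def past_vote_weights(sample_recall, target_recall):
    """Weight each recalled-2010-vote group so the weighted sample matches
    the target past-vote distribution (simplified single-variable weighting)."""
    return {p: round(target_recall[p] / sample_recall[p], 2)
            for p in target_recall}

# Suppose the target assumes 24% of voters should recall a 2010 Lib Dem
# vote, but growing false recall means only 18% of the sample now does:
weights = past_vote_weights(
    {"CON": 0.39, "LAB": 0.31, "LD": 0.18, "OTH": 0.12},  # sample recall
    {"CON": 0.37, "LAB": 0.30, "LD": 0.24, "OTH": 0.09},  # weighting target
)
print(weights["LD"])  # 1.33: remaining LD recallers are weighted upwards
```

If the people who still recall a Lib Dem vote are disproportionately the loyalists, upweighting them by a third overstates current Lib Dem support, which is exactly the false-recall risk described above.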
Another possible explanation for ICM’s high score is their question wording: they ask how people would vote in “your area”, which could make respondents think more about their local seat and local MP, and that might help the Lib Dems. Personally I doubt the wording difference is strong enough to account for it, but you never know (and this would only explain ICM, not the other companies). Or, of course, it could be something else to do with the samples. I wrote here about some of the reasons why the pollsters got the Liberal Democrats wrong at the 2010 election. One of these may have been samples that were too interested in politics and probably too well educated (YouGov has since changed its sampling to tackle the problem), so perhaps it’s something along those lines. Or perhaps it is a mode effect of some sort: answering online might, for some reason, make people more likely to say they aren’t voting Lib Dem, or there may be some sampling bias that either online panels or telephone polls are producing… or something completely different.
As to which one is right? Well, naturally I think YouGov are probably correct; if I didn’t, we’d change methods! Equally, I’m sure Martin Boon thinks ICM are doing things correctly, and so will ComRes, MORI and everyone else. No pollster uses a methodology they think is wrong. Something I once heard Martin himself say at a conference was that if you were confident in your methodology and it spat out surprising numbers, then once you’d checked everything you eventually just had to trust the numbers. It’s sound advice. It may well be that the big gap disappears with time (ICM and Populus’s way of calculating target weights takes account of long-term shifts in past-vote recall, and ICM’s election post-mortem said they would review whether they were weighting the Lib Dems too highly)… or it may persist, and we won’t know for certain until May 2015.
UPDATE: Also worth noting is the government approval rating – the net score was minus 10, the lowest the coalition have had so far.