FAQ: Dealing with don’t knows
In 1992 the British polls famously got it wrong. All the polls showed Labour ahead or the parties neck and neck. In fact the Conservatives had a solid lead. In the post-mortem that followed, one of the problems identified was the “spiral of silence” or “shy Tories”. In short, given the unpopularity of the government, some people were embarrassed to admit to a pollster that they would vote Conservative.
ICM pioneered an approach to dealing with this based on re-allocating don’t knows. When they looked at the people who said “don’t know” to voting questions in 1992, they found these people were disproportionately people who had voted Conservative in 1987. They also found a strong correlation between how don’t knows had voted at the previous election and how they ended up voting at the next election, as reported in follow-up polls. In short, ICM theorised that some of those don’t knows weren’t don’t knows at all: they were people who would actually vote Conservative, but didn’t want to admit it.
ICM’s solution is to take their don’t knows and reallocate them on the assumption that 50% of them will vote for the party they say they voted for at the previous election (in practice this is done by weighting them all down by 0.5 and reallocating them all). The proportion is based on their previous panel studies. It is important to note that ICM’s explanation for the adjustment – all the stuff about shyness – isn’t actually necessary for it to be correct. It would be just as justified to say that past studies have shown that don’t knows split in favour of the party they previously voted for, so a proportion of them are reallocated on that basis. Populus now also use a similar method, the only difference being that they re-allocate 50% of former Labour and Conservative voters, but only 30% of former Lib Dems and no former “others” at all. Again, this is based on their own callback surveys after the last election.
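To make the arithmetic concrete, here is a toy sketch of that reallocation. The party names and all the figures are made up for illustration – only the reallocation rates (50% across the board for ICM; 50%/50%/30%/0% for Populus) come from the description above:

```python
# Illustrative sketch of ICM/Populus-style don't-know reallocation.
# All counts below are hypothetical, not real poll data.

# Weighted counts from respondents who gave a voting intention
stated = {"Con": 320, "Lab": 360, "LD": 180, "Other": 40}

# Don't knows, broken down by the party they say they voted for last time
dont_knows = {"Con": 60, "Lab": 30, "LD": 20, "Other": 10}

# ICM: assume 50% of don't knows return to their previous party
ICM_RATES = {"Con": 0.5, "Lab": 0.5, "LD": 0.5, "Other": 0.5}
# Populus variant: 50% for Con/Lab, 30% for former Lib Dems, none for others
POPULUS_RATES = {"Con": 0.5, "Lab": 0.5, "LD": 0.3, "Other": 0.0}

def reallocate(stated, dont_knows, rates):
    """Add back a fraction of each party's past voters among the don't knows."""
    adjusted = dict(stated)
    for party, n in dont_knows.items():
        adjusted[party] += n * rates[party]
    return adjusted

def shares(counts):
    """Convert counts to percentage shares."""
    total = sum(counts.values())
    return {p: round(100 * n / total, 1) for p, n in counts.items()}

print(shares(stated))                                     # unadjusted
print(shares(reallocate(stated, dont_knows, ICM_RATES)))  # ICM-style adjustment
```

Note that the adjustment only moves the headline figures to the extent that the don’t knows’ past votes are distributed differently from the stated intentions.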
While ICM’s adjustment originally favoured the Conservative party, it is important to note that it changes over time depending on what sort of voter is saying “don’t know”. I do occasionally get queries about whether ICM’s adjustment in favour of the Tories should be looked at again now that Labour are the unpopular party. Well, I’m sure all the pollsters keep things under review, but ICM’s method should be self-correcting. In fact, while it favoured the Conservatives for many years, since Tony Blair’s second term it has tended to favour Labour and occasionally the Liberal Democrats.
Moving on from ICM and Populus, Ipsos MORI do not normally re-allocate their don’t knows (though it is worth noting that they did do so for their final pre-election poll in 2005). However, they do try to squeeze a voting intention out of respondents who don’t offer one. Anyone who says don’t know, or refuses to answer, is asked a “squeeze question”: “Which party are you most inclined to support?” These people are given equal weight to those who give a firm voting intention.
ComRes do something of a cross between MORI and ICM/Populus. They ask a squeeze question of don’t knows – in this case a harsher one than MORI’s, asking how people would vote if they were legally obliged to do so. People who still say they don’t know are reallocated according to which party – if any – they most identify with.
Finally there is YouGov, who simply ignore all don’t knows and include only those respondents who actually give a voting intention. The contrast with the others may not be quite as stark as this sounds – if ICM’s theory of the spiral of silence is true, and some don’t knows are the result of people being unwilling to admit a voting intention they think may be seen as unfashionable or socially unacceptable to a live interviewer, it is likely that they will be more willing to admit it to an inhuman computer screen. There is a strong suggestion of this mode effect in the levels of support for the BNP recorded by pollsters in the spring/summer of 2006, when YouGov detected a larger bump in support than polls conducted by live interviewers.
Before we leave how pollsters deal with don’t knows, it is worth noting that, almost by definition, re-allocating don’t knows or squeezing intentions out of them tends to favour the underdog. If a party has a big lead in the polls, it is likely that the vast majority of its voters from the previous election are still saying they would vote for it, so the “don’t knows” will disproportionately help the less popular party. This is certainly the case with ICM and Populus’s adjustments, which are made very explicit in the data they release. It may well also be part of the explanation as to why YouGov tend to show the largest leads when a party is at its peak – the Brown bounce in 2007 or the huge Conservative leads in Summer 2008: other pollsters have methods which, rightly or wrongly, dampen down big leads a bit.
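The underdog effect can be shown with a few lines of made-up arithmetic. Assume two parties, with the trailing party’s past voters dominating the don’t knows (as the text argues is typical when one party is well ahead); the numbers and the two-party simplification are mine, not any pollster’s:

```python
# Hypothetical illustration: ICM-style reallocation narrows a big lead
# when the don't knows skew toward the trailing party's past vote.

leader, trailer = 450.0, 350.0            # stated intentions (weighted counts)
dk_past_leader, dk_past_trailer = 30, 70  # don't knows, by party voted for last time

def lead_in_points(a, b):
    """Gap between the two parties as percentage points of the combined total."""
    return 100 * (a - b) / (a + b)

before = lead_in_points(leader, trailer)  # 12.5-point lead

# Add back 50% of each group of don't knows to their previous party
leader += 0.5 * dk_past_leader    # +15
trailer += 0.5 * dk_past_trailer  # +35

after = lead_in_points(leader, trailer)   # roughly a 9.4-point lead

print(round(before, 1), round(after, 1))
```

With these numbers the adjustment trims the lead from 12.5 points to about 9.4, without anyone’s stated intention being changed.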