Or at least, questions that should be treated with a large dose of scepticism. My heart always sinks when I see questions like those below in polls. I try not to put up posts here just saying “this poll is stupid, ignore it”, but sometimes it’s hard. As ever, I’d much rather give people the tools to do it themselves, so here are some types of question that do crop up in polls that you really should handle with care.

I have deliberately not linked to any examples of these in polls, though they are sadly all too easy to find. This is not an attempt to criticise any particular polling companies or polls – let he who is without sin cast the first stone. You can find rather shonky questions from all companies, and I’m sure I’ve written surveys myself that committed all of the sins below. Note that these are not necessarily misleading or biased questions; there is nothing unethical or wrong with asking them. They are just a bit rubbish, and often lead to simplistic or misleading interpretations – particularly when they are asked in isolation, rather than as part of a longer poll that properly explores the issues. It’s the difference between a question that you’d downright refuse to run if a client asked for it, and a question that you’d advise a client could probably be asked much better. The point of this article (while I hope it will encourage clients not to ask rubbish questions) is to warn you, the reader of research, when a polling question really should be read with caution.

Is the government doing enough of a good thing?

A tricky question. This is obviously trying to gauge a very legitimate and real opinion – the public do often feel that the government hasn’t done enough to sort out a problem or issue. The question works perfectly well when there are two sides to the issue and it is intrinsically one of balance, where you can ask if people think the government has not done enough, gone too far, or got the balance about right. The problem comes when the aim is not contentious, and it really is a question of doing enough – stopping tax evasion, or cutting crime, for example. Very few people are going to think that a government has done too much to tackle tax evasion, or pushed crime down too low (“Won’t someone think of the poor tax evaders?”, “A donation of just £2 could buy Fingers McStab a new cosh”), so the question is set up from the beginning to fail. The problems can be alleviated a bit with wording like “Should be doing more” vs “Has done all they reasonably can be expected to do”, but even then you should treat questions like this with some caution.

Would you like the government to give you a pony?*
(*Hat tip to Hopi Sen)

Doesn’t everyone like nice things? There is nothing particularly wrong with questions like this where the issue in question is controversial and something that people might disagree with. It’s perfectly reasonable to ask whether the government should be spending money on introducing high speed rail, or shooting badgers, or whatever. The difficulty comes when you are asking about something that is generally seen as a universal good – what are essentially valence issues. Should the government spend extra on cutting crime, or educating children, or helping save puppies from drowning? Questions like this are meaningless unless the downside is there as well – “would you like the government to buy you a pony if it meant higher taxes?”, “would you like the government to buy you a pony, or could the money be better spent elsewhere?”

How important is worthy thing? Or How concerned are you about nasty thing?

Asking if people support or oppose policies, parties or developments is generally pretty straightforward. Measuring the salience of issues is much harder, because you run into problems of social desirability bias and of taking issues out of context. For example, in practical terms most people don’t actually do much about third world poverty, but ask them if they care about children starving to death in Africa and you’d need a heart of stone to say that you really don’t. The same applies to anything that sounds very important and worthy: people don’t want to sound ignorant and uninterested by saying they don’t really care much. If you ask whether people care about an issue, are concerned about it or think it is important, they will invariably say they do care, they are concerned and it is important. The rather more pertinent question, which is not always asked, is whether it is important when considered alongside all the other important issues of the day. The best way of measuring how important people think an issue is will always be to give them a list of issues and ask them to pick out those they consider most important (or better still, just give them an empty text box and ask them what issues are important to them).

Will policy X make you more likely to vote for this party?

This is a very common question structure, and probably one I’ve written more rude things about on this site than any other type of question. There are contexts where it can work, so long as it is carefully interpreted and is asked about a factor that is widely acknowledged to be a major driver of voting intention. For example, it’s sometimes used to ask if people would be more or less likely to vote for a party if a different named politician was leader (though questions like that have their own issues).

It becomes much more problematic when it is used as a way of showing an issue is salient. Specifically, it is often used by campaigning groups to try and make out that whatever issue they are campaigning about will have an important impact on votes (and that MPs should therefore take it seriously or their jobs may be at risk). This is almost always untrue. Voting intention is overwhelmingly driven by broad-brush themes: party identification, perceptions of the party leaders, and perceived competence on the big issues like the economy, health or immigration. It is generally NOT driven by specific policies on low salience issues.
However, if you ask people directly whether a specific policy on a low salience issue would make them more or less likely to vote for a party, they will normally claim it would. This is for a number of reasons. One is that you are taking that single issue and giving it false prominence, when in reality it would be overshadowed by big issues like the economy, health or education. The second is that people tend to use the question simply as a way of signalling whether they like a policy or not, regardless of whether it would actually change their vote. The third is that it takes little account of current voting behaviour – you’ll often find that the people saying a policy makes them more likely to vote Tory are made up of people who would vote Tory anyway, while the people saying it makes them less likely to vote Tory wouldn’t vote Tory if hell froze over.

There are ways to try and get round this problem – in the past YouGov used to offer “Makes no difference – I’d vote party X anyway” and “Makes no difference – I wouldn’t vote for party X” to filter out all those committed voters whose opinion wouldn’t actually change. In some of Lord Ashcroft’s polling he has given people the options of saying they support a policy and it might change their vote, or that they support it but it wouldn’t change their vote. The best way I’ve come up with of doing it is to give people a long list of issues that might influence their vote, get them to tick the top three or four, and only then ask whether the issue would make them more or less likely to vote for a party (as we did here for gay marriage, for example). This tends to show that many issues have little or no effect on voting intention, which is rather the point.

Should the government stop and think before going ahead with a policy?

This is perhaps the closest I’ve seen to a “when did you stop beating your wife?” question in recent polls. Obviously it carries the assumption that the government has not already stopped to consider the implications of the policy. Matthew Parris once wrote about a rule of thumb for understanding political rhetoric: if the opposite of a political statement is something that no one could possibly argue for, the statement itself is meaningless fluff. So a politician arguing for better schools is fluff, because no one would ever get up to the podium to argue for worse schools. These sorts of questions fall into the same trap – no one would argue the opposite, that the best thing for the government to do is to pass laws willy-nilly without regard for consequences, so people will agree with the statement on almost any subject. Agreement does NOT necessarily indicate opposition to or doubt about the policy in question, just a general preference for sound decision making.

Do you agree with pleasant uncontentious thing, and that therefore we should do slightly controversial thing?

Essentially the problem here is combining two statements within a single agree/disagree statement, and therefore not giving people the chance to agree with one but not the other. For example, “Children are the future, and therefore we should spend more on education” – people might well agree that children are the future (in fact, it’s relatively hard not to), but might not agree with the course of action the question presents as a natural consequence.

How likely are you to do the right thing? or Would you signify your displeasure at a nasty thing?

Our final duff question falls foul of social desirability bias. People are more likely to say they’ll do the right thing in a poll than they are to actually do it in real life. This is entirely as you’d expect. In real life the right thing is sometimes hard to do: it might involve giving money when you really don’t have much to spare, donating your time to volunteer, or inconveniencing yourself by boycotting something convenient or cheap in favour of something ethical. Answering a poll isn’t like that – you just have to tick the box saying that you would probably do the right thing. Easy as pie. So take any poll where you see loads of people saying they’d volunteer for something worthwhile, boycott something, or give money to a worthy cause with a hefty pinch of salt.


The BBC have commissioned a very rare creature – a local government voting intention poll for a single council, in this case a ComRes poll of Brighton and Hove. The reason, naturally enough, is Brighton’s status as the only Green-led council in the country. The poll does not bode well for it remaining that way: it shows the Green party down by about 10 points since the local elections in 2011 and Labour up by about 7 points. The figures I have for the 2011 vote in Brighton & Hove are slightly different from those used by the BBC, probably due to dealing with multi-member seats differently, but either way it doesn’t show the Greens doing well. Of course, just as Westminster polls are snapshots of the current position rather than predictions of what will happen when the election rolls round, the same applies to local election polls.
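As an aside, turning multi-member ward results into single party shares is genuinely ambiguous, which is probably why my 2011 figures and the BBC’s differ. Here is a minimal sketch of one common approach – averaging each party’s candidates within a ward before summing across wards. The function, figures and data layout are purely hypothetical; I don’t know exactly which method the BBC used.

```python
from collections import defaultdict

def party_shares(wards):
    """Estimate party vote shares from multi-member ward results by
    averaging each party's candidates within a ward, then summing
    those per-ward averages across all wards.

    wards: list of dicts mapping party -> list of candidate vote counts.
    """
    totals = defaultdict(float)
    for ward in wards:
        for party, votes in ward.items():
            totals[party] += sum(votes) / len(votes)  # mean vote per candidate
    grand_total = sum(totals.values())
    return {party: round(100 * v / grand_total, 1) for party, v in totals.items()}

# Hypothetical three-member ward. The answer changes if you instead take
# each party's best-placed candidate, or simply count every vote cast.
example = [{"Green": [2100, 1950, 1800], "Labour": [1700, 1650], "Conservative": [900]}]
print(party_shares(example))  # {'Green': 43.1, 'Labour': 37.0, 'Conservative': 19.9}
```

Take each party’s best-placed candidate instead and you get noticeably different shares, so small disagreements between sources are to be expected.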

The poll did NOT ask how people in Brighton and Hove would vote at a general election, so we can’t conclude from it whether or not Caroline Lucas is in danger of losing her own seat.

Meanwhile the twice-weekly Populus poll is out today and has topline figures of CON 34%, LAB 37%, LDEM 14%, UKIP 8%. The three point lead is at the lower end of Populus’s typical range, but perfectly explicable by the normal margin of error. Full tabs are here.



The weekly YouGov poll for the Sunday Times has topline figures of CON 33%, LAB 39%, LDEM 10%, UKIP 11%. Most of the rest of the poll asked about the “plebgate” row.

As various questions about Plebgate have continued to surface, public opinion has moved in favour of Andrew Mitchell, albeit not by much. Back in December 2012 people were pretty evenly split over whether they believed Mitchell (31%) or the police (28%); now Mitchell is clearly more widely believed (37%) than the police are (27%). Back in December 43% thought Mitchell probably did call the officer a “pleb” and 34% thought he probably didn’t. The figures are now 40% think he did, 38% think he did not. On every question there are lots of don’t knows: remember most ordinary people will not be following the detailed ins and outs of the story!

30% of people think that there was probably a deliberate attempt by the police to stitch up Mitchell, 21% think he was probably wrongly accused through a genuine misunderstanding rather than a conspiracy, and 24% that he was rightly accused and the police were just telling the truth. Despite the growing doubts about what he said, still only 29% of people think he should be offered a new government job (perhaps because many people think swearing at police officers should prevent him being reinstated even if he didn’t say “pleb”!).

22% of people say that “plebgate” has made them trust the police less, though the tracking questions don’t really tell the same story. 66% of people say they trust ordinary police officers (14% a great deal, 52% a fair amount), 48% say they trust senior police officers. Both are significantly lower than when YouGov started asking the questions back in 2003 (when 82% trusted normal officers and 72% senior officers), but not significantly lower than we’ve seen for the last year or two – the real damage appears to have been done before plebgate.


ComRes’s monthly online poll for the Indy on Sunday and Sunday Mirror has topline figures of CON 32%(+4), LAB 35%(-1), LDEM 9%(-1), UKIP 16%(-1). Changes are from their poll a month ago, conducted just after the Lib Dem conference. The 32% for the Conservatives is the highest ComRes have shown in their online polls since January, and the three point Labour lead the lowest since before the omnishambles budget in 2012 (in recent months ComRes’s online polls have tended to show lower support for the Conservatives than their phone polls). Tabs are here.

In contrast the fortnightly Opinium poll for the Observer shows no narrowing, with topline figures of CON 27%(-2), LAB 38%(+2), LDEM 9%(+2), UKIP 17%(+2), putting Labour back into a double-digit lead for the first time in an Opinium poll since July.

There have been some apparently conflicting polls in recent weeks – some, like MORI and ComRes, showing things narrowing sharply; some, like Opinium and Survation, still showing double-digit leads; and some, like Populus, showing everything remaining steady. Remember that all polls have a margin of error and all are unavoidably subject to sampling error, the ebb and flow of random chance. Don’t focus too much on individual polls; look at the broad trend, which is a Labour lead in the mid-single digits. That is certainly down from the bigger Labour leads we were seeing this spring, but more recent polls are not showing any strong trend either way.
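For a rough sense of the scale of that random noise, here is a back-of-the-envelope margin of error calculation in Python. It assumes a simple random sample of about 1,000, which real quota and panel samples only approximate, so treat the result as an optimistic lower bound.

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a single poll percentage.

    p: reported share as a fraction (e.g. 0.37 for 37%)
    n: sample size
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A party on 37% in a poll of 1,000 carries a margin of roughly +/-3 points,
# so a "lead" drifting between 3 and 8 points across polls can be pure noise.
print(round(100 * margin_of_error(0.37, 1000), 1))  # ~3.0
```

And that understates it for leads: the gap between two parties is the difference of two uncertain numbers, so its effective margin is wider still.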

ComRes’s poll also had some quite fun split-sample questions, testing what effect mentioning the party leaders and attributing policies to them had on answers. On party preference on economic issues there was no real difference between asking whether people preferred the Conservatives or Labour and asking whether they preferred Cameron or Miliband (as YouGov and ComRes have found before, the more general finding is that Labour lead on “cost of living” type measures, the Conservatives on general economic competence). They also asked about a couple of policies, attributing them to David Cameron or Ed Miliband for one half of the sample and presenting the bare policies to the other half. On the two Conservative policies, mentioning David Cameron’s name made no difference; on the energy price question, support was six percentage points lower when Ed Miliband was mentioned… interesting, but it’s only one data point (and the policies were very high-profile ones that presumably lots of people in the control group already knew were associated with Ed Miliband and David Cameron respectively).
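For what it’s worth, a six-point gap between two split halves sits right around the edge of what chance alone would produce at typical sample sizes. A minimal sketch of the standard two-proportion z-test, with entirely hypothetical numbers (ComRes’s actual split sizes and percentages aren’t given here):

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """z statistic for the difference between two independent proportions."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical figures: 46% support for the bare policy vs 40% when
# Miliband's name is attached, with ~1,000 respondents per half-sample.
print(round(two_prop_z(0.46, 1000, 0.40, 1000), 2))  # ~2.71, beats the 1.96 cut-off
print(round(two_prop_z(0.46, 500, 0.40, 500), 2))    # ~1.92, borderline
```

With 1,000 per half the gap clears the conventional 95% threshold; with 500 per half it would be borderline, which is another reason to treat a single split-sample finding as suggestive rather than conclusive.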


Saturday night polls

I’m out tonight, but for those who aren’t, you can expect to see a new ComRes poll for the Sunday Indy and Sunday Mirror, the fortnightly Opinium poll for the Observer and the regular weekly YouGov poll for the Sunday Times.