This morning’s YouGov poll for the Sun had topline figures of CON 33%, LAB 37%, LDEM 10%, UKIP 13%. A Labour lead of four points and UKIP at 13%. UKIP are lower than yesterday, but worth noting that they’ve been averaging at around 13% since the second Clegg-Farage debate, compared to around 11% earlier in March.

YouGov also did a Maria Miller question yesterday, now obviously out of date, but one which raised some interesting methodological questions. Asking a fair question is often a matter of giving the minimal amount of information necessary to get a response. The more information you give, the more you risk leading respondents – essentially creating the perception of a public opinion that isn’t really out there. If pollsters ask the public a question, a fair proportion of people will answer regardless of whether they actually have any real views at all. (The classic example of this is the Public Affairs Act, which doesn’t exist, but which 18% of people are willing to express an opinion about.)

When a politician is in a scrape and pollsters ask if they should resign, you normally end up prefacing the question with “Harriet Jones has been accused of killing kittens…”, when the respondent might previously have been unaware of the cat-murdering rumours, or indeed of Harriet Jones. The very fact you are asking questions about Harriet Jones implies there is a fuss about her, and you can never tell what proportion of people would have said she should resign anyway, probably for the crime of being from the wrong party.

Yesterday YouGov asked about Maria Miller resigning, but in a different way. They didn’t mention expenses at all (if people had seen the story, they’ll have seen the story, if they haven’t, they haven’t) and they hid Maria Miller amongst lots of other politicians and asked, for each one, if they should resign or not. 63% of people said that Maria Miller should resign, only 9% said she should not, which is pretty unambiguous. However, 52% also think Nick Clegg should resign, 47% Michael Gove, 46% Ed Miliband, 37% George Osborne. The lowest was for Theresa May, but 30% of people still think she should resign. It seems whoever you ask about a fair chunk want them to resign, though note that in the case of Maria Miller people from all parties wanted to see the back of her, the other cases were mostly political opponents saying politicians from a party they dislike should go.

Two things to take away from this. One, the Maria Miller story was noticed. People did still think she should resign without any prompting in the question about what she was accused of. It still doesn’t imply it will have any effect on voting intention at all (people view these things through the prism of their pre-existing political support, and as I wrote yesterday, looking back it has been incredibly rare for events like this to have any measurable impact on voting intention) but it did get noticed, or her figures would have been the same as everyone else’s.

Secondly, do be careful about “should X resign” questions when you see them asked in isolation. Lots of people will say a politician from a government they are opposed to should resign anyway, regardless of the scandal du jour. Perhaps it’s worth paying special attention to the answers from supporters of the politician’s own party.


Populus’s twice-weekly poll today has topline figures of CON 33%, LAB 36%, LDEM 9%, UKIP 15% (tabs here). These figures come on the back of a slight tweak in Populus’s methodology. Previously they weighted party identification to figures drawn from the 2010 British Social Attitudes survey, which normally resulted in heavily downweighting UKIP and meant Populus tended to show one of the lowest levels of UKIP support and some of the highest levels of Lib Dem support.

Using the new method Populus have factored in alternative sources for their party ID targets, with the effect that they are weighting the Lib Dems and Labour to slightly lower figures, UKIP and no party to slightly higher figures. Hence while this is a low Labour lead compared to most of Populus’s polls over recent weeks, some of that is down to the method change: using Populus’s old weightings today’s figures would have been Con 32, Lab 37, Lib Dem 11, UKIP 12.

Also out today we have a new YouGov Scotland poll in the Sun. Referendum voting intentions are YES 34%, NO 52%. Yes is up one point since YouGov’s last poll, No unchanged. By itself the change is insignificant, but looking at the wider trend of polls on the Scottish referendum there is a general trend of a small shift towards YES since the publication of the white paper. Past Scottish referendum polls are collected here.


Regular readers may recall a YouGov poll of Welsh voting intentions back in July for Roger Scully’s Elections in Wales site. It produced some rather strange results – not least because it had Labour at 46% in the Welsh Assembly constituency vote (perfectly reasonable), but only 25% in the Assembly regional vote, which seemed implausible. In 2011 Labour’s vote was 5 points lower in the regional vote than in the constituency vote, but 21 points lower seems extremely unlikely. This had happened several times in YouGov’s Welsh polls over the last couple of years, apparently starting after YouGov changed the blurb at the start of their Welsh polls in 2012. The suggestion was that people who might not be too familiar with the voting system were misinterpreting the question, and instead of giving a regional vote were giving a second preference.

Anyway, as Roger explains here, YouGov did a bit of testing to find out. Using a three-way split sample they tested three different wordings. The first was the wording that YouGov used to use pre-2012:

“If there were an election to the National Assembly for Wales tomorrow, and thinking about the constituency vote, how would you vote? And thinking about the regional or party vote for the National Assembly for Wales, which party list would you vote for?”

The second was the wording YouGov have been using since 2012 – note the phrase “your second vote” in there:

“In elections to the National Assembly for Wales you have two votes. One is for an individual member of the Assembly – or AM – for your constituency. The second is for a party list for your region. If there were a National Assembly for Wales election tomorrow, which party would you vote for in your constituency? Now thinking about your second vote, for a party list in your region, which party would you vote for?”

The third group got some new wording, very similar to the current one, but taking away the words “second vote”:

“In elections to the National Assembly for Wales you have two votes. One is for an individual member of the Assembly – or AM – for your constituency. The second is for a party list for your region. If there were a National Assembly for Wales election tomorrow, which party would you vote for in your constituency? Now thinking about the regional or party vote for the National Assembly for Wales, which party list would you vote for?”

As you’d expect, the different wordings made virtually no difference to how people answered the constituency vote question, but it made a massive difference to how people answered the regional vote question:

Old wording (no explanation) – CON 18%, LAB 39%, LDEM 4%, PC 21%, UKIP 9%
Current wording (“second vote”) – CON 16%, LAB 19%, LDEM 8%, PC 24%, UKIP 20%
New wording (“regional vote”) – CON 18%, LAB 35%, LDEM 5%, PC 21%, UKIP 14%

Using the current “second vote” wording there was once again an implausible 19 point difference between Labour’s constituency and regional vote. Using the old wording, or the new wording that removes the phrase “second vote”, the gap between Labour’s constituency and regional vote becomes a far more realistic 3 to 5 points. Going forward, YouGov will use the new wording, referring to the “regional or party vote” rather than the “second vote”.

Note, for the record, that these figures aren’t comparable to normal Welsh polls for sampling reasons (basically a proper Welsh poll will have a sample targeted at Welsh demographics, this was all about the comparisons, so it just went to a big lump of Welsh respondents, split three ways).
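The three-way split described above can be sketched in a few lines. This is a toy illustration of the general design (hypothetical sample size and group labels, not YouGov’s actual implementation): respondents are randomly assigned to one of three wordings, so any difference in answers can be attributed to the wording itself.

```python
# Toy sketch of a three-way split sample: random assignment means the
# three groups should be demographically alike, so wording is the only
# systematic difference between them.
import random

random.seed(42)  # fixed seed so the example is reproducible

wordings = ["old", "current", "new"]
respondents = list(range(9000))  # stand-in IDs for a pooled Welsh sample
random.shuffle(respondents)

# Deal the shuffled respondents into three equal groups, one per wording.
groups = {w: respondents[i::3] for i, w in enumerate(wordings)}

print({w: len(g) for w, g in groups.items()})
# → {'old': 3000, 'current': 3000, 'new': 3000}
```

Because assignment is random, comparing the regional-vote answers across the three groups isolates the effect of the question wording – which is how a 19 point constituency/regional gap under one wording and a 3–5 point gap under the others can be pinned on the phrase “second vote”.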


This morning’s YouGov/Sun poll has topline voting intention figures of CON 32%, LAB 39%, LDEM 10%, UKIP 12%. It also asked how people across Britain as a whole would vote if they could vote in the Scottish referendum – 22% would vote for Scottish independence, 55% would vote against, so more opposed to independence than Scotland itself (obviously the poll included a Scottish cross-break, but I’d caution against reading too much into that – stick to proper, bespoke Scottish polls for that, I suspect there will be plenty along in the aftermath of the white paper). Full tabs for the YouGov poll are here.

There is also a new Survation poll of Thanet South (tabs here), the first of a series of constituency polls commissioned by Alan Bown, a major UKIP donor, presumably of seats they see as potentially good for UKIP. The rest are likely to come out in December, but this one is out early because of Laura Sandys’ announcement that she is to retire (though the poll itself was mostly done before that).

Topline voting intention figures in the seat are CON 28%(-20), LAB 35%(+4), LDEM 5%(-10), UKIP 30%(+24). Thanet, of course, was one of the areas where UKIP did particularly well in the local elections and is seen as a seat where Nigel Farage might stand at the next election. Note that there are some methodological changes from Survation’s past constituency polls. Previously they’ve weighted constituency polls by 2010 past vote and reallocated don’t knows based on past vote, in the same way they do for their national polls (though for practical reasons they do national polls online, but local polls by phone). For the latest polls they’ve changed method – no longer using political weighting, and not reallocating don’t knows. This is apparently part of a general review of how they do constituency polling, rather than something for this poll in particular.

Regular readers will be familiar with the debate over past vote weighting. Most companies (the main exceptions being MORI and Opinium) weight their samples both by demographics and by a political variable, normally how people voted at the last election, to ensure the sample is properly politically representative. While straightforward in theory, in practice this is complicated by the fact that poll respondents are not always very good at actually recalling how they voted at the last election (a phenomenon known as “false recall”). Companies that weight by past vote, like ICM and ComRes, therefore use a formula to estimate the level of false recall and account for it in their weighting schemes. Other companies, like MORI, take the view that false recall is so difficult to estimate and so potentially volatile that it renders past vote unsuitable as a weighting variable and risks cancelling out genuine volatility amongst the electorate, and therefore reject it completely.
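Mechanically, past vote weighting amounts to scaling each respondent so that the weighted recalled-vote shares hit a chosen target. The sketch below is a toy illustration of that principle only – all figures are invented, and it deliberately ignores the false recall adjustments that make the real schemes used by ICM or ComRes more involved.

```python
# Toy sketch of past-vote weighting (invented figures, not any
# pollster's actual scheme). Each respondent gets a weight of
# target share / sample share for their recalled past vote.

def past_vote_weights(recalled, targets):
    """Return one weight per respondent so weighted shares hit the targets."""
    n = len(recalled)
    sample_share = {party: recalled.count(party) / n for party in targets}
    return [targets[vote] / sample_share[vote] for vote in recalled]

# A toy sample of 10 respondents' recalled 2010 votes: 30% Con, 50% Lab, 20% LD.
sample = ["Con"] * 3 + ["Lab"] * 5 + ["LD"] * 2

# Target distribution for recalled vote (in reality this would be
# adjusted for estimated false recall before being used).
targets = {"Con": 0.4, "Lab": 0.4, "LD": 0.2}

weights = past_vote_weights(sample, targets)

# After weighting, the Conservative share of the sample hits its target.
total = sum(weights)
con_share = sum(w for w, vote in zip(weights, sample) if vote == "Con") / total
print(round(con_share, 2))  # → 0.4
```

The judgement call the post describes is entirely in choosing the targets: set them to the raw recalled vote and the weighting does nothing, set them to the actual 2010 result and you risk “correcting” false recall that is really genuine churn.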

In the case of the Survation poll of Thanet South, of the respondents who said they voted in 2010, about 41% said they voted Conservative, 38% Labour, 10% Lib Dem and 11% UKIP – a three point Conservative lead, when in fact Laura Sandys had a seventeen point majority. It underlines both the potential risk of not using political weighting and the difficult choices facing the companies that do use it – some of that difference will be false recall, but I suspect much of it is a sample that is simply too Labour. Separating one from the other is the challenge.
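To make the arithmetic in the paragraph above concrete (figures as quoted there; how the gap splits between false recall and sample skew cannot be determined from the poll alone):

```python
# Recalled 2010 vote among Thanet South respondents who said they voted,
# versus the actual 2010 result in the seat.
recalled = {"Con": 41, "Lab": 38, "LD": 10, "UKIP": 11}  # percentages

recalled_con_lead = recalled["Con"] - recalled["Lab"]  # Con lead in the sample
actual_con_lead = 17  # Laura Sandys' actual 2010 majority, in points

# The unexplained gap: some mixture of false recall and a sample that
# leans too Labour.
gap = actual_con_lead - recalled_con_lead

print(recalled_con_lead, gap)  # → 3 14
```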


Or at least, questions that should be treated with a large dose of scepticism. My heart always sinks when I see questions like those below in polls. I try not to put up posts here just saying “this poll is stupid, ignore it”, but sometimes it’s hard. As ever, I’d much rather give people the tools to do it themselves, so here are some types of question that do crop up in polls that you really should handle with care.

I have deliberately not linked to any examples of these in polls, though they are sadly all too easy to find. This is not an attempt to criticise any particular polling companies or polls – only he who is without sin should cast the first stone. You can find rather shonky questions from all companies, and I’m sure I’ve written surveys myself that committed all of the sins below. Note that these are not necessarily misleading or biased questions; there is nothing unethical or wrong with asking them. They are just a bit rubbish, and often lead to simplistic or misleading interpretations – particularly when they are asked in isolation, rather than as part of a longer poll that properly explores the issues. It’s the difference between a question you’d downright refuse to run if a client asked for it, and a question you’d advise a client could probably be asked much better. The point of this article, however (while I hope it will also encourage clients not to ask rubbish questions), is to warn you, the reader of research, when a polling question really should be read with caution.

Is the government doing enough of a good thing?

A tricky question. This is obviously trying to gauge a very legitimate and real opinion – the public do often feel that the government hasn’t done enough to sort out a problem or issue. The question works perfectly well when the issue is genuinely two-sided and intrinsically one of balance, where you can ask if people think the government has not done enough, has gone too far, or has got the balance about right. The problem comes when the aim is not contentious and it really is just a question of doing enough – stopping tax evasion, or cutting crime, for example. Very few people are going to think that a government has done too much to tackle tax evasion, or has pushed crime down too low (“Won’t someone think of the poor tax evaders?”, “A donation of just £2 could buy Fingers McStab a new cosh”), so the question is set up from the beginning to fail. The problem can be alleviated a bit with wording like “Should be doing more” vs “Has done all they reasonably can be expected to do”, but even then you should treat questions like this with some caution.

Would you like the government to give you a pony?*
(*Hat tip to Hopi Sen)

Doesn’t everyone like nice things? There is nothing particularly wrong with questions like this where the issue in question is controversial and something that people might disagree with. It’s perfectly reasonable to ask whether the government should be spending money on introducing high speed rail, or shooting badgers, or whatever. The difficulty comes when you are asking about something that is generally seen as a universal good – what are essentially valence issues. Should the government spend extra on cutting crime, or educating children, or helping save puppies from drowning? Questions like this are meaningless unless the downside is there as well – “would you like the government to buy you a pony if it meant higher taxes?”, “would you like the government to buy you a pony, or could the money be better spent elsewhere?”

How important is worthy thing? Or How concerned are you about nasty thing?

Asking if people support or oppose policies, parties or developments is generally pretty straightforward. Measuring the salience of issues is much harder, because you run into problems of social desirability bias and of taking issues out of context. For example, in practice most people don’t actually do much about third world poverty, but ask them if they care about children starving to death in Africa and you’d need a heart of stone to say that you really don’t. The same applies to anything that sounds very important and worthy: people don’t want to seem ignorant and uninterested by saying they don’t really care much. If you ask whether people care about, are concerned about or think an issue is important, they will invariably say they do care, they are concerned and it is important. The rather more pertinent question, not always asked, is whether it is important when considered alongside all the other important issues of the day. The best way of measuring how important people think an issue is will always be to give them a list of issues and ask them to pick out those they consider most important (or better still, just give them an empty text box and ask them what issues are important to them).

Will policy X make you more likely to vote for this party?

This is a very common question structure, and probably one I’ve written more rude things about on this site than any other type of question. There are contexts where it can work, so long as it is carefully interpreted and is asked about a factor that is widely acknowledged to be a major driver of voting intention. For example, it’s sometimes used to ask if people would be more or less likely to vote for a party if a different named politician was leader (though questions like that have their own issues).

It becomes much more problematic when it is used as a way of showing an issue is salient. Specifically, it is often used by campaigning groups to try and make out that whatever issue they are campaigning about will have an important impact on votes (and, therefore, MPs should take it seriously or their jobs may be at risk). This is almost always untrue. Voting intention is overwhelmingly driven by big brush themes, party identification, perceptions of the party leaders, perceived competence on the big issues like the economy, health or immigration. It is generally NOT driven by specific policy issues on low salience issues.
However, if you ask people directly about the impact of a specific policy on a low salience issue, and whether it would make them more or less likely to vote for a party, they will normally claim it does. This is for a number of reasons. One is that you are taking that single issue and giving it false prominence, when in reality it would be overshadowed by big issues like the economy, health or education. The second is that people tend to use the question simply as a way of signalling whether they like a policy or not, regardless of whether it would actually change their vote. The third is that it takes little account of current voting behaviour – you’ll often find that the people saying a policy makes them more likely to vote Tory are people who would vote Tory anyway, while the people saying it makes them less likely to vote Tory are people who wouldn’t vote Tory if hell froze over.

There are ways to try and get round this problem – in the past YouGov used to offer “Makes no difference – I’d vote party X anyway” and “Makes no difference – I wouldn’t vote for party X” to try and get rid of all those committed voters whose opinion wouldn’t actually change. In some of Lord Ashcroft’s polling he’s given people the options of saying they support a policy and it might change their vote, or that they’d support it but it wouldn’t change their vote. The best way I’ve come up with doing it is to give people a long list of issues that might influence their vote, getting them to tick the top three or four, and only then asking people whether the issue would make them more or less likely to vote for a party (like we did here for gay marriage, for example). This tends to show that many issues have little or no effect on voting intention, which is rather the point.

Should the government stop and think before going ahead with a policy?

This is perhaps the closest I’ve seen to a “when did you stop beating your wife” question in recent polls. Obviously it carries the assumption that the government has not already stopped to consider the implications of the policy. Matthew Parris once wrote about a rule of thumb for understanding political rhetoric: if the opposite of a political statement is something that no one could possibly argue for, the statement itself is meaningless fluff. So a politician arguing for better schools is fluff, because no one would ever get up to the podium to argue for worse schools. These sorts of questions fall into the same trap – no one would argue the opposite, that the best thing for the government to do is to pass laws willy-nilly without regard for the consequences, so people will agree with the statement in regard to almost any subject. It does NOT necessarily indicate opposition to or doubt about the policy in question, just a general preference for sound decision making.

Do you agree with pleasant uncontentious thing, and that therefore we should do slightly controversial thing?

Essentially the problem is one of combining two statements within a single agree/disagree item, and therefore not giving people the chance to agree with one but not the other. For example, “Children are the future, and therefore we should spend more on education” – people might well agree that children are the future (in fact, it’s relatively hard not to), but might not agree with the course of action the question presents as a natural consequence.

How likely are you to do the right thing? or Would you signify your displeasure at a nasty thing?

Our final duff question falls into the trap of social desirability bias. People are more likely to say they’ll do the right thing in a poll than they are to actually do it in real life. This is entirely as you’d expect. In real life the right thing is sometimes hard to do: it might involve giving money when you really don’t have much to spare, donating your time to volunteer, or inconveniencing yourself by boycotting something convenient or cheap in favour of something ethical. Answering a poll isn’t like that – you just have to tick the box saying that you would probably do the right thing. Easy as pie. Any poll where you see loads of people saying they’d volunteer to do something worthwhile, boycott something else, or give money to something worthy, take with a pinch of salt.