14 weeks to go

Week four of the year we had the regular YouGov, Ashcroft and Populus polls, the first ComRes telephone poll of the year and the first 2015 GB poll from Survation – the first in a regular series for the Daily Mirror.

YouGov/S Times (23/1/15) – CON 32%, LAB 32%, LDEM 7%, UKIP 15%, GRN 7%
Survation/Mirror (25/1/15) – CON 31%, LAB 30%, LDEM 7%, UKIP 23%, GRN 3%
Populus (25/1/15) – CON 34%, LAB 35%, LDEM 9%, UKIP 13%, GRN 6%
Ashcroft (25/1/15) – CON 32%, LAB 32%, LDEM 6%, UKIP 15%, GRN 9%
ComRes/Indy (25/1/15) – CON 31%, LAB 30%, LDEM 8%, UKIP 17%, GRN 7%
YouGov/Sun (26/1/15) – CON 34%, LAB 33%, LDEM 6%, UKIP 15%, GRN 7%
YouGov/Sun (27/1/15) – CON 34%, LAB 33%, LDEM 7%, UKIP 14%, GRN 7%
YouGov/Sun (28/1/15) – CON 33%, LAB 33%, LDEM 6%, UKIP 16%, GRN 7%
YouGov/Sun (29/1/15) – CON 34%, LAB 34%, LDEM 6%, UKIP 14%, GRN 7%
Populus (29/1/15) – CON 34%, LAB 35%, LDEM 10%, UKIP 14%, GRN 4%

The polls this week continued to show an extremely tight race – every single poll had the two main parties within one point of each other, and unlike last week there were slightly more polls with the Tories ahead than with Labour ahead. The UKPR average, though, still has figures of CON 32%(nc), LAB 33%(nc), LDEM 8%(nc), UKIP 15%(nc), GRN 6%(-1), as Opinium and ICM polls from last week are still contributing towards the average. For anyone interested in the differences between some of the polls from different companies, I explored them in this post earlier this week.

Welsh polls

There were also two Welsh voting intention polls out this week, the regular YouGov/ITV/University of Cardiff poll and an ICM poll for the BBC. Westminster voting intention figures for the two polls were:

ICM/BBC – CON 21%, LAB 38%, LDEM 7%, Plaid 12%, UKIP 13%, GRN 6%
YouGov/ITV – CON 23%, LAB 37%, LDEM 6%, Plaid 10%, UKIP 16%, GRN 8%

Week four

  • At the beginning of the week there was a lot of froth about UKIP’s NHS policy and the Green party’s policies on membership of extremist groups and a citizen’s income. It’s unlikely that either will make much difference, for the simple reason that most people have no idea at all what their policies are on such issues. For UKIP, the majority of people think they have at least a fairly good idea of what sort of approach the party would take on immigration and Europe, but on other subjects people draw a blank. For the Green party, 54% think they’ve got some idea what the party would do on the environment, but on everything else at least three quarters know nothing. It doesn’t necessarily stop people backing them, as broad perceptions of a party’s values, principles and competence are far more important than specific policies anyway. I suspect that may be even more the case for parties who have no realistic chance of winning a majority and putting said policies into action.
  • As we passed the 100-days-to-go mark both Labour and the Conservatives put out new policies, Labour on the NHS, the Conservatives on welfare benefits. The Conservatives’ headline pledge to reduce the benefit cap to £23,000 was supported by 61% to 25% (including amongst Labour voters), even though people didn’t think it would make people look for work. The idea of stopping housing benefit for young people was more divisive – 42% supported the idea, 40% opposed it.
  • The NHS is generally a rock-solid issue for Labour anyway – last week they had a thirteen point lead over the Tories on which party people thought would handle the issue best. Welfare benefits is actually much more contested ground: in the same poll 28% of people thought Labour would handle the issue best and 28% thought the Conservatives would.

Projections

The latest forecasts from Election Forecast, May 2015 and Elections Etc are below. All are still predicting a hung Parliament, though Election Forecast and May2015 have the Conservatives catching up with Labour after a week of close polls.

Elections Etc – Hung Parliament, CON 282(-1), LAB 280(+2), LD 24(+1), SNP 40(-1), UKIP 3(nc)
Election Forecast – Hung Parliament, CON 283(+5), LAB 285(-1), LD 27(-1), SNP 32(-2), UKIP 2(-1)
May 2015 – Hung Parliament, CON 280(+11), LAB 280(-9), LD 24(-3), SNP 38(nc), UKIP 5(+1)


Polls often give contrasting results. Sometimes this is because they were done at different times and public opinion has actually changed, but most of the time that’s not the reason. A large part of the difference between polls showing different results is often simple random variation, good old margin of error. We’ve spoken about that a lot, but today’s post is about the other reason, systematic differences between pollsters (or “house effects”).

Pollsters use different methods, and sometimes those different choices result in consistent differences between the results they produce. One company’s polls, because of the methodological choices they make, may consistently show a higher Labour score, or a lower UKIP score, or whatever. This is not a case of deliberate bias – unlike in the USA there are not Conservative pollsters or Labour pollsters, every company is non-partisan, but the effect of their methodological decisions mean some companies do have a tendency to produce figures that are better or worse for each political party – we call these “house effects”.

[Graph: house effects by pollster, based on polls published in 2014]

The graph above shows these house effects for each company, based upon all the polls published in 2014 (I’ve treated ComRes telephone and ComRes online polls as if they were separate companies, as they use different methods and have some consistent differences). To avoid any risk of bias from pollsters conducting more or fewer polls when a party is doing well or badly, I work out the house effects using a rolling average of the daily YouGov poll as a reference point – I see how much each poll departs from the YouGov average on the day its fieldwork finished and take an average of those deviations over the year. Then I take the average of all those deviations and graph each pollster relative to that (just so YouGov aren’t automatically in the middle). It’s important to note that the pollsters in the middle of the graph are not necessarily more correct – these differences are relative to one another. We can’t tell what the deviations are from the “true” figure, as we don’t know what the “true” figure is.
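The procedure described above can be sketched in a few lines of code. This is an illustrative reconstruction, not the actual UKPR calculation: the pollster names, poll shares and YouGov reference values below are all invented for demonstration.

```python
# Sketch of a house-effect calculation using a reference pollster's
# rolling average. All figures here are hypothetical.
from statistics import mean

# (pollster, party share in that poll, YouGov rolling average on the
# day the poll's fieldwork finished)
polls = [
    ("PollsterA", 34, 32),
    ("PollsterA", 33, 32),
    ("PollsterB", 30, 33),
    ("PollsterB", 31, 32),
]

# Step 1: each poll's deviation from the reference on its fieldwork date
deviations = {}
for pollster, share, yougov_avg in polls:
    deviations.setdefault(pollster, []).append(share - yougov_avg)

# Step 2: average deviation per pollster over the year
house_raw = {p: mean(d) for p, d in deviations.items()}

# Step 3: re-centre on the overall mean, so the reference pollster
# isn't automatically zero
overall = mean(house_raw.values())
house_effects = {p: raw - overall for p, raw in house_raw.items()}
```

With these toy numbers PollsterA comes out relatively favourable to the party and PollsterB relatively unfavourable; the real calculation simply repeats this across every party and every 2014 poll.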

As you can see, the differences between the Labour-versus-Conservative leads each company shows are relatively modest. Leaving aside TNS, who tended to show substantially higher Labour leads than other companies, everyone else is within 2 points of each other. Opinium and ComRes phone polls tend to show Labour leads that are a point higher than average; MORI and ICM tend to show Labour leads that are a point lower than average. Ashcroft, YouGov, ComRes online and Populus tend to be about average. Note I’m comparing the Conservative-versus-Labour gap between different pollsters, not the figures for each party. Populus, for example, consistently give Labour a higher score than Lord Ashcroft’s polls do… but they do exactly the same for the Conservatives, so when it comes to the party lead the two sets of polls tend to show much the same.

There is a much, much bigger difference when it comes to measuring the level of UKIP support. The most “UKIP friendly” pollster, Survation, tends to produce a UKIP figure that is almost 8 points higher than the most “UKIP unfriendly” pollster, ICM.

What causes the differences?

There are a lot of methodological differences between pollsters that make a difference to their end results. Some are very easy to measure and quantify, others are very difficult. Some contradict each other, so a pollster may do one thing that is more Tory than other pollsters and another that is less Tory, and end up in exactly the same place. They may also interact with each other, so weighting by turnout might have a different effect on a phone poll from an online poll. Understanding the methodological differences is often impossibly complicated, but here are some of the key factors:

Phone or online? Whether polls get their sample from randomly dialling telephone numbers (which gives you a sample made up of the sort of people who answer cold calls and agree to take part) or from an internet panel (which gives you a sample made up of the sort of people who join internet panels) has an effect on sample make-up, and sometimes that has an effect on the end result. It isn’t always the case – for example, raw phone samples tend to be more Labour-inclined… but this can be corrected by weighting, so phone polls don’t necessarily produce results that are better for Labour. Where there is a very clear pattern is on UKIP support – for one reason or another, online polls show more support for UKIP than phone polls. Is this because people are happier to admit supporting UKIP when there isn’t a human interviewer? Or is it because online samples include more UKIP-inclined people? We don’t know.

Weighting. Pollsters weight their samples to make sure they are representative of the British population and iron out any skews and biases resulting from their sampling. All companies weight by simple demographics like age and gender, but more controversial is political weighting – using past vote or party identification to make sure the sample is politically representative of Britain. The rights and wrongs of this deserve an article in their own right, but in terms of comparing pollsters most companies weight by past vote from May 2010, YouGov weight by party ID from May 2010, Populus by current party ID, MORI and Opinium don’t use political weighting at all. This means MORI’s samples are sometimes a bit more Laboury than other phone companies (but see their likelihood to vote filter below), Opinium have speculated that their comparatively high level of UKIP support may be because they don’t weight politically and Populus tend to heavily weight down UKIP and the Greens.
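As a rough illustration of the past-vote weighting described above, here is a minimal sketch. The sample counts are invented and the target shares only loosely approximate the 2010 GB result; real pollsters weight across several variables at once (age, gender, region, past vote and so on) rather than one in isolation.

```python
# Toy example of political weighting: scale each respondent so the
# sample's recalled past vote matches a chosen target distribution.

# Recalled 2010 vote among respondents (hypothetical raw sample)
sample = {"Con": 450, "Lab": 350, "LD": 150, "Other": 50}

# Target population shares to weight to (rough 2010-style shares)
target = {"Con": 0.37, "Lab": 0.30, "LD": 0.24, "Other": 0.09}

n = sum(sample.values())

# Each respondent's weight = target share / observed sample share,
# so the weighted sample matches the target distribution exactly.
weights = {party: target[party] / (count / n) for party, count in sample.items()}

weighted_total = {party: sample[party] * weights[party] for party in sample}
```

Here the over-represented 2010 Conservatives get a weight below one and the under-represented Lib Dems a weight above one, which is exactly the kind of adjustment that can drag a raw sample's current voting intention figures up or down.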

Prompting. Doesn’t actually seem to make a whole lot of difference, but was endlessly accused of doing so! This is the list of options pollsters give when asking who people vote for – obviously, it doesn’t include every single party – there are hundreds – but companies draw the line in different places. The specific controversy in recent years has been UKIP and whether or not they should be prompted for in the main question. For most of this Parliament only Survation prompted for UKIP, and it was seen as a potential reason for the higher level of UKIP support that Survation found. More recently YouGov, Ashcroft and ComRes have also started including UKIP in their main prompt, but with no significant effect upon the level of UKIP support they report. Given that in the past testing found prompting was making a difference, it suggests that UKIP are now well enough established in the public mind that whether the pollster prompts for them or not no longer makes much difference.

Likelihood to vote. Most companies factor in respondents’ likelihood to vote somehow, but using sharply varying methods. Most of the time Conservative voters say they are more likely to vote than Labour voters, so if a pollster puts a lot of emphasis on how likely people are to actually vote it normally helps the Tories. Currently YouGov put the least emphasis on likelihood to vote (they just include everyone who gives an intention); companies like Survation, ICM and Populus weight according to likelihood to vote, which is a sort of mid-way point; Ipsos MORI have a very harsh filter, taking only those people who are 10/10 certain to vote (this probably helps the Tories, but MORI’s weighting is probably quite friendly to Labour, so it evens out).
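The difference between weighting by stated likelihood and a MORI-style hard filter can be sketched with invented respondents. Everything here (the respondents, the simple likelihood-divided-by-ten weight) is a toy illustration of the two approaches, not any pollster’s actual scheme.

```python
# Contrast two turnout treatments: soft weighting by stated likelihood
# versus a hard filter admitting only 10/10-certain voters.

respondents = [
    # (vote intention, self-rated likelihood to vote out of 10)
    ("Con", 10), ("Con", 10), ("Con", 8),
    ("Lab", 10), ("Lab", 7), ("Lab", 5),
]

def share(counts):
    """Convert weighted counts to percentage shares (1 decimal place)."""
    total = sum(counts.values())
    return {p: round(100 * c / total, 1) for p, c in counts.items()}

# Approach 1: weight each respondent by likelihood / 10
weighted = {}
for party, ltv in respondents:
    weighted[party] = weighted.get(party, 0) + ltv / 10

# Approach 2: hard filter -- only 10/10-certain respondents count
filtered = {}
for party, ltv in respondents:
    if ltv == 10:
        filtered[party] = filtered.get(party, 0) + 1
```

With these invented figures the harsh filter (two of the three certain voters are Conservative) favours the Tories considerably more than the softer weighting does, matching the pattern described in the text.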

Don’t knows. Another cause of the differences between companies is how they treat people who say they don’t know. YouGov and Populus just ignore those people completely. MORI and ComRes ask those people “squeeze questions”, probing to see if they’ll say who they are most likely to vote for. ICM, Lord Ashcroft and Survation go further and make some estimates about those people based on their other answers, generally assuming that a proportion of people who say they don’t know will actually end up voting for the party they did last time. How this approach impacts on voting intention numbers depends on the political circumstances at the time; it tends to help any party that has lost lots of support. When ICM first pioneered it in the 1990s it helped the Tories (and was known as the “shy Tory adjustment”); these days it helps the Lib Dems, and it goes a long way to explaining why ICM tend to show the highest level of support for the Lib Dems.
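An ICM-style reallocation can be sketched as follows. The 50% reallocation rate and all of the counts are invented for illustration; the actual proportions each pollster reallocates differ and are not stated here.

```python
# Toy don't-know reallocation: assume a fixed fraction of "don't know"
# respondents return to the party they voted for last time.

REALLOCATE = 0.5  # assumed share of don't-knows returning to their old party

# Current stated intention (don't-knows excluded from these counts)
intention = {"Con": 320, "Lab": 330, "LD": 60}

# Don't-knows broken down by their recalled 2010 vote (hypothetical --
# lots of ex-Lib Dems currently saying "don't know")
dont_knows_by_2010 = {"Con": 40, "Lab": 30, "LD": 80}

adjusted = dict(intention)
for party, dk in dont_knows_by_2010.items():
    adjusted[party] += REALLOCATE * dk
```

Because the don’t-know pool in this toy sample is dominated by former Lib Dems, the adjustment lifts the Lib Dem score the most – which is the mechanism behind the observation that this approach currently helps any party that has lost lots of support.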

And these are just the obvious things – there will be lots of other subtle or unusual differences (ICM weight down people who didn’t vote last time, Survation ask people to imagine all parties are standing in their seat, ComRes have a harsher turnout filter for smaller parties in their online polls, etc.).

Are they constant?

No. The house effects of different pollsters change over time. Part of this is because political circumstances change and the different methods have different impacts. I mentioned above that MORI have the harshest turnout filter and that most of the time this helps the Tories, but that isn’t set in stone – if Tory voters became disillusioned and less likely to vote and Labour voters became more fired up it could reverse.

It also isn’t consistent because pollsters change methodology. In 2014 TNS tended to show bigger Labour leads than other companies, but in their last poll they changed their weighting in a way that may well have stopped that. In February last year Populus changed their weights in a way that reduced Lib Dem support and increased UKIP support (and changed even more radically in 2013 when they moved from using the telephone to online). So don’t assume that because a pollster’s methods last year had a particular skew it will always be that way.

So who is right?

At the end of the day, what most people asking the question “why are those polls so different” really want to know is which one is right. Which one should they believe? There is rarely an easy answer – if there was, the pollsters who were getting it wrong would correct their methods and the differences would vanish. All pollsters are trying to get things right.

Personally speaking, I obviously think YouGov’s polls are right, but all the other pollsters out there will think the same thing about the polling decisions they’ve made. I’ve always tried to make UKPollingReport about explaining the differences so people can judge for themselves, rather than championing my own polls.

Occasionally you get an election when there is a really big spread across the pollsters, when some companies clearly get it right and others get it wrong, and those who are wrong change their methods or fade away. 1997 was one of those elections – ICM clearly got it right when others didn’t, and other companies mostly adopted methods like those of ICM or dropped out of political polling. These instances are rare though. Most of the time all the pollsters show about the same thing and are all within the margin of error of each other, so we never really find out who is “right” or “wrong”. (As it happens, the contrast between the levels of UKIP support shown by different pollsters is so great that this may be an election where some polls end up being obviously wrong… or come the election the polls may end up converging and all showing much the same. We shall see.)

In the meantime, with an impartial hat on all I can recommend is to look at a broad average of the polls. Sure, some polls may be wrong (and it’s not necessarily the outlying pollster showing something different to the rest – sometimes they’ve turned out to be the only one getting it right!) but it will at least help you steer clear of the common fallacy of assuming that the pollster showing results you like the most is the one that is most trustworthy.



As well as the regular GB poll, YouGov have released new Welsh and London polls this week.

The London polling for the Evening Standard is here and has topline figures of CON 32%, LAB 42%, LDEM 7%, UKIP 10%, GRN 8%. Despite its Conservative mayor, London tends to be more Labour than the country as a whole – at the last election Labour were two points ahead of the Tories in London, compared to the seven point Tory lead across Great Britain. This means a ten point Labour lead in London is a four point swing from Con to Lab, the equivalent of a one point Labour lead in a GB poll. In other words, the swing to Labour in London is pretty much the same as in Britain as a whole.
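The swing arithmetic in that paragraph can be written out explicitly, using the conventional (Butler) definition of swing as half the change in the lead between two parties.

```python
# Con-to-Lab swing: half the change in the Conservative lead over Labour.

def swing(old_con_lead, new_con_lead):
    """Positive result = swing from Con to Lab."""
    return (old_con_lead - new_con_lead) / 2

# London at the 2010 election: Lab +2, i.e. a Con lead of -2.
# The London poll shows Lab +10, i.e. a Con lead of -10.
london_swing = swing(-2, -10)  # a 4-point swing to Labour

# Applying that same swing to the GB baseline of Con +7 gives a
# Con lead of 7 - 2*4 = -1, i.e. a one-point Labour lead in GB terms.
gb_equivalent_lead = 7 - 2 * london_swing
```

This is just the post’s own arithmetic made explicit: an 8-point change in the lead is a 4-point swing, which from a Con +7 starting point lands at Lab +1.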

The Welsh polling for ITV Wales and Cardiff University is here and has topline figures of CON 23%, LAB 37%, LDEM 6%, PC 10%, UKIP 16%, GRN 8%. Compared to the general election result this is Labour up 1, the Conservatives down 3 – a swing of 2 points (so actually a smaller swing to Labour than in Britain as a whole). Roger Scully of Cardiff University’s analysis of the poll is here.


We have a bumper crop of opinion polls today – as well as the regular twice-weekly Populus poll and the weekly Ashcroft poll, there is the first of a series of monthly Survation polls for the Mirror. Still to come tonight are the daily YouGov poll and a ComRes telephone poll for the Indy, both due at 10pm-ish.

The three that have been published so far are:

Populus – CON 34%, LAB 35%, LDEM 9%, UKIP 13%, GRN 6% (tabs)
Ashcroft – CON 32%, LAB 32%, LDEM 6%, UKIP 15%, GRN 9% (tabs)
Survation/Mirror – CON 31%, LAB 30%, LDEM 7%, UKIP 23%, GRN 3% (tabs)

All three polls have Labour and the Conservatives within one point of each other – Populus with Labour one ahead, Survation with the Tories one ahead, Ashcroft with them equal. There is more difference between the reported levels of support for the Greens and UKIP – Survation traditionally give UKIP their highest levels of support and have them up on 23% (this is clearly not just because of prompting, given ComRes, YouGov and Ashcroft also now include UKIP in their main prompt), in contrast Populus have UKIP on 13%. Green support is up at 9% in Ashcroft’s poll, but only at 3% in Survation’s. Unlike ComRes’s online polls (harsh turnout filtering) and Populus’s polls (disadvantageous weighting) there is nothing particularly unusual about Survation’s methods that would explain the low Green vote.

I will update later with the ComRes and YouGov polls.

UPDATE: The monthly ComRes telephone poll for the Independent is out and has topline figures of CON 31%(+2), LAB 30%(-2), LDEM 8%(-4), UKIP 17%(+1), GRN 7%(+2) (tabs). It’s the first time that ComRes have shown a Tory lead in their telephone polls since 2011, and a fourth poll today to show the two main parties within a single point of each other. YouGov is still to come…

UPDATE2: The last of today’s five GB polls, YouGov’s daily poll for the Sun has topline figures of CON 34%, LAB 33%, LD 6%, UKIP 15%, GRN 7%. That’s five polls today, all showing Labour and the Conservatives within 1 point of each other. As we hit the hundred days to go mark we have the closest possible race in terms of vote share, if not necessarily in seats.


The weekly YouGov/Sunday Times survey is up here and has topline figures of CON 32%, LAB 32%, LDEM 7%, UKIP 15%, GRN 7%.

Most of the rest of the survey dealt with attitudes towards the Chilcot Inquiry and Iraq. Asked in hindsight whether Britain and the US were right to take military action against Iraq, support has now dwindled to 25% (down from 27% two years ago, 30% in 2007 and a peak of 66% back in April 2003, the day after the fall of Baghdad). 63% of people now think that the invasion of Iraq increased the risk of terrorist attacks against Britain and 54% think it has made the world a less safe place.

Asked about Tony Blair’s role, 48% of people think Tony Blair deliberately misled the public (down 4 points from 2010), 32% think he genuinely thought Saddam Hussein had weapons of mass destruction (unchanged from 2010) – as the years pass, the proportion of people saying don’t know is gradually sneaking up. In a slightly more nuanced question, 29% of people say Blair was essentially correct to warn of the dangers of the Saddam regime, 16% that he misled Parliament but did not intend to do so, 13% that he deliberately misled Parliament, but we should now move on, 24% that he deliberately misled Parliament and should be prosecuted.

Turning to the question of the Chilcot inquiry, 50% of people think the inquiry is worthwhile, 35% of people think it is not. Despite this broad support, only 19% think it will make a genuine effort to get to the bottom of Britain’s involvement in Iraq, 53% think it will be a whitewash. Two-thirds of people think the length of time it has taken to publish the report is unreasonable.