After the first leaders’ debate there was a single poll showing Ed Miliband with a better approval rating than David Cameron. It produced a typical example of rubbish media handling of polls – everyone got all excited about one unusual poll and talked about it on news bulletins and so on, giving it far more prominence than the far greater number of polls showing the opposite. Nevertheless, it flagged up a genuine increase in Ed Miliband’s ratings.

I talk about leader ratings, but the questions different companies ask actually vary greatly. Opinium, for example, ask if people approve or disapprove of what each leader is doing, YouGov ask if they are doing well or badly, ICM if they are doing a good or bad job. To give an obvious example of how these could produce different figures, UKIP have quadrupled their support since the election, so objectively it’s quite hard to argue that Nigel Farage hasn’t done well as UKIP leader… but someone who supported EU membership and freedom of movement probably wouldn’t approve of his leadership.

The graph below shows the net ratings for David Cameron and Ed Miliband from the four pollsters who ask leader ratings at least monthly (ComRes, ICM and Ashcroft all ask their versions of the question too, but not as frequently).

[Graph: net leader ratings for David Cameron and Ed Miliband, by pollster]

You can see there is quite a lot of variation between pollsters, but the trends are clear. Ed Miliband’s ratings have improved over the course of the campaign, though on most pollsters’ measures he remains significantly behind David Cameron. The main exception is Survation’s rating – I suspect this is because of the time frame of their question (Survation ask people to think specifically about the last month, while other companies just ask in general – my guess is that the difference reflects people thinking Ed Miliband has done well in the campaign). Cameron’s ratings have also improved according to three of the four pollsters, but not to the extent of Miliband’s.

What’s the impact of this? Theoretically I suppose it makes the potential for voters to be deterred from voting Labour by their lack of confidence in Ed Miliband that bit smaller. Whether that makes any real difference is another matter – on one hand, while it is one of the Conservative party’s hopes that Miliband’s poor ratings will drive people towards voting Tory at the last minute, that’s very different to it actually happening. This could be a case of a window closing on something that didn’t seem to be happening anyway. The alternative point of view is that no one realistically expects some vast last-minute swing producing a Tory landslide – we are talking about grinding out a few percentage points at the margins. Hence Miliband’s ratings overall don’t necessarily make much difference – much of the increase is amongst Labour’s own voters anyway – what matters is how he is seen amongst those small groups who are undecided whether or not to vote Labour, and those who are undecided whether or not to vote Tory to stop Miliband.

As we come to the final weeks of the election campaign, that’s a key to understanding a lot of polling. Most voters have already made their minds up (and even many of those who say they haven’t are probably less likely to switch than they think they are). As postal votes have started to go out, an increasing number of people will have actually voted already. Most things that happen over the next two and a half weeks will have no impact on public opinion – for those that do, it’s not national opinion that will make the difference, it’ll be the impact on that dwindling group of people who may yet be persuaded.


ComRes have published a new poll of voting intentions in LD-Con seats in the South West for ITV. Full details are here. The topline figures are CON 44%, LAB 13%, LDEM 26%, UKIP 10%. Given these are all seats that the Liberal Democrats won in 2010 this is a huge turnaround – in 2010 the Lib Dems had an overall lead of 8.5% over the Tories in these seats, now they are 18 points behind, a whopping great swing of 13 points. If there was a uniform swing of this scale across these seats the Lib Dems would lose the lot.
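For anyone who wants to check the arithmetic, here is a minimal sketch of the conventional two-party (“Butler”) swing calculation – half the change in one party’s lead over the other – applied to the figures above:

```python
# A minimal sketch of the two-party "Butler" swing arithmetic used above:
# the swing from party A to party B is half the change in A's lead over B.

def two_party_swing(lead_then: float, lead_now: float) -> float:
    """Swing away from party A, where each lead is A's percentage-point
    lead over party B (negative if A is behind)."""
    return (lead_then - lead_now) / 2

# Figures from the post: the Lib Dems led the Tories by 8.5 points across
# these seats in 2010, and now trail them by 18 points.
print(two_party_swing(8.5, -18.0))  # 13.25 - the "swing of 13 points"
```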

Depressing for the Lib Dems, but wholly at odds with previous polling evidence in these seats. Lord Ashcroft has polled Lib Dem held seats pretty comprehensively, so we actually have constituency polls in 12 of the 14 seats included in this sample, and they paint a very different picture. Compared to the 13 point LD>Con swing in the ComRes poll Lord Ashcroft found an average LD>Con swing of about 4 points.

The difference between these two sets of polling is much larger than can be explained by margin of error – they paint genuinely contradictory pictures. If ComRes are right the Lib Dems have collapsed in their heartland and face wipeout; if Ashcroft is right they are holding up against the tide and should retain around half those seats.

Explaining the difference is a little harder. It could, of course, simply be that public opinion has changed – some of Ashcroft’s polling was done late last year… but most of the Lib Dem collapse in support came early in this Parliament, so this doesn’t ring true to me. Looking at the rest of the methodology, both polls were conducted by telephone, the political weighting was much the same and the turnout weighting was not vastly different.

My guess is the difference is actually quite a subtle one – but obviously one with a large impact! Both Ashcroft and ComRes asked a voting intention question that prompted people to think about their own constituency, candidates and MP, to try and get at the personal and tactical voting that Lib Dem MPs are so reliant upon. However, looking at the tables it appears ComRes asked that as the only voting intention question, while Ashcroft asked it as a two-stage question, asking people their national preference and then their local voting intention. The results that ComRes got in their constituency question are actually extremely similar to the ones that Ashcroft got in his initial, national question.

This sounds weird, but it’s actually what I’d expect. When I first wrote the two stage voting intention question back in 2008 my thinking was that when people answer opinion polls they want to register their support for the party they really support, not a tactical vote or a vote for their local MP… and even if you ask the question slightly differently, that’s the answer you are going to get. If you really wanted to get people’s local voting intentions, you needed to first give them the opportunity to express their national support and then ask them their local support.
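To make the distinction concrete, here is a rough sketch of the two question structures – the wording is paraphrased for illustration, not the pollsters’ exact scripts:

```python
# Rough sketch of the two question structures described above. The wording
# is paraphrased; "ask" stands in for however the interview is conducted.

def single_stage(ask):
    # A single, constituency-framed voting intention question (how the
    # ComRes poll appears to have been asked).
    return ask("Thinking about your own constituency, the candidates "
               "standing there and your local MP, how will you vote?")

def two_stage(ask):
    # National preference first, giving people the chance to register the
    # party they really support, then the local question (the structure
    # used in the Ashcroft polls).
    national = ask("If there were a general election tomorrow, which "
                   "party would you vote for?")
    local = ask("Thinking specifically about your own constituency, the "
                "candidates standing there and your local MP, how will "
                "you vote there?")
    return national, local
```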

That, though, is just the theory. As I’ve written before when writing about constituency polls of Lib Dem seats and marginal polls of Lib Dem battlegrounds, we don’t really have the evidence from past elections to judge which methods are most accurate. Hopefully we’ll get enough different constituency and marginal polls over the next three weeks to give us the evidence to judge in the future.

Meanwhile tonight’s YouGov poll for the Sun has topline figures of CON 34%, LAB 35%, LDEM 8%, UKIP 13%, GRN 5%.



ComRes have an interesting post over on their site about differences between online and telephone polling so far this year (as well as making some extremely sensible points about the polls not being all over the place). As they correctly say, telephone polls this year have been showing a tiny Conservative lead, online polls a tiny Labour one. It’s only a small difference, but it’s there and it is not new – at the start of the year I produced a chart showing house differences between the different polling companies over 2014, and even then an online vs telephone tendency was observable: the two most “Toryish” pollsters were Ipsos MORI and ICM, both polling by telephone; the most “Laboury” were TNS and Opinium, both polling online.

Look a little closer though, and things are not quite that cut and dried. There are many causes of variation between polls – telephone or online fieldwork is just one of them – and there is variation between different online companies and between different phone companies. Last year ComRes’s telephone polls actually produced some of the more Laboury figures, while the online Populus polls tended to be on the Tory side of average. Below is the average for each company so far this year (given the polls have been pretty static in 2015 I haven’t worried too much about the timings of different companies’ polls – it’s just a straight average).

[Graph: average Conservative/Labour leads by pollster, phone vs online]

So all three companies showing an average Tory lead poll by phone, and all the online polls have been showing an average Labour lead. But note the variation – MORI use the telephone, but they are showing a Labour lead on average. Two online polls (YouGov and Opinium) show barely any Labour lead at all, while Survation, TNS and Panelbase average around a 2 point Labour lead. This is because there are plenty of other reasons for variation between pollsters too – different approaches to weighting, turnout, don’t knows and so on – I summarised lots of them here. Just looking at one factor can be misleading: for example, ICM and Ashcroft also reallocate don’t knows by past vote, which normally bumps up the Tory position by a point or so, so that will also be a major part of the difference between them and companies showing worse results for the Conservatives (one should also bear in mind that the monthly polling companies have only produced 3 or 4 polls this year, so a single odd poll like ICM’s this month has a large impact on the average).
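To be clear about what those averages are, here is a minimal sketch of the straight per-company average – the poll figures are invented for illustration, but they show how a single odd poll moves a company’s average when there are only three or four polls:

```python
from collections import defaultdict
from statistics import mean

# Invented (pollster, Labour lead) readings, purely to illustrate the
# method: a straight per-company average, ignoring fieldwork timing.
polls = [
    ("ICM", -4), ("ICM", 1), ("ICM", 0),          # one odd poll...
    ("YouGov", 1), ("YouGov", 0), ("YouGov", 1),  # ...vs stable readings
]

by_company = defaultdict(list)
for company, lab_lead in polls:
    by_company[company].append(lab_lead)

for company, leads in sorted(by_company.items()):
    # With only 3 or 4 polls per company, one outlier shifts the
    # average by a point or more.
    print(company, round(mean(leads), 1))
```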

I’ve no doubt that telephone vs online is one of the reasons for the differences though, especially when it comes to UKIP. The graph below shows even starker differences. With Labour vs Conservatives the difference between phone and online polls is a matter of a few points; with UKIP there is a vast gulf between the figures from different pollsters…

[Graph: average UKIP support by pollster, phone vs online]

The companies showing lower UKIP scores are all telephone. The companies showing higher UKIP scores are all online. While there is little difference between the phone company showing the highest UKIP support (Ashcroft) and the online company showing the lowest (YouGov), there is a gulf of 9 points between the highest and lowest ends of the scale. Why there should be such a difference between online and telephone polling of UKIP we cannot tell – some of it may be an interviewer effect (people being more willing to tell a computer screen they are voting for a non-mainstream party than a human interviewer), some of it may be sampling (some online samples getting too many of the sort of people who vote UKIP, or some phone samples getting too few, or both). Until the results are in we won’t really know.


Lord Ashcroft released a new batch of constituency polls this afternoon, this time returning to ten Conservative -vs- Labour seats where he found a tight battle last time round. Full details are here.

I normally look at the average swing across the groups of seats that Lord Ashcroft polls, but I’d be wary of reading too much into that this time. Because Lord Ashcroft has gone back to the tight races, these are seats that were showing a smaller than average swing before (an average of 2 points from Con to Lab). They still show a lower than average swing of about 2 points…but that’s probably because it’s a sample made up of seats that were showing a lower swing anyway, rather than a sign of a wider pattern.

Most of the seats don’t show much change in the Lab-Con race since Ashcroft previously polled them last year. The biggest differences are in Harrow East, where Labour are now ahead, and in Loughborough and Kingswood, previously tight races but now with healthier Tory leads. Most of the polls showed a drop in UKIP support, but none of these are UKIP target seats and the previous wave of polling in most of them was in Sep-Oct, when UKIP were on a Carswell-related high, so this is to be expected. A positive finding for Labour in these seats is that they are ahead in the ground war – on average 71% of people recall being contacted by Labour over the last few weeks, compared to 59% who recall being contacted by the Tories.

Elsewhere, last night’s YouGov poll for the Sun had topline figures of CON 33%, LAB 35%, LDEM 8%, UKIP 14%, GRN 5% (tabs). Nothing particularly unusual, but note that YouGov are now on their election footing, meaning they weight by likelihood to vote in a similar way to ICM and Ashcroft polls (so people who say they are 10/10 certain to vote get a weight of 1.0, people who say they are 9/10 likely to vote get a weight of 0.9 and so on). In past elections this has tended to slightly favour the Conservatives, but this time round it isn’t actually making any substantial difference at all. YouGov have also changed their sampling slightly – drawing samples from people who answered polls in January and February (a period when Labour had a very slight lead in the polls) and weighting them using their Jan/Feb vote, rather than party ID from back in 2010.

The election footing also means YouGov are now polling seven days a week, so we’ll be getting a fresh YouGov poll every night up until the election.
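As a concrete illustration of the likelihood-to-vote weighting described above, here is a minimal sketch – each respondent’s vote counts in proportion to their stated 0–10 certainty to vote. The respondent data is invented:

```python
from collections import defaultdict

# Invented respondents: (current voting intention, 0-10 likelihood to vote).
respondents = [
    ("Con", 10), ("Lab", 9), ("Lab", 10), ("UKIP", 7), ("LD", 5),
]

# Each vote is weighted by likelihood/10: 10/10 counts fully, 9/10 counts
# as 0.9 of a vote, and so on.
weighted = defaultdict(float)
for party, likelihood in respondents:
    weighted[party] += likelihood / 10

total = sum(weighted.values())
for party, w in sorted(weighted.items()):
    print(party, f"{100 * w / total:.0f}%")
```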


In this post back in January I wrote about the partisan effects of the different methodologies the polling companies use, and how some companies tend to show consistently higher or lower scores for different parties. Since then I’ve been meaning to do a reference post explaining those methodological differences. This is that post – an attempt to summarise the different companies’ methods in one place, so you can check whether company A prompts for UKIP or what company B does with their don’t knows. As ever, this is from the published methodology details of each company and my own understanding of them – any mistakes are mine and corrections are welcome!

Phone polls

There are four regular telephone polls – Ipsos MORI, ICM, Ashcroft and ComRes/Daily Mail (ComRes do both telephone and online polls). All phone polls are conducted using Random Digit Dialling (RDD) – essentially taking phone numbers from the BT directory and then randomising the digits at the end to ensure the sample includes some ex-directory numbers. All polls will now also include some mobile phone numbers, though the pollsters have all said this makes little actual difference to results and is being done as a precaution. All telephone polls are weighted by some common demographics, like age, gender, social class, region, housing tenure, holidays taken and car ownership.
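For those curious how demographic weighting of this kind tends to be implemented, here is a minimal sketch of rim weighting (iterative proportional fitting), one common approach – the sample and target shares below are invented, and the pollsters’ actual schemes may well differ:

```python
# Minimal sketch of rim weighting (iterative proportional fitting): adjust
# each respondent's weight until the weighted sample matches the target
# share for every demographic variable in turn. Illustrative only.

def rake(sample, targets, iterations=20):
    """sample: list of dicts of demographics; targets: {var: {level: share}}.
    Returns one weight per respondent."""
    weights = [1.0] * len(sample)
    for _ in range(iterations):
        for var, margin in targets.items():
            # Current weighted total for each level of this variable.
            totals = {level: 0.0 for level in margin}
            for person, w in zip(sample, weights):
                totals[person[var]] += w
            n = sum(weights)
            # Scale respondents so this variable hits its target share.
            for i, person in enumerate(sample):
                weights[i] *= margin[person[var]] * n / totals[person[var]]
    return weights

sample = [
    {"gender": "m", "age": "18-34"}, {"gender": "m", "age": "35+"},
    {"gender": "f", "age": "35+"}, {"gender": "f", "age": "35+"},
]
targets = {
    "gender": {"m": 0.49, "f": 0.51},
    "age": {"18-34": 0.28, "35+": 0.72},
}
print([round(w, 2) for w in rake(sample, targets)])
```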

Ipsos MORI

Now the most venerable of the regular pollsters, Ipsos MORI are also the most traditional in their methods. They currently do a monthly political poll for the Evening Standard. Alone among GB pollsters they use no form of political weighting, viewing the problem of false recall as insurmountable; their samples are weighted using standard demographics, but also by public and private sector employment.

MORI do not (as of March 2015) include UKIP in their main prompt for voting intention. People who say they don’t know are asked who they are most likely to vote for, and this is counted equally as a voting intention. People who still say don’t know, or won’t say, are ignored. In terms of likelihood to vote, MORI have the tightest filter of any company, including only those respondents who say they are absolutely 10/10 certain to vote.

ICM

ICM are the second oldest of the current regular pollsters, and were the pioneer of most of the methods that became commonplace after the polling industry changed its approach following the 1992 debacle. They currently do a monthly poll for the Guardian. They weight by standard demographics and by people’s past vote, adjusted for false recall.

ICM don’t currently include UKIP in their main voting intention prompt. People who say they don’t know how they will vote are reallocated based on how they say they voted at the previous election, but weighted down to 50% of the value of people who actually give a voting intention. In terms of likelihood to vote, ICM weight by likelihood so that people who say they are 10/10 certain to vote are fully counted, people who say they are 9/10 likely to vote count as 0.9 of a vote and so on. Additionally ICM weight people who did not vote at the previous election down by 50%, the only pollster to use this additional weighting.
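Here is a minimal sketch of that reallocation step – the respondent counts are invented, and the same function covers the slightly different ratios Ashcroft uses (described in the next section):

```python
# Minimal sketch of reallocating don't-knows by recalled past vote, at a
# reduced weight. ICM uses 50% across the board; Ashcroft (next section)
# uses 50% for Con and Lab but 30% for the Lib Dems. Counts are invented.

def reallocate(stated, dont_knows, ratios):
    """stated: {party: weighted count of current voting intentions};
    dont_knows: {party recalled from 2010: count of current don't-knows};
    ratios: fraction of a vote each reallocated don't-know counts as."""
    totals = dict(stated)
    for party, n in dont_knows.items():
        totals[party] = totals.get(party, 0) + n * ratios.get(party, 0)
    return totals

stated = {"Con": 300, "Lab": 310, "LD": 70}
dont_knows = {"Con": 40, "Lab": 20, "LD": 30}  # grouped by 2010 vote

print(reallocate(stated, dont_knows, {"Con": 0.5, "Lab": 0.5, "LD": 0.5}))
# ICM-style:      {'Con': 320.0, 'Lab': 320.0, 'LD': 85.0}
print(reallocate(stated, dont_knows, {"Con": 0.5, "Lab": 0.5, "LD": 0.3}))
# Ashcroft-style: {'Con': 320.0, 'Lab': 320.0, 'LD': 79.0}
```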

Ashcroft

Lord Ashcroft commissions a regular weekly poll, carried out by other polling companies but on a “white label” basis. The methods are essentially those Populus used to use for their telephone polls, rather than the online methods Populus now use for their own regular polling. Ashcroft polls are weighted by standard demographics and by past vote, adjusted for false recall.

Ashcroft’s voting intention question has included UKIP in the main prompt since 2015. People who say they don’t know how they will vote are reallocated based on how they say they voted at the previous election, but at a different ratio to ICM (Ashcroft weights Conservatives and Labour down to 50%, Lib Dems down to 30%; others I think are ignored). In terms of likelihood to vote, Ashcroft weights people according to how likely they say they are to vote in a similar way to ICM.

ComRes

ComRes do a monthly telephone poll, previously for the Independent but since 2015 for the Daily Mail. This is separate to their monthly online poll for the Independent on Sunday and there are some consistent differences between their results, meaning I treat them as two separate data series. ComRes’s polls are weighted using standard demographics and past vote, adjusted for false recall – in much the same way as ICM and Ashcroft.

ComRes have included UKIP in their main voting intention prompt since late 2014. People who say they don’t know how they will vote or won’t say are asked a squeeze question on how they would vote if it was a legal requirement, and included in the main figures. People who still say don’t know are re-allocated based on the party they say they most closely identify with, though unlike the ICM and Ashcroft reallocation this rarely seems to make an impact. In terms of likelihood to vote ComRes both filter AND weight by likelihood to vote – people who say they are less than 5/10 likely to vote are excluded completely, people who say they are 5/10 to 10/10 are weighted according to this likelihood.
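A minimal sketch of that combined approach – respondents below 5/10 likelihood are dropped entirely (the filter), and everyone remaining is weighted by their stated likelihood. The respondent data is invented:

```python
# Invented respondents: (voting intention, 0-10 likelihood to vote).
respondents = [
    ("Con", 10), ("Lab", 8), ("UKIP", 4), ("LD", 6), ("Lab", 10),
]

# Step 1 - filter: anyone below 5/10 is excluded completely.
filtered = [(party, l) for party, l in respondents if l >= 5]

# Step 2 - weight: the remainder count in proportion to likelihood/10.
weighted = {}
for party, likelihood in filtered:
    weighted[party] = weighted.get(party, 0) + likelihood / 10

total = sum(weighted.values())
for party, w in sorted(weighted.items()):
    print(party, f"{100 * w / total:.0f}%")
```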

Online Polls

Online poll sampling can be somewhat more opaque than telephone sampling. In most cases surveys are conducted through existing panels of online volunteers (either the pollsters’ own panels, like the YouGov panel or PopulusLive, or panels from third party providers like Toluna and Research Now). Surveys are conducted by inviting panellists with the required demographics to complete the poll – this means that while panels are self-selecting, surveys themselves aren’t (that is, you can choose to join a company’s online panel, but you can’t choose to fill in their March voting intention survey – you may or may not get randomly invited to it). Because panellists’ demographics are known in advance, pollsters can set quotas and invite people with the right demographics to reflect the British public. Some pollsters also use random online sampling – using pop-ups on websites to randomly invite respondents. As with telephone polling, all online pollsters use some common demographic weighting, with all companies weighting by things like age, gender, region and social class.

YouGov

YouGov are the longest standing online pollster, currently doing daily voting intention polls for the Sun and Sunday Times. The length of time they have been around means they have data on their panellists from the 2010 election (and, indeed, in some cases from the 2005 election) so their weighting scheme largely relies on the data collected from panellists in May 2010, updated periodically to take account of people who have joined the panel since then. As well as standard demographics, YouGov also weight by newspaper readership and party identification in 2010 (that is, people are weighted by which party they told YouGov they identified with most in May 2010, using targets based on May 2010).

YouGov have included UKIP in their main prompt since January 2015. They do not use any weighting or filtering by likelihood to vote at all outside of the immediate run-up to elections (in the weeks leading up to the 2010 election they weighted by likelihood to vote in a similar way to Ashcroft, Populus and ICM). People who say don’t know are excluded from final figures; there is no squeeze question or reallocation.

Populus

Populus used to conduct telephone polling for the Times, but since ceasing to work for the paper they have switched to carrying out online polling, done using their PopulusLive panel. Currently they publish two polls a week, on Mondays and Fridays. As well as normal demographic weightings they weight using party identification, weighting current party ID to estimated national targets.

Populus have included UKIP in their main prompt since February 2015. They weight respondents according to their likelihood to vote in a similar way to ICM and Ashcroft. People who say don’t know are excluded from final figures, there is no squeeze question or reallocation.

ComRes

Not to be confused with their telephone polls for the Daily Mail, ComRes also conduct a series of monthly online polls for the Independent on Sunday and Sunday Mirror. These are conducted partly from a panel and partly from random online sampling (pop-ups on websites directing people to surveys). In addition to normal demographic weightings they weight using people’s recalled vote from the 2010 election.

ComRes have included UKIP in their main prompt since December 2014. Their weighting by likelihood to vote is slightly different to their telephone polls – for the Conservatives, Labour and Liberal Democrats it’s the same (include people who say 5+/10, and weight those people according to their likelihood) but for UKIP and Green I believe respondents are only included if they are 10/10 certain to vote. Their treatment of don’t knows is the same as in their phone polls: people who say they don’t know how they will vote or won’t say are asked a squeeze question and included in the main figures, people who still say don’t know are re-allocated based on the party they say they most closely identify with.

Survation

Survation do a regular poll for the Daily Mirror and occasional polls for the Mail on Sunday. Data is weighted by the usual demographics, but using income and education rather than social class; recalled 2010 vote is used for political weighting. Survation have included UKIP in their main prompt for several years. They weight by likelihood to vote in the same way as ICM, Populus and Ashcroft. People who say don’t know are reallocated to the party they voted for in 2010, but weighted down to 30% of the value of people who actually give a voting intention.

Note that Survation’s constituency polls are done using a completely different method to their national polls, using telephone sampling rather than online sampling and different weighting variables.

Opinium

Opinium do regular polling for the Observer, currently every week for the duration of the election campaign. Respondents are taken from their own panel and the sample is weighted by standard demographics. Historically Opinium have not used political weighting, but from February 2015 they switched to weighting by “party propensity” for the duration of the election campaign. This is a variable based on which parties people would and wouldn’t consider voting for – for practical purposes it seems to be similar to party identification.

Opinium do not include UKIP in their main prompt (meaning they only appear as an option if a respondent selects “other”). They filter people by likelihood to vote, including only respondents who say they will definitely or probably vote. People who say don’t know are excluded from the final figures.

TNS

TNS are a huge global company with a long history in market research. In terms of public opinion polling in this country they are actually the successors to System Three – once a well known Scottish polling company, which ended up part of the same group through a complicated series of mergers and buy-outs involving BMRB, NFO, Kantar and WPP, currently their ultimate parent company. At the last election TNS were the final company doing face-to-face polling; since then they have switched over to online. The sample is taken from their Lightspeed panel and is weighted using standard demographics and recalled 2010 vote. TNS do include UKIP in their main prompt, and also prompt for the BNP and Greens. TNS filter and weight people according to likelihood to vote, and exclude don’t knows and won’t says from their final figures.

Putting all those together, here’s a summary of the methods.

[Table: summary of each pollster’s methods]

As to the impact of the different methods, it is not always easy to say. Some are easy to quantify from published tables (for example, ICM and Ashcroft publish their figures before and after don’t knows are reallocated, so one can comfortably say “that adjustment added 2 points to the Lib Dems this week”), others are very difficult to quantify (the difference the choice of weighting regime makes is very difficult to judge, the differences between online and telephone polling even more so). Many methods interact with one another, and the impact of different approaches changes over time (a methodology that helps the Tories one year may help the Lib Dems another year as public opinion changes). Rather than guess whether each pollster’s methods are likely to produce this effect or that effect, it is probably best to judge them from actual observed results.

UPDATE:
TNS have confirmed they do prompt for UKIP, and also prompt for the BNP and Green – I’ll update the table later on tonight.