Mark Blumenthal over at Pollster.com has drawn my attention to this article by David Runciman in the LRB which, amongst other things, complains about quite how "rubbish" polls in the USA are with their 600-700 samples, compared to "1000-2000" samples in the UK.

I haven’t been paying particularly close attention to the US polls (why would I? Obama effectively settled the Democratic nomination months ago, but the national polls will be distorted until Clinton finally accepts it and goes), but the comparison is somewhat unfair.

Firstly, he’s not comparing apples with apples. American polls normally quote as their sample size the number of likely voters: it is typical to see a poll reported as being amongst 600 “likely voters”, with the number of “unlikely voters” screened out to reach that eventual figure not made clear. In contrast, British polling companies normally quote as their sample size the number of interviews they conducted, regardless of whether those people were later filtered out of the voting intention question. So voting intentions in a UK poll with a quoted sample size of 1,000 may actually be based upon 700 or so “likely voters”.

To give a couple of examples, here’s ICM’s latest poll for the Guardian. In the bumpf at the top the sample size is given as 1,008. Scroll down to page 7 though and you’ll find the voting intention figures were based on only 755 people. Here’s Ipsos-MORI’s April poll – the quoted sample size is 1,059, but the number of people involved in calculating their topline voting intention once all the unlikelies have been filtered out was only 582.
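To put rough numbers on what that filtering means for precision, here is a minimal sketch using the standard simple-random-sample approximation for the margin of error at 95% confidence. Real polls are quota-sampled and weighted, so treat the figures as indicative only; the sample sizes are the ones quoted above.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    # Approximate 95% margin of error for a share p from a simple random sample of n.
    # Real polls use quotas and weighting, so this is a rough guide only.
    return z * math.sqrt(p * (1 - p) / n)

# Quoted sample sizes vs. the numbers actually used for voting intention (from the polls above)
for label, n in [("ICM quoted", 1008), ("ICM voting intention", 755),
                 ("MORI quoted", 1059), ("MORI voting intention", 582)]:
    print(f"{label}: n = {n}, margin of error ~ +/-{margin_of_error(n) * 100:.1f} points")
```

On those assumptions the nominal margin of error widens from roughly ±3 points on the quoted samples to around ±3.6 and ±4.1 points on the filtered samples actually used for voting intention.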

Including a lot of people isn’t necessarily a good thing anyway. Primaries in the USA are low turnout events: not a lot of people vote in them, and the challenge for US pollsters is filtering out all those people who won’t actually take part. Getting lots of people per se can be a bad thing if those people won’t actually vote; the aim is getting the right people. Considering the rather shaky record of most British pollsters in some low turnout elections like by-elections, Scottish elections, the London mayoralty and so on, we really aren’t the experts on that front.

Of course, most of Runciman’s other points about poor media reporting of polls are music to my ears – it is misleading for newspapers to headline findings based on only 250 or so people without appropriate caveats. I first started blogging about polls because media coverage of polls was so myopic, reporting only their own polls in isolation, splashing polls that look like obvious outliers as front page sensations and so on. It’s just that US pollsters aren’t particularly bad, nor UK pollsters particularly good. When it comes to low turnout elections like primaries, the polls produced by US pollsters may not be that great, but that’s because low turnout elections are hard to poll; I suspect if we had similar elections over here we’d be just as bad.


YouGov’s monthly poll for the Telegraph, the first since the Crewe by-election, is another appalling result for Labour. The topline figures, with changes from YouGov’s last poll, are CON 47%(+2), LAB 23%(-2), LDEM 18%(nc).

The Conservatives now have a 17 point lead on the economy and David Cameron has a 22 point lead as best Prime Minister. Gordon Brown’s net approval rating stands at minus 60, which is the worst ever rating I can find for a Prime Minister (the worst John Major ever hit was minus 59 in August 1994).

With the sole caveat that this was conducted soonish after a by-election victory, so Cameron will have something of an aura about him, there is little else to add – the figures speak for themselves and the picture for the government is bleak.


Waiting for YouGov

Despite the Crewe and Nantwich by-election’s impact on the media’s view of politics, we are still awaiting the first poll conducted since it took place. The result came too late for a full poll in time for last week’s Sunday papers, and it looks as though all the pollsters avoided doing anything over a bank holiday weekend immediately after a big by-election win, which could potentially have produced some real comedy figures.

We are finally getting close to some figures though, as we should have YouGov’s monthly poll for the Telegraph either late tonight or (if the Telegraph publish on Saturday) tomorrow night. The last YouGov poll had a 20 point Tory lead – the by-election may have had a further “aura” effect on the Tories, or Gordon Brown may have been further damaged by the leadership speculation around him. On the other hand, we may have reached a point where Labour are down to a hardcore of support where bad publicity can’t really hurt them much more. Feel free to hazard some guesses in the comments below.

In the meantime, here’s a round up of some other recent polls. YouGov’s monthly inflation expectation tracker for Citigroup shows that the average expectation for inflation in 12 months’ time is now 4.1%, the highest they have found since the tracker began in 2005.

A Populus poll for the Movement of Reform Judaism found 73% of Christians think God is male (some of the reporting also included figures for Muslims, Hindus and so on, but these were based on pathetically small numbers of respondents – 13 Muslims, 11 Hindus and so on, so are best ignored). 42% of people, including 53% of Christians, thought it was right to refer to God solely as “He”, with 31% (including 28% of Christians) disagreeing.

Finally, the British Polling Council has ruled on the complaint against MORI about a Transport for London poll that they refused to release the tables for prior to the mayoral election. The BPC ruled that MORI should have released the tables, and advised that BPC members in future should review their contracts to make sure they don’t conflict with the BPC disclosure rules (implying that MORI’s problem was that their contract with TfL didn’t allow them to release the figures). The tables are up on MORI’s website here. Why Transport for London were so sniffy about letting MORI release them is a mystery to me: the questions are just as laid out in the information TfL had already published, and the cross-breaks don’t reveal anything nefarious or, indeed, particularly interesting.


Everyone will know the result of the Crewe and Nantwich by-election by now, but what about the polls during the campaign – how well did they do? By-election polls have been a rare creature in recent years, but such was the attention paid to this contest that we saw three of them, two from ICM and one from ComRes.

All of them showed the Conservatives in the lead, so no pollsters looked silly this morning, but apart from having the right party in the lead they were actually a long way from the result.

ICM/Mail on Sunday (May 8th) – CON 43%, LAB 39%, LDEM 16%
ICM/News of the World (May 16th) – CON 45%, LAB 37%, LDEM 14%
ComRes/Independent (May 18th) – CON 48%, LAB 35%, LDEM 12%
Result (May 22nd) – CON 49.5%, LAB 30.6%, LDEM 14.6%

ComRes were very close to the actual level of support for the Conservatives, but everybody overestimated the level of support Labour would get, and ICM especially were well short of the eventual 19 point Tory lead.

Firstly we should add the caveat that all the polls were done several days before polling day – ICM’s last poll was 6 days beforehand, leaving 6 days for people to change their minds. Given the nature of by-elections it’s perfectly possible the electorate swung even further behind the Tories in those final days (such was its brevity, 6 days was a quarter of the whole campaign!).

I suspect the actual reason was the re-allocation of don’t knows that both ICM and ComRes did. ICM assumed that 50% of people who said they didn’t know how they would vote would end up voting for the party they voted for in 2005, while ComRes reallocated all their don’t knows to the party they voted for last time. Without those adjustments their figures would have been:

ICM/Mail on Sunday (May 8th) – CON 51%, LAB 30%, LDEM 15%
ICM/News of the World (May 16th) – CON 49%, LAB 34%, LDEM 13%
ComRes/Independent (May 18th) – CON 49%, LAB 34%, LDEM 12%
Result (May 22nd) – CON 49.5%, LAB 30.6%, LDEM 14.6%

These figures are all far closer to the actual result than the adjusted ones were – it looks as though all those Bashful Brownites that ICM and ComRes allowed for never turned up at the polling stations.
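For anyone curious about the mechanics, here is an illustrative sketch of how that sort of reallocation moves headline figures. The counts are invented purely for demonstration; only the mechanic – ICM adding back 50% of don’t knows to the party they voted for in 2005, ComRes adding back all of them – follows what’s described above, not the pollsters’ actual weighting schemes.

```python
def reallocate(stated, dont_knows_by_2005_vote, fraction):
    # stated: party -> respondents giving a voting intention
    # dont_knows_by_2005_vote: party -> don't knows who voted for that party in 2005
    # fraction: share of don't knows assumed to return to their 2005 party
    adjusted = dict(stated)
    for party, dk in dont_knows_by_2005_vote.items():
        adjusted[party] = adjusted.get(party, 0) + fraction * dk
    total = sum(adjusted.values())
    return {party: round(100 * votes / total, 1) for party, votes in adjusted.items()}

stated = {"CON": 350, "LAB": 230, "LDEM": 100}   # hypothetical stated intentions
dks    = {"CON": 10,  "LAB": 80,  "LDEM": 10}    # hypothetical don't knows, by 2005 vote

print("Unadjusted:", reallocate(stated, dks, 0.0))
print("ICM-style (50%):", reallocate(stated, dks, 0.5))
print("ComRes-style (100%):", reallocate(stated, dks, 1.0))
```

With hypothetical don’t knows who mostly voted Labour in 2005, the adjustment pulls the Labour share up and the Tory lead down – the same direction as the difference between the adjusted and unadjusted figures above.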

ICM’s re-allocation of don’t knows is based on solid research from past elections that shows don’t knows do tend to break in favour of the party they’ve supported in the past, so I don’t intend this to be a criticism of their approach to general election polling. It just appears that it doesn’t work when it comes to a by-election.


Abortion Polling

Over on Bloggerheads Tim Ireland dismisses a poll on abortion quoted during yesterday’s Parliamentary debate as “being conducted by the Christian Institute [so] it’s on him if the poll turns out to have been conducted on the back of a hymn sheet in a church car park.”

The tables for the poll are here, and while it was commissioned by the Christian Institute, it was carried out by ComRes, a proper polling company using a proper quasi-random phone sample.

Where the problems begin is with the questions themselves. They didn’t ask people straight out what they thought the time limit for abortion should be; they first primed them with an argument in favour of reducing it. Respondents were told that in other European countries the limit was 12 weeks, and then asked their opinion. 58% thought the time limit should be reduced, with 24% of women taking the hint and picking 12 weeks. In a second question respondents were told that in one neonatal unit 5 out of 7 babies born at 22 weeks survived. 60% then thought the time limit should be reduced from 24 weeks.

Now, questions like this do have legitimate uses in message testing or deliberative polling to see how well arguments work to change opinions. If they are presented in the correct way, they are perfectly good questions – for example, the Christian Institute published the first question as being “whether they thought the UK should lower its abortion time limit in light of the fact that in most other EU countries the limit is 12 weeks or lower“, which is exactly what was asked.

What the questions don’t show is that X percentage of people want to see the time limit for abortion reduced, any more than a question prefaced with a pro-choice argument would show people opposed a reduction. The best way to ask a survey question is to give the minimal amount of information, since for every bit of background information you provide you risk skewing the answer or, by making respondents better informed than other people, making your sample unrepresentative.

The cynical old souls reading this will jump to the conclusion that clients go around deliberately asking pollsters for skewed polls that give them the answers they want. In my experience it doesn’t actually work like that. Most common is that clients think other polls are skewed because the public don’t understand the issue, and that if people were aware of this vital bit of information they would be much better informed and the answers so much more reflective of what they really think. Then we have to explain that polls are supposed to measure public opinion as it is, not how we would like it to be if people were better informed. It normally isn’t an attempt to mislead; it’s often just a misunderstanding of what fair question wording is.

Of course, even a stopped clock tells the right time twice a day, so just because these questions can’t be taken to show it doesn’t mean a majority of people don’t support a shorter time limit on abortion. Polls with less skewed wording also show support for a reduction. A YouGov poll in the Sunday Times two months ago asked if people supported the status quo or the 20 week amendment. 48% supported 20 weeks compared to 35% supporting 24 weeks, and 8% wanted abortion banned altogether (other options weren’t offered, so no doubt some people would have gone for 22 weeks if they could). A MORI poll for the Observer in 2006 found 33% thought the current limit was right, 4% wanted a longer limit, 42% a tighter limit and 10% a total ban.