Just a quick line to point out that the Constituency guide part of the site has now been updated to reflect the general election results and the new MPs elected, including the target and defence lists for the parties (SNP and UKIP to follow). Before anyone points it out, there’s still lots to do – including new swingometers and updating MPs’ profiles to reflect the reshuffles.

Last week’s Opinium poll for the Observer was conducted prior to the Paxman interview, so I wasn’t sure whether tonight’s poll from them would be “post-debate” or not. In the event it is – conducted on Friday and late on Thursday night.

As with Survation’s poll yesterday, there is no sign of any significant post-debate shift in voting intention, with topline figures of CON 33%(-1), LAB 33%(nc), LDEM 7%(-1), UKIP 14%(+1), GRN 7%(nc). Full tabs are here.

Asked who they think won the debate, 20% of people who claimed to have seen at least some of it said Nicola Sturgeon won, followed by Cameron on 17% and Miliband on 15%. Leader ratings show some movement, but not quite as dramatic as in Survation’s poll (that may be a result of question wording – Opinium ask a general question about how the leaders are doing, Survation ask specifically about how they’ve been doing in the last month). David Cameron’s rating here is unchanged on plus 1, Miliband is up six points on minus 15, Nick Clegg is up ten points on minus 30 and Nigel Farage is up eleven on minus 13.


ComRes have a new poll of Rochester and Strood out tonight showing UKIP with a solid lead. As far as I can recall it’s the first ComRes by-election poll this Parliament. Like all constituency polls it was done by telephone, with a sample size of 1,500 – healthy by constituency polling standards.

Topline figures are CON 30%, LAB 21%, LDEM 3%, UKIP 43%, GREEN 3%. The only previous Rochester & Strood poll was by Survation at the start of the month, which showed a nine point lead for UKIP. Obviously one has to be careful about direct comparisons between polls from different pollsters using different methodologies, so it would be wrong to draw too many conclusions about how opinion might have moved between the two polls (the differences could simply be down to methods) – but this certainly doesn’t show any obvious sign of the Conservatives eating into UKIP’s early lead.
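
As an aside, quite apart from methodological differences, ordinary sampling noise alone makes lead-to-lead comparisons hazardous. Here is a very rough sketch of the scale involved, treating the ComRes sample as a simple random one (which a weighted constituency phone poll is not, so the numbers are purely illustrative):

    # Rough scale of sampling error on a poll lead, assuming a simple
    # random sample (illustrative only - real polls are weighted).
    import math

    n = 1500                  # ComRes sample size
    ukip, con = 0.43, 0.30    # topline shares

    # Both shares come from the same respondents, so the 95% error on
    # the gap is approximately z * sqrt((p1 + p2 - (p1 - p2)**2) / n).
    moe_lead = 1.96 * math.sqrt((ukip + con - (ukip - con) ** 2) / n)
    print(round(100 * moe_lead, 1))   # ~4.3 points either way on the 13-point lead

And comparing two leads from two different polls widens that uncertainty further, before any differences of method are even considered.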

As I write there is still one council to declare, but the maths mean that the overall result is going to be Yes 45%, No 55%. So, how did the polls do?

The final pre-referendum polls had all tightly converged around the same figures – Yes 48%, No 52% – with every company within one point of this. In fact the level of No support was three points higher than that. For a single poll a three point error would be within the margin of error, but every poll being off in the same direction suggests some systematic error.
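
To put a rough number on that margin (assuming, purely for illustration, a typical sample of around 1,000 respondents and simple random sampling – neither of which exactly holds for the real polls):

    # Why a three-point miss would be within a single poll's margin of
    # error. The sample size of 1,000 is an assumption for illustration.
    import math

    p, n = 0.52, 1000                       # final polls' NO share, assumed sample size
    moe = 1.96 * math.sqrt(p * (1 - p) / n)
    print(round(100 * moe, 1))              # ~3.1 points either way at 95% confidence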

One possibility is the shy noes/enthusiastic yesses we discussed before the referendum, but on the face of it a simpler explanation is just late swing. YouGov’s recontact survey on the day, going back to the same people they interviewed for their final poll, found enough movement between the final survey and actually voting to take the figures to YES 46%, NO 54% – one point from the actual result, and enough to explain the apparent divergence. From that it looks as though No was going to win anyway, but there was further movement from Yes to No when people actually got to the polling station.

It’s been a long journey, but we’ve finally arrived at the eve-of-referendum polls. For a lot of the Scottish referendum campaign the discussion about polls was one of right or wrong – we had lots of polls showing the same trend (flatlining!), but showing different absolute figures. Companies like MORI, TNS and YouGov were showing big NO leads; companies like Panelbase and Survation were showing a tight race. Then we had a period when some companies showed a strong movement towards YES and some did not, and we have ended up with everyone showing much the same figures (what the true picture was earlier in the campaign we will never know for sure – by definition you can check eve-of-election polls against reality, but never mid-campaign ones). With one MORI poll still to come, here are the YES shares in the latest polls from each company (taking the online and telephone methodologies separately for those companies which have done both):

Ipsos MORI (phone) 49%
ICM (phone) 49%
TNS (face to face) 49%
YouGov (online) 48%
Panelbase (online) 48%
ICM (online) 48%
Opinium (online) 48%
Survation (online) 48%
Survation (phone) 47%

Essentially everyone is predicting the same result: the margin of error on most of the polls is around plus or minus 3 points, and every poll is within two percentage points of the others. This isn’t going to be a case of individual pollsters getting it right or wrong – they’ll either all be around about right or all be horribly out.
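
For what it’s worth, a straight average of the published YES shares above (a back-of-the-envelope calculation, ignoring sample sizes and don’t knows) lands in exactly the same place:

    # Straight average of the final published YES shares listed above.
    yes_shares = [49, 49, 49, 48, 48, 48, 48, 48, 47]
    print(round(sum(yes_shares) / len(yes_shares), 1))   # 48.2 - roughly YES 48, NO 52
    print(max(yes_shares) - min(yes_shares))             # a spread of just 2 points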

There’s a temptation when the polls are like this to say YES and NO are within the margin of error and that it’s “too close to call”. It doesn’t really work like that – these polls are showing NO ahead. The margin of error applies to each individual poll, and it is equally likely to fall in either direction. Hence if the “true” balance of public opinion in Scotland were 50/50 we’d expect to see a random scattering of results around that point, some polls showing YES ahead, some showing NO ahead. We’re not seeing that. We’re seeing polls randomly scattered around the 48/52 mark, suggesting that’s most likely where public opinion is – a very small lead for the NO campaign.
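
Here is a quick numerical sketch of that point, assuming each of the nine final polls behaves like an independent simple random sample of 1,000 voters (both the sample size and the independence are simplifying assumptions – real polls are weighted and tend to share methods):

    # If the true split really were 50/50, how likely is it that every
    # one of the nine final polls would put YES at 49% or below?
    from math import sqrt
    from statistics import NormalDist

    n, true_yes = 1000, 0.50
    sd = 100 * sqrt(true_yes * (1 - true_yes) / n)   # ~1.6 points on the YES share

    p_single = NormalDist(mu=50, sigma=sd).cdf(49)   # one poll showing YES at 49 or less
    print(round(p_single, 2))                        # ~0.26
    print(round(p_single ** 9, 6))                   # ~0.000006 - all nine doing so is very unlikely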

It’s possible there will be a very late swing – that people will have changed their minds in the last few hours, or in the polling station itself. In most polls, though, there really aren’t that many don’t knows left to make up their minds.

The alternative route to an upset is if the polls are wrong – if there is some systematic issue, above and beyond normal random sampling error, that affects polls from all the companies. I wrote yesterday about what the potential risks are. The main challenges in my view are, first, whether people who are on the fringes of society and normally play little part in politics don’t get picked up in polls but do vote; and secondly whether there has been differential response – have the obviously more enthusiastic YES voters been more willing to take part in polling than NO voters?

Personally I’m a little more worried about the latter – I think there’s more chance of the polls ending up underestimating the NO vote than the YES vote, but there comes a time when you just have to trust the data. The polls say the result will be around about YES 48%, NO 52%. We will see on Friday morning.