The Times have released a new YouGov poll of party members – the report is here and the tables here.

Theresa May’s time is essentially up. Party members are normally the most loyal of the loyal, but even here there are few good words to be said. Only 20% of her own members think she is doing well and 79% think she should resign. Asked about her record, 25% of Tory party members think she has been a poor Prime Minister, 38% a terrible one.

Let us therefore move swiftly onto her replacement. The obvious frontrunner with party members remains Boris Johnson. He is seen as a good leader by 64% to 31%, and is the first choice of 39% of party members, easily ahead of his rivals. He has the highest positive ratings on every measure YouGov asked about – 77% of party members think he has a likeable personality, 70% that he would be able to win a general election, 69% that he shares their outlook, 67% that he is up to the job, 69% that he would be a strong leader, 61% that he would be competent.

Johnson is very clearly in pole position – yet in past Conservative leadership elections the clear early frontrunner has not necessarily gone on to win (and indeed, there is no guarantee that Johnson will even reach the final round or get to be voted on by party members). One can recall the time when Michael Portillo was the obvious frontrunner to succeed William Hague, or David Davis the obvious frontrunner to succeed Michael Howard.

Looking at the rest of the field, Dominic Raab is in second place on first preferences with 13%. As the other candidate to have resigned from the cabinet – and likely to be seen as a “true Brexiteer” by members – he comes closest to Johnson in the head-to-head match-ups and beats every other candidate on head-to-head figures. Considering he has a substantially lower profile than Johnson, that is a positive finding.

Of the Brexiteers in the cabinet, Michael Gove is the second best known candidate after Johnson, but polls badly on many counts. While most see him as competent and up to the job, he is not seen as capable of winning an election or having a likeable personality. Andrea Leadsom is seen as likeable, but not as an election winner. Penny Mordaunt receives high don’t know figures on most scores.

Looking at the candidates who backed Remain in the referendum, Sajid Javid seems the best placed candidate from that wing of the party. In first preferences he is in joint third place with Michael Gove, and on the head-to-head scores he would beat Hunt, Hancock, Mordaunt or Stewart (and tie with Leadsom). He scores well on being likeable, competent and up to the job, but his figures are more mixed on being seen as an election winner.

These are, of course, only the opinions of party members. While they will have the final say, they do not get a say on who makes the shortlist. That is down to MPs, and as things stand there is very scant information on who is doing well or badly among that electorate.


Today’s Sunday papers have the first polls conducted since the local elections, from Opinium and ComRes.

Opinium for the Observer have Westminster voting intentions of CON 22%(-4), LAB 28%(-5), LDEM 11%(+5), BREX 21%(+4), GRN 6%(+2), ChUK 4%(nc), UKIP 4%(nc). Fieldwork was between Wednesday and Friday, and changes are from late April. Full tables are here.

ComRes for BrexitExpress have voting intentions of CON 19%(-4), LAB 27%(-6), LDEM 14%(+7), BREX 20%(+6), GRN 5%(+2), ChUK 7%(-2), UKIP 3%(-2). Fieldwork appears to be all on Thursday, and changes are since mid-April.

Both polls have Labour and the Conservatives rapidly shedding support, with support growing for the Liberal Democrats and the Brexit party. I suspect we are seeing a combination of factors at work here. Most obviously, there is the continuing collapse in Conservative support over Brexit, a trend we’ve been seeing since the end of March, with support moving to parties with a clearer pro-Brexit policy. Originally that favoured UKIP too; now it is almost wholly going to the Brexit party.

Secondly there is the impact of the local elections and the Liberal Democrat successes there. For several years the Lib Dems seemed moribund and struggled to be noticed. The coverage of their gains at the local elections seems to have given them a solid boost in support, more so than the other anti-Brexit parties – for now at least, they seem to be very much alive & well again.

Third is the impact of the European elections. People are obviously more likely to vote for smaller parties in the European elections and in current circumstances obviously appear more willing to lend their vote to a different party in protest over Brexit. To some degree this will be influencing other voting intention figures as well, so I would treat Westminster voting intention figures with some scepticism in the run up to the European elections (and probably in the immediate aftermath as well, when those parties who do well will likely receive a further boost in support).

In short, these are startling results – but we have seen startling results before (look at the polls at the height of SDP support, or just after the expenses scandal broke, or during Cleggmania). These are indeed very unusual results – the combined level of Con-Lab support in these polls is among the very lowest we’ve seen, and the Conservative share in the ComRes poll is almost their lowest ever (I can find only a single Gallup poll with a lower figure, from back in 1995). What we cannot tell at the moment is whether this portends a serious readjustment of the parties, or whether things will return to more familiar patterns once the European elections have passed, the Conservatives have a new leader and (assuming it ever happens) Brexit is in some way settled.

Both polls also had voting intention figures for the European Parliament elections:

Opinium Euro VI – CON 11%, LAB 21%, LDEM 12%, BREX 34%, GRN 8%, ChUK 3%, UKIP 4%
ComRes Euro VI – CON 13%, LAB 25%, LDEM 14%, BREX 27%, GRN 8%, ChUK 6%, UKIP 3%

Both have the Brexit party ahead, though they are doing considerably better with Opinium than with ComRes. In both cases the Liberal Democrats have received a post-local election boost, putting them above the Conservatives in European voting intentions.



There are three polls this weekend asking about voting intentions in the European Parliament election:

A YouGov poll conducted for Hope Not Hate has topline European election voting intentions of CON 13%(-4), LAB 22%(nc), LDEM 7%(-2), BREX 28%(+5), UKIP 5%(-1), GRN 10%(nc), ChUK 10%(+2). Fieldwork was between Tuesday and Friday, and changes are from YouGov’s previous European election poll the week before. It suggests the Brexit party continue to grow in support, largely at the expense of the Tories. Tables are here.

Opinium have topline European voting intentions of CON 14%(-3), LAB 28%(-1), LDEM 7%(-3), BREX 28%(+16), UKIP 3%(-10), GRN 6%(nc), ChUK 7%(+3). Fieldwork was Sunday to Tuesday, and changes are since the start of the month (notably, Opinium’s previous European poll was before the launch of the Brexit party, so repeats the massive transfer of support from UKIP to Brexit that we saw in YouGov’s previous poll conducted just after the Brexit party’s launch). Full tabs are here.

Finally Survation have topline figures of CON 16%, LAB 27%, LDEM 8%, BREX 27%, UKIP 7%, GRN 4%, ChUK 4%. Fieldwork was between the 17th and 25th April. Full tables are here.

All three polls have the Conservatives doing extremely badly, down in the teens. All three have the Brexit party performing strongly in the high twenties, seemingly taking over the vast majority of UKIP’s previous support (it would be unlikely that UKIP would retain any seats on the levels of support suggested here). There is more of a contrast in Labour support – YouGov have them in the low twenties, six points behind the Brexit party. Survation & Opinium have them doing better, neck-and-neck with the Brexit party for first place. Finally, Survation have Change UK on just 4%, Opinium have them on 7%, YouGov on 10%. Part of that difference will likely be down to timing – YouGov’s poll was the only one of the three polls conducted wholly after Change UK’s launch, which may well have given them at least a temporary boost.


This morning’s Times has a new YouGov poll with topline figures of CON 28%(-4), LAB 32%(+1), LDEM 11%(-1), BREXIT 8%(+3), UKIP 6%(-1), GRN 5%(+1), Change 3% (new). Fieldwork was Wednesday and Thursday and changes are since the start of April. This is the first standard YouGov poll that’s included Change UK – now they are in the process of registering as a political party I expect we’ll start to see them included in most polls.

The Conservative score of 28% is the first time YouGov have shown them dropping below 30% since 2013. While one can never be certain about what has caused changes in voting intention, it is hard to avoid the obvious conclusion that they are shedding support to more unambiguously pro-Brexit parties like UKIP and the Brexit party.

As ever, one should be cautious about reading too much into any single poll, but this is pretty much in line with other recent polling. A BMG poll last week put Labour 2 points ahead and the Conservatives down at 29%, a Survation poll this week (unusually of England & Wales only) produced a four point Labour lead. Kantar’s latest poll produced a three point Labour lead (and a startling 9 point drop in Tory support, though I suspect that was at least partially a reversion to the mean after an unusually high Tory lead in their previous poll). Across the board Conservative support seems to be falling away.

The YouGov poll also included voting intention for the European elections. Initial headline figures there are CON 16%, LAB 24%, LDEM 8%, BREXIT 15%, UKIP 14%, GRN 8%, Change 7%.

I should add some caveats here. It is, obviously, very early – the European elections have only just been announced and people are unlikely to have put much if any thought towards who they will support. This early measure however suggests that the Conservatives will, as widely predicted, suffer badly. As yet they are narrowly in second place, but I would by no means assume that will hold (not least because the Brexit party is still largely unknown and many respondents will be unaware that it is now the party of Nigel Farage rather than UKIP, so I’d expect it to gain support as it gains publicity). Equally, it remains to be seen what impact there is on Change UK support once they officially launch as a party.

Full tabs for both questions are here.


Opinion polling on Brexit has not necessarily been the best. Highly politically contentious issues do tend to attract polling that is sub-optimal, and Brexit has followed that trend. I’ve seen several Brexit polls coming up with surprising findings based on agree/disagree statements – that is, questions asked in the form:

Do you agree with the following statement? I think Brexit is great
Agree
Disagree
Don’t know

This is a very common way of asking questions, but one that has a lot of problems. One of the basic rules in writing fair and balanced survey questions is that you should try to give equal prominence to both sides of the argument. Rather than ask “Do you support X?”, a survey should ask “Do you support or oppose X?”. In practice agree-disagree statements break that basic rule – they ask people whether they agree/disagree with one side of the argument, without mentioning the other side of the argument.

In some cases the opposite side of the argument is implicit. If the statement is “Theresa May is doing a good job”, then it is obvious to most respondents that the alternative view is that May is doing a bad job (or perhaps an average job). Even when it’s as obvious as this it can still make a difference – for whatever reason, decades of academic research into questionnaire design suggest people are more likely to agree with statements than to disagree with them, regardless of what the statement is (generally referred to as “acquiescence bias”).

There is a substantial body of academic evidence exploring this phenomenon (see, for example, Schuman & Presser in the 1980s, or the recent work of Jon Krosnick). It tends to find that around 10%-20% of people will agree with both a statement and its opposite, if it is asked in both directions. Various explanations have been put forward for this in academic studies – that it is a result of personality type, or that it is satisficing (people just trying to get through a survey with minimal effort). The point is that it exists.

This is not just a theoretical issue that turns up in artificial academic experiments – there are plenty of real life examples in published polls. My favourite remains this ComRes poll for UKIP back in 2009. It asked if people agreed or disagreed with a number of statements including “Britain should remain a full member of the EU” and “Britain should leave the European Union but maintain close trading links”. 55% of people agreed that Britain should remain a full member of the EU. 55% of people also agreed that Britain should leave the EU. In other words, at least 10% of the same respondents agreed both that Britain should remain AND leave.

There is another good real life example in this poll. 42% agreed with a statement saying that “divorce should not be made too easy, so as to encourage couples to stay together”. However, 69% of the same sample also agreed that divorce should be “as quick and easy as possible”. At least 11% of the sample agreed both that divorce should be as easy as possible AND that it should not be too easy.
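Those “at least 10%” and “at least 11%” figures follow from simple inclusion-exclusion: if the two agreement shares sum to more than 100%, the excess is the minimum share of respondents who must have agreed with both. A minimal sketch (the function name is mine, not from any polling library):

```python
def min_overlap(pct_agree_a: float, pct_agree_b: float) -> float:
    """Minimum share of respondents who must have agreed with BOTH
    statements, by inclusion-exclusion: |A and B| >= |A| + |B| - 100."""
    return max(0.0, pct_agree_a + pct_agree_b - 100.0)

# ComRes/UKIP 2009: 55% agreed "remain", 55% agreed "leave"
print(min_overlap(55, 55))  # -> 10.0

# Divorce example: 42% agreed "not too easy", 69% agreed "quick and easy"
print(min_overlap(42, 69))  # -> 11.0
```

Note this is only a lower bound – the true overlap could be larger, but the published toplines alone cannot tell us by how much.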

Examples like this of polls that asked both sides of the argument and produced contradictory findings are interesting quirks – but since they asked the statement in both directions they don’t mislead. However, it is easy to imagine how they would risk being misleading if they had asked the statement in only one direction. If that poll had only asked the pro-Brexit statement, then it would have looked as if a majority supported leaving. If the poll had only asked the anti-Leave statement, then it would have looked as if a majority supported staying. With agree-disagree statements, if you don’t ask both sides, you risk getting a very skewed picture.

In practice, I fear the problem is often far more serious in published political polls. The academic studies tend to use quite neutrally worded, simple, straightforward statements. In the sort of political polling for pressure groups and campaigning groups that you see in real life the statements are often far more forcefully worded, and are often statements that justify or promote an opinion – below are some examples I’ve seen asked as agree-disagree statements in polls:

“The Brexit process has gone on long enough so MPs should back the Prime Minister’s deal and get it done”
“The result of the 2016 Referendum should be respected and there should be no second referendum”
“The government must enforce the minimum wage so we have a level playing field and employers can’t squeeze out British workers by employing immigrants on the cheap”

I don’t pick these because they are particularly bad (I’ve seen much worse), only to illustrate the difference. These are statements that are making an active argument in favour of an opinion, where the argument in the opposite direction is not being made. They do not give a reason why MPs may not want to back the Prime Minister’s deal, why a second referendum might be a good idea, why enforcing the minimum wage might be bad. It is easy to imagine that respondents might find these statements convincing… but that they might have found the opposite opinion just as convincing if they’d been presented with that. I would expect questions like this to produce a much larger bias in the direction of the statement if asked as an agree-disagree statement.

With a few exceptions I normally try to avoid running agree-disagree statements, but we ran some specially to illustrate the problems, splitting the sample so that one group of respondents was asked if they agreed or disagreed with a statement, and a second group was asked if they agreed or disagreed with a contrasting statement. As expected, it produced varied results.

For simple questions, like whether Theresa May is doing a good job, the difference is small (people disagreed with the statement that “Theresa May is doing a good job” by 57% to 15%, and agreed with the statement that “Theresa May is doing a bad job” by 52% to 18%). Almost a mirror image. On some of the other questions, the differences were stark:

  • If you asked if people agree that “The NHS needs reform more than it needs extra money” then people agree by 43% to 23%. However, if you ask if people agree with the opposite statement, that “The NHS needs extra money more than it needs reform”, then people also agree, by 53% to 20%.
  • If you ask if people agree or disagree that “NHS services should be tailored to the needs of populations in local areas, even if this means that there are differences across the country as a whole” then people agree by 43% to 18%. However, if you ask if they agree or disagree with a statement putting the opposite opinion – “NHS services should be the same across the country” – then people agree by 88% to 2%!
  • By 67% to 12% people agree with the statement that “Brexit is the most important issue facing the government and should be its top priority”. However, by 44% to 26% they also agree with the statement “There are more important issues that the government should be dealing with than Brexit”
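One way to see the pattern in the split-sample figures above is to compute net agreement (agree minus disagree) for each statement and its mirror: acquiescence bias shows up whenever both nets are positive, since genuinely consistent opinion would push them in opposite directions. A small sketch using the numbers quoted above (the data structure and labels are mine):

```python
# (agree %, disagree %) for each statement and its mirror-image opposite,
# taken from the split-sample figures quoted in the text above.
pairs = {
    "NHS reform vs money":  ((43, 23), (53, 20)),
    "NHS local vs uniform": ((43, 18), (88, 2)),
    "Brexit top priority":  ((67, 12), (44, 26)),
}

for label, (stmt, mirror) in pairs.items():
    net_a = stmt[0] - stmt[1]      # net agreement with the statement
    net_b = mirror[0] - mirror[1]  # net agreement with its opposite
    # Both nets positive = respondents "agreed" with both sides.
    contradictory = net_a > 0 and net_b > 0
    print(f"{label}: net {net_a:+d} / {net_b:+d}, contradictory={contradictory}")
```

All three pairs come out contradictory, which is exactly the acquiescence effect the experiment was designed to show.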

I could go on – there are more results here (summary, full tabs) – but I hope the point is made. Agree/disagree statements appear to produce a consistent bias in favour of the statement, and while this can be minor in questions asking simple statements of opinion, if the statements amount to political arguments the scale of the bias can be huge.

A common suggested solution to this issue is to make sure that the statements in a survey are balanced, with an equal amount of statements in each direction. So, for example, if you were doing a survey about attitudes towards higher taxes, rather than asking people if they agreed or disagreed with ten statements in favour of high taxes, you’d ask if people agreed or disagreed with five statements in favour of higher taxes and five statements in favour of lower taxes.

This is certainly an improvement, but is still less than ideal. First it can produce contradictory results like the examples above. Secondly, in practice it can often result in some rather artificial and clunky sounding questions and double-negatives. Finally, in practice it is often difficult to make sure statements really are balanced (too often I have seen surveys that attempt a balanced statement grid, but where the statements in one direction are hard-hitting and compelling, and in the other direction are deliberately soft-balled or unappetising).

The better solution is not to ask them as agree-disagree statements at all. Change them into questions with specific answers – instead of asking if people agree that “Theresa May is doing a good job”, ask if May is doing a good or bad job. Instead of asking if people agree that “The NHS needs reform more than it needs more money”, ask what people think the NHS needs more – reform or more money? Questions like the examples I gave above can easily be made better by pairing the contrasting statements, and asking which better reflects respondents’ views:

  • Asked to pick between the two statements on NHS reform or funding, 41% of people think it needs reform more, 43% think it needs extra money more.
  • Asked to pick between the two statements on NHS services, 36% think they should be tailored to local areas, 52% would prefer them to be the same across the whole country.
  • Asked to pick between the two statements on the importance of Brexit, 58% think it is the most important issue facing the government, 27% think there are more important issues the government should be dealing with instead.

So what does this mean when it comes to interpreting real polls?

The sad truth is that, despite the known problems with agree-disagree statements, they are far from uncommon. They are quick to ask, require almost no effort at all to script, and are very easy to interpret for clients after a quick headline. And I fear there are some clients to whom the problems with bias are an advantage, not an obstacle; you often see them in polls commissioned by campaigning groups and pressure groups with a clear interest in getting a particular result.

Whenever judging a poll (and this goes for observers reading them, and journalists choosing whether to report them) my advice has always been to go to the polling companies’ websites and look at the data tables – look at the actual numbers and the actual question wording. If the questions behind the headlines have been asked using agree-disagree statements, you should be sceptical. It’s a structure that does have an inherent bias, and does result in more people agreeing than if the question had been asked a different way.

Consider how the results may have been very different if the statement had been asked in the opposite direction. If it’s a good poll, you shouldn’t have to imagine that – the company should have made the effort to balance the poll by asking some of the statements in the opposite direction. If they haven’t made that effort, well, to me that rings some alarm bells.

If you get a poll that’s largely made up of agree-disagree statements, that are all worded in the direction that the client wants the respondent to answer rather than some in each direction, that use emotive and persuasive phrasing rather than bland and neutral wording? You would be right to be cautious.