Opinion polling on Brexit has not always been of the highest quality. Highly contentious political issues tend to attract sub-optimal polling, and Brexit has followed that trend. I’ve seen several Brexit polls come up with surprising findings based on agree/disagree statements – that is, questions asked in the form:

Do you agree with the following statement? I think Brexit is great
Agree
Disagree
Don’t know

This is a very common way of asking questions, but one that has a lot of problems. One of the basic rules of writing fair and balanced survey questions is that you should try to give equal prominence to both sides of the argument. Rather than ask “Do you support X?”, a survey should ask “Do you support or oppose X?”. In practice agree-disagree statements break that basic rule – they ask people whether they agree or disagree with one side of the argument, without mentioning the other side.

In some cases the opposite side of the argument is implicit. If the statement is “Theresa May is doing a good job”, then it is obvious to most respondents that the alternative view is that May is doing a bad job (or perhaps an average job). Even when it is as obvious as this, it can still make a difference – for whatever reason, decades of academic research into questionnaire design suggest people are more likely to agree with statements than to disagree with them, regardless of what the statement is (generally referred to as “acquiescence bias”).

There is a substantial body of academic evidence exploring this phenomenon (see, for example, Schuman & Presser in the 1980s, or the more recent work of Jon Krosnick). It tends to find that around 10%-20% of people will agree with both a statement and its opposite, if it is asked in both directions. Various explanations have been put forward in academic studies – that it is a result of personality type, or that it is satisficing (people just trying to get through a survey with minimal effort). The point is that it exists.

This is not just a theoretical issue that turns up in artificial academic experiments – there are plenty of real-life examples in published polls. My favourite remains this ComRes poll for UKIP back in 2009. It asked if people agreed or disagreed with a number of statements, including “Britain should remain a full member of the EU” and “Britain should leave the European Union but maintain close trading links”. 55% of people agreed that Britain should remain a full member of the EU. 55% of people also agreed that Britain should leave the EU. In other words, at least 10% of the same respondents agreed both that Britain should remain AND leave.

There is another good real-life example in this poll. 42% agreed with a statement saying that “divorce should not be made too easy, so as to encourage couples to stay together”. However, 69% of the same sample also agreed that divorce should be “as quick and easy as possible”. At least 11% of the sample agreed both that divorce should be as easy as possible AND that it should not be too easy.
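
The arithmetic behind those “at least 10%” and “at least 11%” figures is simple inclusion-exclusion: when both statements were put to the same sample and the two agreement figures sum to more than 100%, the excess can only come from people who agreed with both. A minimal sketch (the function name is just illustrative):

```python
def min_overlap(agree_a, agree_b):
    """Minimum % of respondents who must have agreed with BOTH statements,
    given the % agreeing with each. Only valid when both statements were
    asked of the same sample: any excess over 100% must be double-agreers."""
    return max(0.0, agree_a + agree_b - 100.0)

# ComRes/UKIP, 2009: 55% agreed "remain", 55% agreed "leave"
print(min_overlap(55, 55))   # 10.0 -> at least 10% agreed with both

# Divorce poll: 42% agreed "not too easy", 69% agreed "quick and easy"
print(min_overlap(42, 69))   # 11.0 -> at least 11% agreed with both
```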

Examples like these, where polls asked both sides of the argument and produced contradictory findings, are interesting quirks – but because they asked the statement in both directions they do not mislead. However, it is easy to imagine how they would risk misleading if they had asked the statement in only one direction. If the ComRes poll had only asked the pro-Leave statement, it would have looked as if a majority supported leaving. If it had only asked the pro-Remain statement, it would have looked as if a majority supported staying. With agree-disagree statements, if you don’t ask both sides, you risk getting a very skewed picture.

In practice, I fear the problem is often far more serious in published political polls. The academic studies tend to use quite neutrally worded, simple, straightforward statements. In the sort of political polling for pressure groups and campaigning groups that you see in real life the statements are often far more forcefully worded, and are often statements that justify or promote an opinion – below are some examples I’ve seen asked as agree-disagree statements in polls:

“The Brexit process has gone on long enough so MPs should back the Prime Minister’s deal and get it done”
“The result of the 2016 Referendum should be respected and there should be no second referendum”
“The government must enforce the minimum wage so we have a level playing field and employers can’t squeeze out British workers by employing immigrants on the cheap”

I don’t pick these because they are particularly bad (I’ve seen much worse), only to illustrate the difference. These are statements that are making an active argument in favour of an opinion, where the argument in the opposite direction is not being made. They do not give a reason why MPs may not want to back the Prime Minister’s deal, why a second referendum might be a good idea, why enforcing the minimum wage might be bad. It is easy to imagine that respondents might find these statements convincing… but that they might have found the opposite opinion just as convincing if they’d been presented with that. I would expect questions like this to produce a much larger bias in the direction of the statement if asked as an agree-disagree statement.

With a few exceptions I normally try to avoid running agree-disagree statements, but we ran some specially to illustrate the problems, splitting the sample so that one group of respondents was asked if they agreed or disagreed with a statement, and a second group was asked if they agreed or disagreed with a contrasting statement. As expected, it produced varied results.
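
For readers curious about the mechanics, a split-sample design simply assigns each respondent at random to see exactly one version of the statement, so the two halves differ only in the wording they saw. A minimal sketch of the idea, with hypothetical respondent IDs (not our actual survey-scripting system):

```python
import random

def assign_split_sample(respondent_ids, statement_a, statement_b, seed=1):
    """Randomly assign each respondent exactly one of two contrasting
    statements; comparing the two halves isolates the wording effect."""
    rng = random.Random(seed)
    ids = list(respondent_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    assignment = {}
    for rid in ids[:half]:
        assignment[rid] = statement_a
    for rid in ids[half:]:
        assignment[rid] = statement_b
    return assignment

groups = assign_split_sample(
    range(2000),  # hypothetical respondent IDs
    "Theresa May is doing a good job",
    "Theresa May is doing a bad job",
)
```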

For simple questions, like whether Theresa May is doing a good job, the difference is small (people disagreed with the statement that “Theresa May is doing a good job” by 57% to 15%, and agreed with the statement that “Theresa May is doing a bad job” by 52% to 18% – almost a mirror image). On some of the other questions, the differences were stark:

  • If you ask if people agree that “The NHS needs reform more than it needs extra money” then people agree by 43% to 23%. However, if you ask if people agree with the opposite statement, that “The NHS needs extra money more than it needs reform”, then people also agree, by 53% to 20%.
  • If you ask if people agree or disagree that “NHS services should be tailored to the needs of populations in local areas, even if this means that there are differences across the country as a whole” then people agree by 43% to 18%. However, if you ask if they agree or disagree with a statement putting the opposite opinion – “NHS services should be the same across the country” – then people agree by 88% to 2%!
  • By 67% to 12% people agree with the statement that “Brexit is the most important issue facing the government and should be its top priority”. However, by 44% to 26% they also agree with the statement “There are more important issues that the government should be dealing with than Brexit”.
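
One simple way to summarise these paired results is to compute net agreement (agree minus disagree) for each direction: if two contradictory statements both attract positive net agreement, the question format, not public opinion, is doing the work. A sketch using the figures above:

```python
# (label, % agree A, % disagree A, % agree B, % disagree B)
pairs = [
    ("NHS: reform vs extra money", 43, 23, 53, 20),
    ("NHS: local vs uniform",      43, 18, 88, 2),
    ("Brexit: top priority?",      67, 12, 44, 26),
]

for label, agree_a, dis_a, agree_b, dis_b in pairs:
    net_a, net_b = agree_a - dis_a, agree_b - dis_b
    # Opposite statements should not BOTH attract net agreement
    contradictory = net_a > 0 and net_b > 0
    print(f"{label}: net {net_a:+d} vs {net_b:+d}, contradictory={contradictory}")
```

All three pairs come out contradictory, which is exactly the acquiescence effect described above.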

I could go on – there are more results here (summary, full tabs) – but I hope the point is made. Agree/disagree statements appear to produce a consistent bias in favour of the statement, and while this can be minor in questions asking simple statements of opinion, if the statements amount to political arguments the scale of the bias can be huge.

A common suggested solution to this issue is to make sure that the statements in a survey are balanced, with an equal number of statements in each direction. So, for example, if you were doing a survey about attitudes towards higher taxes, rather than asking people if they agreed or disagreed with ten statements in favour of high taxes, you’d ask if people agreed or disagreed with five statements in favour of higher taxes and five in favour of lower taxes.

This is certainly an improvement, but it is still less than ideal. First, it can produce contradictory results like the examples above. Secondly, in practice it often results in rather artificial, clunky-sounding questions and double negatives. Finally, it is often difficult to make sure the statements really are balanced (too often I have seen surveys that attempt a balanced statement grid, but where the statements in one direction are hard-hitting and compelling, and those in the other direction are deliberately soft-balled or unappetising).

The better solution is not to ask them as agree-disagree statements at all. Change them into questions with specific answers – instead of asking if people agree that “Theresa May is doing a good job”, ask if May is doing a good or bad job. Instead of asking if people agree that “The NHS needs reform more than it needs more money”, ask what people think the NHS needs more – reform or extra money. Questions like the examples above can easily be made better by pairing the contrasting statements and asking which better reflects respondents’ views:

  • Asked to pick between the two statements on NHS reform or funding, 41% of people think it needs reform more, 43% think it needs extra money more.
  • Asked to pick between the two statements on NHS services, 36% think they should be tailored to local areas, 52% would prefer them to be the same across the whole country.
  • Asked to pick between the two statements on the importance of Brexit, 58% think it is the most important issue facing the government, 27% think there are more important issues the government should be dealing with instead.

So what does this mean when it comes to interpreting real polls?

The sad truth is that, despite the known problems with agree-disagree statements, they are far from uncommon. They are quick to ask, require almost no effort to script, and are very easy to interpret for clients after a quick headline. And I fear there are some clients to whom the problems with bias are an advantage, not an obstacle; you often see them in polls commissioned by campaigning groups and pressure groups with a clear interest in getting a particular result.

Whenever judging a poll (and this goes for observers reading them, and journalists choosing whether to report them) my advice has always been to go to the polling company’s website and look at the data tables – look at the actual numbers and the actual question wording. If the questions behind the headlines have been asked as agree-disagree statements, you should be sceptical. It is a structure that has an inherent bias, and it does result in more people agreeing than if the question had been asked a different way.

Consider how the results may have been very different if the statement had been asked in the opposite direction. If it’s a good poll, you shouldn’t have to imagine that – the company should have made the effort to balance the poll by asking some of the statements in the opposite direction. If they haven’t made that effort, well, to me that rings some alarm bells.

If you get a poll that is largely made up of agree-disagree statements, all worded in the direction the client wants respondents to answer rather than some in each direction, and using emotive and persuasive phrasing rather than bland and neutral wording? You would be right to be cautious.


There are two polls in this morning’s papers – Survation in the Mail and YouGov in the Times.

Survation have topline figures of CON 35%(-5), LAB 39%(+3), LDEM 10%(nc), UKIP 5%(nc). Fieldwork was on Friday, and changes are from mid-February.
YouGov have topline figures of CON 35%(-5), LAB 31%(nc), LDEM 12%(+1), UKIP 6%(+3). Fieldwork was Thursday to Friday, and changes are from the start of March.

The overall leads are different, but that’s to be expected (Survation tend to produce figures that are better for Labour than most pollsters, YouGov tend to produce figures that are better for the Conservatives). The more interesting thing is what they have in common – both are showing a significant drop in Conservative support. As ever, it is worth waiting for other polls to show a similar trend before putting too much weight on it, but on first impressions it looks as though the ongoing chaos over Brexit may be starting to eat into Tory support.



There were two polls in the Sunday papers. ComRes had a poll conducted for BrexitExpress (a pro-Brexit pressure group) prominently but poorly reported in the Sunday Telegraph. The voting intention question included The Independent Group as an option, producing topline figures of CON 36%(-2), LAB 34%(-3), LDEM 8%(-2), TIG 8%(+8), UKIP 6%(nc). Most polling companies are not, at present, including the Independent Group in polls – something that will presumably change as they take steps towards actually forming a party and clarifying their future intentions. The tables for the poll are here.

The Sunday Telegraph headlined on a finding that 44% of people agreed with a statement that “If the EU refuses to make any more concessions, the UK should leave without a deal”, suggesting rather more support for no deal than almost all other polls. I would advise a lot of scepticism here – agree/disagree statements are a rather suboptimal approach towards asking polling questions (I’ve written about them before here) that tend to produce a bias in the direction of the statement. The problem is they give only a single side of the argument – the question only asked people if they agreed with a statement supporting leaving with no deal in those circumstances. It did not offer people alternative options like a delay, or accepting the deal, or having a referendum. One can imagine that a poll asking “In the event that the EU does not agree to further changes to the deal, what do you think should happen?” would have produced rather different answers. Indeed, later on the survey asked which outcomes people thought would be best for the UK economy and best for UK democracy, which produced rather more typical results.

Note also that the Sunday Telegraph’s claim that the poll showed an increase in support for No Deal is not accurate – the poll back in January asked a differently worded question (it was structured as support/oppose, rather than an agree/disagree statement, and was in a grid along with other options) so they are not directly comparable.

As well as the ComRes poll there was a BMG poll hidden away in the Independent. The figures were, unusually, reported without excluding don’t knows or won’t votes, with the Conservatives on 31%, Labour on 27% and the Liberal Democrats on 8%. According to the Independent the Conservative lead is five points once don’t knows are excluded – that implies something along the lines of Con 40%, Lab 35% and Lib Dem 10% – though the full figures are yet to appear on the BMG website.
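
For anyone wanting to check those implied figures, excluding don’t knows is just a matter of renormalising the party shares so they sum to 100% of those naming a party. A sketch; the figure I’ve plugged in for “Other” is my own assumption, chosen so the output reproduces the implied five-point lead, since the full tables weren’t yet available:

```python
def exclude_dont_knows(shares):
    """Renormalise party shares quoted as % of the full sample (including
    don't knows) so that they sum to 100% of those naming a party."""
    total = sum(shares.values())
    return {party: round(100 * s / total) for party, s in shares.items()}

# BMG as reported; the ~11.5% for "Other" is an assumption on my part
raw = {"Con": 31, "Lab": 27, "LDem": 8, "Other": 11.5}
print(exclude_dont_knows(raw))
# -> {'Con': 40, 'Lab': 35, 'LDem': 10, 'Other': 15}
```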


There are two new voting intention polls in the Sunday papers, tackling the issue of measuring TIG support in different ways…

Deltapoll for the Mail on Sunday have standard voting intentions of CON 43%, LAB 36%, LDEM 6%, UKIP 5%. Respondents were then asked how they would vote if The Independent Group put up candidates at the next election – voting intention under those circumstances switches to CON 39% (four points lower), LAB 31% (five points lower), TIG 11%, LDEM 5% (one point lower). The implication is that the Independent Group are taking some support from both Labour and the Conservatives though, as we saw with the YouGov poll earlier in the week, it’s not necessarily as simple as a direct transfer – part of the difference may well be people saying don’t know. Fieldwork was between Thursday and Saturday, full results are here.
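
As a rough check on where TIG support appears to come from, one can simply difference the two scenarios – though, as noted, this is only suggestive: pinning down real transfers needs respondent-level data. A sketch with the Deltapoll figures:

```python
standard = {"Con": 43, "Lab": 36, "LDem": 6, "UKIP": 5}
with_tig = {"Con": 39, "Lab": 31, "LDem": 5, "UKIP": 5, "TIG": 11}

# Net change for each party when TIG is offered as an option
changes = {p: with_tig.get(p, 0) - standard.get(p, 0) for p in with_tig}
print(changes)
# -> {'Con': -4, 'Lab': -5, 'LDem': -1, 'UKIP': 0, 'TIG': 11}

# The drops sum to only 10 points while TIG stands at 11: the aggregate
# shifts cannot fully account for TIG's support, consistent with some
# churn via "don't know" rather than a simple one-for-one transfer.
```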

Opinium for the Observer meanwhile only asked their standard voting intention question, but have begun including TIG in that. This flags up an interesting dilemma for polling companies. The Independent Group are obviously not a political party. While the widespread expectation is that at some point in the future they will become a political party, they aren’t registered as one yet, and aren’t putting up candidates yet. This means that most polling companies are asking hypothetical questions about the level of support they would get if they did stand, but are not currently including them in standard voting questions.

Opinium, however, are offering them as a current option – presumably their thinking is that it is only a matter of time before they register, and if poll respondents’ intention is already to vote for them when they do, it should be recorded. The approach Opinium has taken will clearly be the correct one once TIG do evolve into a political party; the question is whether it is too early to do it now. Either way, for what it’s worth, Opinium’s first polling figures with TIG included as an option are CON 40%(+3), LAB 32%(-5), LDEM 5%(-3), TIG 6%(+6), UKIP 7%(nc). Fieldwork was Wednesday to Friday, and changes are from a week ago. Full results are here.

To be complete, as well as the SkyData and Survation polls I’ve already written about here, which showed TIG support at 10% and 8% respectively, there was also a YouGov poll midweek. That found standard topline figures of CON 41%, LAB 33%, LDEM 10%, and hypothetical figures of CON 38%, LAB 26%, LDEM 7%, TIG 14% (full write-up here). Overall that means, depending on the questions asked and approaches taken, the initial level of support for TIG seems to be somewhere between 6% and 14%.


Today we’ve had the first two polls asking people whether they’d support The Independent Group were they to stand candidates.

Survation in the Daily Mail asked how people would vote if there was “a new centrist party opposed to Brexit”, producing voting intention figures of CON 39%, LAB 34%, LDEM 6%, “New centrist party” 8%, UKIP 5%. In comparison, the normal voting intention figures in the poll were CON 40%, LAB 36%, LDEM 10%, UKIP 5%, suggesting the new party would take some support from both Labour and the Conservatives, though the largest share would come from the Liberal Democrats. Tables are here.

SkyData, who do not typically publish voting intention figures, asked how people would vote if the “new Independent Group of former Labour MPs” were standing, and found voting intention figures of CON 32%, LAB 26%, TIG 10%, LDEM 9%, UKIP 6%. We don’t have standard voting intention figures to compare here, but on the face of it, it also looks as if support is coming from both Labour and Conservative, though the level of Lib Dem support appears to be holding up better than in the Survation poll. Note that the lower figures overall appear to be because of an unusually high figure for “others” (possibly because SkyData do not offer respondents the ability to answer don’t know). Tables are here.

These polls are, of course, still rather hypothetical. “The Independent Group” is not a political party yet (assuming it ever becomes one). It doesn’t formally have a leader yet, or any policies. We don’t yet know how it will co-exist with the Liberal Democrats. As of Tuesday night it only has former Labour MPs, though the rumour mill expects some Conservative MPs to join sooner rather than later.

Nevertheless, it is more “real” than the typical hypothetical polls asking about imaginary centrist parties. Respondents at least have some names, faces and context to base their answers upon, and it gives us a baseline of support. We won’t really know for sure until (and unless) the Independent Group transforms into a proper party and becomes just another option in standard voting intention polls.