Both Mike Smithson and I regularly criticise the BBC’s Daily Politics for commissioning political polls that are not politically weighted. The vast majority of political polls include a voting intention question and (with the exception of MORI, who don’t believe past vote is suitable for weighting in the first place) these are always weighted to be representative politically as well as demographically, normally with reference to how people claim they voted back in 2005.

In contrast, the BBC invariably commission polls without voting intention questions (since their producer guidelines say they need special permission to ask one) and these are invariably NOT politically weighted, presumably because past vote weighting requires an extra question and therefore costs more. Does this matter? Does it make a difference? Well, for this week’s Daily Politics ComRes asked exactly the same question in their non-politically weighted poll as they did last week in a politically weighted poll for the Indy.

In the ComRes poll for the Independent, which WAS weighted by past vote, Brown & Darling were most trusted by 35%, Cameron & Osborne were most trusted by 33%, Clegg & Cable by 7%.

In the ComRes poll for the BBC’s Daily Politics, which WASN’T weighted by past vote, Brown & Darling were most trusted by 32%, Cameron & Osborne were most trusted by 23%, Clegg & Cable by 4%.

As you can see, while Labour are much the same, there is a huge 10 point difference between the two Conservative scores. It’s possible, of course, that confidence in Cameron & Osborne massively slumped in the 7 days between the two sets of fieldwork, but this doesn’t seem likely given that subsequent national polls haven’t shown any such drastic slump. In fact I’m certain it’s because one was politically weighted and one wasn’t. Past vote weighting of phone polls invariably involves making the sample more Conservative and less Labour, and the cross-breaks of the ComRes poll for the Indy show that answers to this question are very closely correlated with voting intention – 72% of Tory voters answer Cameron & Osborne and 80% of Labour voters answer Brown & Darling.

If one believes that political weighting is necessary to get a representative sample for voting intention, then it should also be necessary for other polls where people are asked to compare the political parties or where answers are likely to be aligned to party loyalty.
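The arithmetic behind this can be sketched with some back-of-the-envelope numbers. The figures below are invented for illustration (they are not the actual ComRes tables): a raw phone sample that recalls its 2005 vote as too Labour-heavy is cell-weighted so its past-vote shares match targets close to the actual 2005 result, and the headline answer moves because the question is correlated with party loyalty.

```python
# Illustrative sketch of cell weighting by recalled 2005 vote.
# All numbers are invented; they are not real polling data.

# Recalled 2005 vote shares in the raw sample (too Labour-heavy,
# as unweighted phone samples typically are).
raw_shares = {"Lab": 0.42, "Con": 0.27, "LD": 0.21, "Other": 0.10}

# Weighting targets, roughly the shape of the actual 2005 result.
targets = {"Lab": 0.36, "Con": 0.33, "LD": 0.23, "Other": 0.08}

# Each past-vote group's weight = target share / raw share.
weights = {p: targets[p] / raw_shares[p] for p in raw_shares}

# Hypothetical rate at which each group answers "Cameron & Osborne"
# (answers track party loyalty, as in the Indy cross-breaks).
backs_tories = {"Lab": 0.05, "Con": 0.72, "LD": 0.15, "Other": 0.20}

unweighted = sum(raw_shares[p] * backs_tories[p] for p in raw_shares)
weighted = sum(raw_shares[p] * weights[p] * backs_tories[p]
               for p in raw_shares)

print(f"Unweighted: {unweighted:.1%}")  # about 27%
print(f"Weighted:   {weighted:.1%}")    # about 31%
```

With these made-up figures the Conservative pair gain roughly four points purely from the political weighting, even though no individual respondent’s answer has changed.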

UPDATE: Andrew Hawkins of ComRes has asked me to link to his comparison a month or two back of a poll he did where political weighting did not make a difference, and I am happy to do so here. He has also let me know that political weighting made a 2 point difference to the question when it was asked in the Independent (I’m not sure if that is the Conservatives 2 points higher, or the lead 2 points different, or whatever). Presumably recalled past vote was not asked for the BBC version of the question, so we can’t tell how much difference that would have made to the answer.

17 Responses to “The difference political weighting makes”

  1. Anthony,

    I’m sure your analysis is correct, but this doesn’t mean that research which is not designed to predict voting behaviour (and I have never watched Daily Politics, so I don’t know how they present it) should be weighted so that the sample reflects the profile of those voting (or not voting) in the population.

    I accept what you say about the result to such a question likely being aligned to party loyalty, but it seems to me a perfectly legitimate approach to define your population by the ‘usual’ demographics (for weighting purposes), and not to use past vote.

    For instance, if you asked a question on fox hunting should you also apply a past vote weight? You could argue that attitudes may be correlated with party loyalty, but it’s not at all clear that you would still apply political weighting, as all you’re doing is measuring attitudes in the population at large. How this translates into voting intention is something quite separate.

    I’m not saying that applying a past vote weight is right or wrong, but that it’s legitimate to define your population without reference to past voting behaviour if you’re not trying to predict future voting behaviour.

  2. If your poll is trying to represent the British population, then it should in a perfect world accurately reflect the population in every way. A sample should be as close to an actual microcosm of the population as humanly possible.

    In reality however, you don’t need it to be that perfect. Theoretically a truly representative sample would have, for example, the correct proportion of left handed people, or people with ginger hair. However, I can’t imagine many questions where left-handedness or gingerness would make a difference, so there is no need to worry if the sample isn’t right. I would suspect that a sample that was actually weighted by left-handedness would give you exactly the same answers as one that wasn’t. There is also no reason to think that a normal sample would have any sort of intrinsic bias on left-handedness that would require weighting to correct. Weighting becomes necessary when it makes a difference.

    To take your example, a sample of the population at large should be politically representative if it is a good sample. If it is not politically representative, then it is not a good sample of the population at large, it’s a politically biased one.

    Now, that bias may not matter. If you are asking about soap powder there’s not likely to be any correlation: a politically weighted sample should give the same result as a non-politically weighted sample, so while it might theoretically make it a more representative sample, weighting it that way would be a waste of time and money. In anything touching on politics though, even away from actual voting intentions, political viewpoint is likely to make a difference.

    If weighting something to match a known, accurate target makes a difference to the answer, then not weighting by it means your sample is not accurately reflecting reality. In short, if weighting by past vote makes a difference, it should be done. If it doesn’t make a difference, then it is unnecessary.

    (All of this, of course, is assuming you think past vote weighting is a legitimate way to do political weighting in the first place. If you think past vote weighting leads to a more representative sample for voting intention, it follows it will do the same for other questions. If like MORI you think it’s not suitable and there isn’t a good way to do political weighting, then you’d think it was a bad idea across the board).
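The soap-powder point above can be checked numerically. A minimal sketch (all figures invented): when the answer rate is the same in every past-vote group, political weighting leaves the headline figure untouched; when answers track past vote, it moves it.

```python
# Sketch: weighting only moves the headline figure when the answer
# is correlated with the weighting variable. Invented numbers.

raw_shares = {"Lab": 0.42, "Con": 0.27, "LD": 0.21, "Other": 0.10}
targets = {"Lab": 0.36, "Con": 0.33, "LD": 0.23, "Other": 0.08}
weights = {p: targets[p] / raw_shares[p] for p in raw_shares}

def headline(rate_by_group, use_weights):
    """Share answering 'yes', with or without past-vote weighting."""
    return sum(raw_shares[p] * (weights[p] if use_weights else 1.0)
               * rate_by_group[p] for p in raw_shares)

# A 'soap powder' question: the same answer rate however people voted.
neutral = {p: 0.55 for p in raw_shares}
# A partisan question: answers track past vote.
partisan = {"Lab": 0.80, "Con": 0.10, "LD": 0.30, "Other": 0.30}

print(headline(neutral, False), headline(neutral, True))    # identical
print(headline(partisan, False), headline(partisan, True))  # ~4 points apart
```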

  3. The question arises as to whether the BBC policy is intentional.

    1) There is reasonable cause to suspect that the BBC is institutionally pro-Labour (as has been stated by many BBC broadcasters & executives).

    2) The BBC chooses to use a polling approach which overstates the support for Labour.

    Am I alone in considering that the justification for 2 may be found in 1?

  4. Weighting is just another term for excuses for vote fixing.

  5. Thanks very much for your reply.

    I suppose my ambivalence towards political weighting as an approach means I can see how you could construct an argument for both methods giving representative outcomes.

    But I fully accept your argument about consistency of approach.

  6. In the ComRes poll for the BBC’s Daily Politics, which WASN’T weighted by past vote, Brown & Darling were most trusted by 32%, Cameron & Osborne were most trusted by 23%, Clegg & Cable by 4%.

    So let us get this straight. Labour voters trust Rusty (and sock-puppet), and everyone else are unsure about anyone else sorting out the economy….?

    Given McBean’s legacy, you consider this a good poll for nEU Labour? Why…?

    I assume you are just waiting for the IPSOS poll about the economic outlook (as mentioned by Mr this morning)? If so, do what we all do: wait for the latest polling information, digest it and then comment upon it – assuming that is worth it.

    I understand that the next fifteen months look pretty dull – Tory landslide and all – but can we be surgical in our comments? Let’s look at polls not for trends, methodology or financing-interest, but for the peculiarities held within. The rest we should leave to Mike’s playpen…! :P

  7. Cyberkast

    To take your example of fox-hunting, for the sample to be representative, it would need to have a correct mix of townies, suburbanites and country folk. Otherwise, one could easily produce a skewed result in either direction.

    The Beeb no doubt would opt for a metropolitan poll since that would in any case be cheaper to produce.

    I suspect that Cynosarges (no relation to cyberkast ?) makes a valid point about the Beeb’s preference for polls which support its own pre-conceptions.


  8. BBC coverage is governed not by any bias, institutional or otherwise, but by the flow of information. In the Tory years Labour politicians and supporters were never done complaining about the Thatcherite BBC, particularly with regard to things like the Falklands and the miners’ strike.

    Since Labour took power it’s been the Tories that have been complaining.

    For my party there was no end of people crying foul for years about the Unionist plots of the London BBC, but that has subsided since 2007 when we became the Government.

    The BBC follows the news and its narrative, and a government makes the news and is part of the narrative. A good government, or at least one with good PR, can set and control the narrative and with it, with luck, the way the media represents it.

    I wouldn’t go as far as saying the BBC is an innocent victim, but for me what we are seeing isn’t pro party bias but pro government bias both north and south of the border.

    That or paranoid Tories seeing reds under every bed.


    What’s your take on the BBC: flawed professionals, dupes for the government, or closet trots?


  9. Paul,

    Thanks for your post. I understand the need for a sample to be representative and for the need for weighting (it happens to be my day job).

    I suspect the controversy about political weighting is that you’re weighting to something which is in itself difficult to accurately measure, i.e. past vote. (It can be difficult enough to get respondents to accurately tell you what they did yesterday, let alone 4 years ago). That said, anything which might reduce bias is generally a good thing. Hence my ambivalence.

    Whether someone lives in a rural or urban area is relatively straightforward to measure and geographical spread is usually one of the first things that is controlled for in an opinion poll in any case.

    I’m not a subscriber to the view that the BBC deliberately commissions polls that will give left wing answers. But if they (or anyone else) did – it would be the responsibility of ComRes, being members of the MRS (I assume) and BRC (I know), to ensure that the research results given were obtained using a sound methodology – including weighting.

  10. It’s not related to this poll

    Just did the next ‘random’ YouGov poll, not the brands one; ALL the options went from a ‘best’ to ‘worst’ situation for every question. The order ought to have been randomised; the results from that survey will obviously be biased towards the options in the first three choices. I could see what happened and I got bored reading the options.

    Seriously, a very bad, in fact laughable, survey. It’s worthy of BAD SCIENCE in the Guardian or similar. Seriously, YouGov should have done better (it’s like a really bad one from some years ago which asked whether I drink Coca-Cola or Pepsi, with no third option).

    Somewhat annoyed, and a useful reminder that YouGov is human…

  11. Vince Cable and Nick Clegg on 4% as most trusted to run the economy.

    Amazing when you consider that almost all politicians of all parties and every political pundit acknowledge that Vince Cable is the most capable politician on the economy.

    Does this poll have any credibility?

  12. @ Cyberkast – as a country bumpkin who is opposed to fox-hunting, I recall at least one poll being done some years ago which found that opposition to fox-hunting was more or less at the same levels in both town and country. Which raises the issue of whether some “obvious” varieties of weighting are really at all useful, since in this instance they relied on assumptions based on stereotypes that turned out to be inaccurate.

    I think political weighting is useful in political polls but where it’s based on assumptions about particular demographic groups (Anthony’s examples of left-handers and gingerheads, or “townspeople vs. countryfolk”) it can be utterly pointless and says more about the pollsters’ assumptions than the opinions of those being polled.

  13. @Paul H-J

    No relation to Cyberkast

    The Cynosarges was the gymnasium outside Athens where Antisthenes & Diogenes developed the teachings of Socrates into the philosophy of Cynicism. Given the incompetence and corruption evident throughout the politicians and the nomenklatura that afflict this country, I felt the birthplace of cynicism was a good nickname to use in my postings about the political scene.

  14. Good article, Anthony. I had always wondered why the Daily Politics present poll results which are so consistently different from official national polls. It raises the question of whether the producers of the show are actually aware of this, because the presenters never seem to acknowledge that there does seem to be a difference between national trends and their own polls. It’s usually left to the guests to mention that the results are slightly different to similar polls during the same time period.

  15. Am I wrong, or do BBC News (or at least their News website) have a policy – introduced around the time of the slide away from Labour in 2007 – of not reporting political opinion poll results?

    Putting that aside, from a statistical point of view I’m surprised at the extent to which professional pollsters are using past voting responses as a weighting factor – but impressed by the extent to which it does actually appear to work – at least in predicting future voting.

    Perhaps the point is that with any sample, if you can find a relevant prior benchmark against which to test the sample you can re-weight the sample to make it more representative – but past voting may not be as relevant to polls about other questions?

  16. New YouGov/Sunday Times Scottish voting intention poll!! (rare event)

    Sample size = 1500

    Here are the changes compared to the YouGov/Sunday Times 22-24 October 2008:

    SNP 38% (-1)
    Lab 32% (+1)
    Con 13% (-1)
    Lib 12% (nc)

    SNP 34% (+2)
    Lab 28% (-1)
    Con 15% (-1)
    Lib 11% (nc)
    Greens 6% (nc)
    Scottish Socialists 4% (nc)
    Solidarity 1% (nc)
    Others 2% (nc)

    Lab 37% (-1), SNP 27% (-2)

  17. post in moderation Anthony?