Sunday, October 10, 2010

Lib Dems and pollsters: what might have gone wrong, where, and why

UK Polling Report did a nice, easy-to-read post on why polling in the UK general election this past May went so horribly wrong, overestimating Lib Dem support at the expense of both the Conservative and Labour parties.

Here are some excerpts:
.... All of the companies who released polls within a few days of the general election overstated the level of Lib Dem support, by between 2 and 5 points, but the reasons are still unclear, and will probably remain so until pollsters get to test their methods against the next general election. Basically the possible reasons boil down to people changing their intentions, pollsters misjudging their intentions, or pollsters polling unrepresentative groups of people.

.... Final polls strongly suggested that the Lib Dem poll was soft – several companies included questions on whether people might still change their mind and found Lib Dem voters were still unsure. However, just because people are uncertain doesn’t mean they necessarily will change their mind. If there had been a late swing away from the Lib Dems, then it should have been picked up by recontact surveys after the election. Angus Reid, YouGov and ICM all recontacted people who were polled late in the campaign to see if people who told them they would vote Lib Dem changed their minds after the final polls – all found negligible levels of late swing.

... Clearly the reason for the error could have been don’t knows breaking disproportionately for Labour and Conservatives. ICM’s recontact survey found don’t knows broke disproportionately for Labour and their topline adjustment made their figures more accurate, MORI’s squeeze question also boosted Labour – so it would seem the pollsters got this one right. YouGov don’t use any reallocation of don’t knows – but their re-contact survey showed don’t knows split pretty evenly (there are likely to be different patterns of saying don’t know when there is no human interviewer).
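For those curious what a "topline adjustment" like ICM's actually does mechanically: it hands some fraction of the don't knows back to the parties, typically based on how those respondents say they voted last time. Here's a minimal sketch of that idea; the 50% fraction and all the sample numbers are made-up illustrations, not any pollster's actual formula.

```python
# Hypothetical sketch of a "don't know" reallocation in the spirit of the
# topline adjustments discussed above. The 0.5 reallocation fraction and the
# sample counts are illustrative assumptions, not ICM's real methodology.

def reallocate_dont_knows(stated, dont_know_past_vote, fraction=0.5):
    """Add a fraction of each party's past-vote 'don't knows' to its tally,
    then return topline percentages."""
    adjusted = dict(stated)
    for party, n in dont_know_past_vote.items():
        adjusted[party] = adjusted.get(party, 0) + fraction * n
    total = sum(adjusted.values())
    return {party: 100.0 * n / total for party, n in adjusted.items()}

# Fictional sample of respondents:
stated = {"Con": 330, "Lab": 260, "Lib": 250}        # declared voting intention
dont_know_past = {"Con": 60, "Lab": 60, "Lib": 40}   # don't knows, by past vote

print(reallocate_dont_knows(stated, dont_know_past))
```

The point of the adjustment is visible in the output: parties whose past voters are now saying "don't know" get a nudge upward, which is exactly why ICM's figures for Labour came out more accurate than its raw numbers.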

I highly recommend you go over and take a look for yourself, as it also says a lot about some of the methods that pollsters here in Canada use. Given that in the last few days of the 2008 campaign the majority of pollsters here were a bit off as well (some also overestimated NDP support), it's interesting to see the parallels in the numbers, not to mention that some of the companies over there are the same ones we have here, more or less.

Also - for those who read this blog and are interested in UK politics, to the right I stole another thing off of UK Polling Report, and added YouGov's daily poll widget. Easy enough to figure out: "Con" is the governing Conservative Party led by PM David Cameron; "Lab" is the Opposition Labour Party, led by Ed Miliband, whom I've talked about here; "Lib" is the Liberal Democrats, led by Nick Clegg and in a governing coalition with the Conservatives; and "Oth" is the "other" parties, which include the Northern Ireland parties, the Celtic nationalist parties, and others like the Green Party and UKIP.

If anyone knows of a similar widget for Canadian politics, I'd love to have that up as well!


  1. Every pollster except Angus under-polled the CPC?

    Many of them over-polled the Liberal, Dem, and Green vote.

    Demographics: over 70% turnout among the 65+ group. Which party has that group?

    Which demographic is getting bigger in the next five to ten years?

    What is the life expectancy of the 65+ group?

    Canada is tilting right (Conservative), compliments of demographics + immigration.

  2. Angus was actually very accurate with their results in regard to both the CPC and LPC last time, with 37-27 being close to the actual results (37.6-26.4), off by more or less one point. And unlike others, they managed to nail down Green support as well. They also, like other pollsters, overestimated NDP support. All in all, they had a good day on October 14th.

    But they did not have a good day on May 6, 2010. Their last results, using the same methodology (which they've tweaked since 2008), were 36-29-24, Con-Lib-Lab. Way, way off. And since 2008, they've done this several times in several jurisdictions, which makes me suspicious of any new methodology they come out with.

    You can say that Angus Reid fell into the same trap as the others, but the pattern has been consistent in their recent polling, or at least since I started taking note at the beginning of 2009: their polls seem to buck trends consistently. That's unlike 2008, when they were more or less in line with everyone else, if not a little more accurate on or near polling day, save for the lack of large Green numbers. Even in that UK example, they had Labour much, much lower than anyone else on polling day. That's disconcerting, given how accurate they were on Oct. 14th in terms of Liberal support.

    The only way to tell, really, is the next election. If Angus Reid has changed their methodology or weighting and the result is more inaccuracy, we'll see it then. I like to hedge my bets based on their recent polling, not polling done two years ago, eh.