Tuesday, April 24, 2012

#AbVote - Poll and Projection Analysis

So last night wasn't the greatest endorsement of my projection system ever. I had predicted, at the very least, a Wildrose government, with a hefty PC opposition. This was based on every single poll, including the famous last-minute Forum poll, up until April 23rd.

So, indeed, let us deal with the polls first. What happened? The polls clearly showed a narrowing gap since the debate, represented by my polling average in the graph below. However, the gap never closed all that much - even the Forum poll only moved the average a percentage point. Let's make it clear as well: Forum only had the one poll, and one poll does not a result make - in most cases. While Forum still got the winner wrong, they definitely showed a severe swing towards the PCs. How was anyone supposed to know they were closer to the truth than the 8-10 point leads for Wildrose in other polls?
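For readers curious about the mechanics, here is a minimal sketch of the kind of rolling polling average described above. The poll numbers are invented placeholders, not the actual 2012 Alberta polls, and the exponential recency weighting is my illustration of one common approach - the post doesn't pin down the exact averaging method.

```python
from datetime import date

# Illustrative placeholder polls: (field date, Wildrose %, PC %).
# These are NOT the real 2012 numbers.
polls = [
    (date(2012, 4, 16), 41.0, 32.0),
    (date(2012, 4, 19), 40.0, 33.0),
    (date(2012, 4, 22), 38.0, 36.0),  # a late poll showing the gap narrowing
]

def weighted_average(polls, as_of, half_life_days=5.0):
    """Average each party's share, down-weighting older polls exponentially."""
    totals = [0.0, 0.0]
    weight_sum = 0.0
    for field_date, wrp, pc in polls:
        age_days = (as_of - field_date).days
        w = 0.5 ** (age_days / half_life_days)  # recency decay
        totals[0] += w * wrp
        totals[1] += w * pc
        weight_sum += w
    return totals[0] / weight_sum, totals[1] / weight_sum

wrp_avg, pc_avg = weighted_average(polls, date(2012, 4, 23))
```

Note how one late poll narrows the average gap but cannot close it on its own - the earlier leads still carry weight, which is exactly why a single outlier like Forum's barely moved the average.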

So how did the polls get it so wrong? Some have speculated that roughly 20% of respondents were undecided in the final days, and that those undecideds moved en masse towards the PCs. A very valid assumption, but possibly not one that accounts for everything. I think a lot of undecideds did break for the PCs, maybe enough to put them over the top, but strategic voting did a lot more to help the PCs win.

Ah yes, strategic voting. We all knew that the Liberals were going to be facing a drop this election, but the polls overestimated Liberal support by one-to-four points in the last days, with the NDP also overestimated by roughly the same amount. Calgary, however, is where it really played out; roughly two-thirds of the Liberal vote fled to the PCs, while the NDP stayed at their paltry 5% of the vote. No pollster had shown the eventual spread in Calgary (roughly 46% to 35%) that resulted from strategic voting. Amazing, really.

And because of the polls, my projection obviously failed. My prediction of a Wildrose victory proved wrong because I had the Wildrosers at 40% and the PCs at 34%, which resulted in 52 WRP seats and 30 PC seats.

Of the ridings I called, under half were correct, and the misses were almost exclusively in Calgary, where I only got 2 of 18 called ridings correct - simply because the Wildrose sweep there never materialized. However, the saving grace of the projection was that I got most marginal ridings correct (correct as in, I predicted one or the other party would win it), culminating in a 57.5% success rate for ridings in my projection. Not good, but not bad considering how wrong the inputs were.

But what if the model had the correct numbers? While I would've been a lot closer to the end result (my model would've gotten 66 PCs, 18 Wildrosers, and 3 New Democrats), I still would've managed to get almost a third of my riding calls wrong.

This is, again, because of Calgary. What were among Wildrose's best ridings in 2008 were just middling ridings in 2012. But the number of correctly called ridings is significantly better. And I would've had roughly the right number of ridings, plus or minus a few Liberals (hard to predict those ridings, given that the five incumbents who survived were trend-defying islands).
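To give a sense of how a projection of this general type turns province-wide numbers into riding calls, here is a minimal proportional-swing sketch: scale each party's prior riding share by the ratio of its projected province-wide share to its prior province-wide share, then call the riding for the leader. This is a generic illustration, not my actual model, and the riding numbers are made up.

```python
# Province-wide shares: "prior" roughly 2008-style, "projected" matching the
# final projection quoted below. Riding results are invented examples.
prior_provincial = {"PC": 52.7, "WRP": 6.8, "Lib": 26.4, "NDP": 8.5}
projected_provincial = {"PC": 34.2, "WRP": 40.2, "Lib": 12.4, "NDP": 11.3}

ridings = {
    "Example-Riding-A": {"PC": 60.0, "WRP": 10.0, "Lib": 20.0, "NDP": 8.0},
    "Example-Riding-B": {"PC": 55.0, "WRP": 5.0, "Lib": 25.0, "NDP": 12.0},
}

def project(riding_result):
    """Apply proportional swing to one riding, then pick the leading party."""
    swung = {
        party: share * projected_provincial[party] / prior_provincial[party]
        for party, share in riding_result.items()
    }
    return max(swung, key=swung.get)

calls = {name: project(result) for name, result in ridings.items()}
```

Notice how a party growing from 7% to 40% province-wide gets its riding shares multiplied almost sixfold, flipping even PC strongholds on paper - which is precisely how a model fed Wildrose-heavy polls produces a phantom Calgary sweep.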

While I can't call it a success, I can say that the model, had it the right numbers, would've held up fairly well, even though, unlike 308.com's, it wasn't a true regional-based model. In the future, however, I'll need to give incumbents more of an advantage, maybe Liberals in particular - so far we've seen the Newfoundland, Ontario, federal, and Alberta Liberals hold on to a lot of their incumbents really well.

Here's the official comparison for posterity:

Projection - 52 Wildrose, 30 PC, 5 NDP (40.2% Wildrose, 34.2% PC, 12.4% Lib, 11.3% NDP)
Actual - 61 PC, 17 Wildrose, 5 Lib, 4 NDP (43.9% PC, 34.3% Wildrose, 9.9% Lib, 9.8% NDP)
Actual w/Projection - 66 PC, 18 Wildrose, 3 NDP
