So there is some chatter going around that New Brunswick was yet another flop for pollsters in Canada, and in particular for Forum Research, which released the final poll of the race, a poll that showed a 40-40 tie between the two main parties (the Liberals and PCs) in the popular vote. That, clearly, did not happen. But was it really a miss, and can Forum be blamed? Let's dive into the facts.
My Projection Model
I included all polls since 2010 in my projection model; in the end, I came up with this:
The fact is that, for an election this lightly polled, my projection didn't do too badly at all. The Liberal and NDP numbers were essentially spot on, the Greens were never expected to win a seat (except by NB Politico), and I made some bad assumptions about the Alliance vote. The main issue was the PC vote, which rose dramatically in the final projection because of the aforementioned Forum poll.
Other projections from other people, such as 308.com, Canadian Election Atlas, and NB Politico, were essentially around the same mark. With no huge divergence among projections, and a pretty accurate reflection of the final result, is there really a problem here?
There isn't, but only because our projections use rolling averages that bring in all polls rather than just one. The more polls that go into the average, the better, because weird results and wild swings are tempered by the other polls. That is why we can add a poll from a less-than-reputable company, as some claim Forum is, and not be too concerned, so long as plenty of other pollsters are also in the field.
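To make the idea concrete, here is a minimal sketch of how such a rolling average tempers a single outlier poll. This is not any particular projection model; the decay weighting and all the numbers below are invented for illustration only.

```python
def weighted_average(polls, decay=0.9):
    """Average party support, weighting each poll by decay**days_old,
    so older polls count for less but still pull on the average."""
    totals, total_weight = {}, 0.0
    for days_old, shares in polls:
        w = decay ** days_old
        total_weight += w
        for party, pct in shares.items():
            totals[party] = totals.get(party, 0.0) + w * pct
    return {party: s / total_weight for party, s in totals.items()}

# Hypothetical numbers: two earlier polls showing a Liberal lead,
# plus one late outlier showing a 40-40 tie.
polls = [
    (20, {"LIB": 45, "PC": 33, "NDP": 13}),
    (10, {"LIB": 43, "PC": 35, "NDP": 12}),
    (1,  {"LIB": 40, "PC": 40, "NDP": 10}),  # the outlier final poll
]
avg = weighted_average(polls)
```

Even with the outlier weighted most heavily (it is the newest), the average still shows a Liberal lead of a few points rather than a tie, which is exactly the tempering effect described above.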
As I mentioned before, there was barely any polling in the field. Here is a recap of the results since August:
Six polls is not much to go on, especially when most of them come from just two pollsters. But in the end, the trends were all correct: the Liberal lead tightened, the PCs rose from the doldrums, and the NDP collapsed.
One could also argue that Forum helped correct an inaccuracy on CRA's part: an overestimation of Liberal support, something CRA also had trouble with in 2010. Both pollsters overestimated PC support in their final polls, though CRA's was within its margin of error; Forum's was not. Both correctly called NDP and Green support within the margin of error.
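For readers who want the arithmetic behind "within the margin of error": the standard 95% margin for a poll proportion is 1.96 times the standard error of that proportion. The sample size and support level below are hypothetical, chosen only to show the typical ±3-point figure for a poll of about 1,000 people.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p from a simple random
    sample of size n: z * sqrt(p * (1 - p) / n)."""
    return z * math.sqrt(p * (1 - p) / n)

# e.g. a party polling at 40% in a sample of 1,000 respondents
moe = margin_of_error(0.40, 1000)  # roughly 0.03, i.e. about 3 points
```

So a final poll putting a party at 40% with n = 1,000 is consistent with a true result anywhere from about 37% to 43%, which is why a few points of error on one party is "no more than usual," while a larger gap is a genuine miss.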
Does this make the polls inaccurate? No more than usual. They are always to be taken with a grain of salt, and their data has to be treated for what it is - a snapshot in time subject to a margin of error, sampling bias, and a number of other factors.
That also means you can't bash Forum around as much as you'd like. Whenever Forum has been in the field alongside other pollsters, its results have matched up with theirs: in BC, every pollster showed a large NDP lead; in Alberta, every pollster showed a Wildrose lead; and in Quebec 2012 and Quebec 2014, Forum followed the same trends and numbers as other pollsters. The one exception I can find so far is Ontario's last election, though even there Forum's result was hardly out of this world, with essentially every party within the margin of error (or, in the PCs' case, just outside it). Let's not forget their many successes either, including Toronto's 2014 mayoral race, Ontario's 2011 election, Nova Scotia in 2013, and so on.
Forum is a popular punching bag for the failures of Canadian polling, though that is mostly because of an overzealous president in Lorne Bozinoff, and the fact that Forum polls far more often, and in far more races, than any other pollster in Canada. No one else touches by-elections, mayoral races, federal elections, small provincial races, and so on, but Forum does. And when Forum gets one wrong, and appears to be the only pollster that did because it was the only pollster actually polling, everyone piles on.
I'm not here to defend Forum; I simply ask that, if you wish to be taken seriously, you remember that all pollsters have their hits and misses. Forum is no different.