Could the polls be wrong? That’s a question I’ve been asked often in recent months, mostly by Democrats hoping that the dire forecasts produced here and elsewhere turn out to be too pessimistic. The short answer is: of course they can. In an era of low response rates, imperfect sample coverage and a host of new polling technologies, nothing is certain. At this hour, however, the most likely range of that error lies somewhere between a Democratic defeat comparable to 1994 and something much more severe.
Before we review our final round of polling forecasts based on whatever final polls straggle in this morning, let’s take a few minutes to ponder that question a little more carefully. Every polling average on HuffPost Pollster and our Election Dashboard, and every probability we have calculated on the outcome, rests on the assumption that, as a whole, the underlying polling is statistically unbiased.
Are pre-election polls often highly variable? Yes, but the averages and “trend estimate” numbers we publish assume that most of the variation, whether it stems from the error involved in measuring a sample rather than the entire population, or from the subjective survey design decisions pollsters make, from question wording to identifying the likely electorate, is random. If we average the various polls, we minimize that random error and get a more accurate forecast. At least, that’s the operating assumption.
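The logic of averaging out random error can be illustrated with a toy simulation. The numbers here are invented for illustration only: a hypothetical “true” margin of 4 points and a hypothetical 3-point random error per poll. Under those assumptions, the spread of 10-poll averages is much narrower than the spread of individual polls, which is exactly why aggregation helps when, and only when, the underlying polls are unbiased:

```python
import random
import statistics

random.seed(0)

TRUE_MARGIN = 4.0    # hypothetical true margin, in points (invented)
POLL_ERROR_SD = 3.0  # hypothetical random error per poll (invented)

def simulate_poll():
    # Each simulated poll is unbiased but noisy: truth plus random error.
    return random.gauss(TRUE_MARGIN, POLL_ERROR_SD)

def poll_average(n_polls):
    # Averaging n unbiased polls shrinks the random error by roughly sqrt(n).
    return statistics.mean(simulate_poll() for _ in range(n_polls))

# Compare the spread of single polls to the spread of 10-poll averages.
singles = [simulate_poll() for _ in range(2000)]
averages = [poll_average(10) for _ in range(2000)]

sd_single = statistics.stdev(singles)
sd_average = statistics.stdev(averages)
print(f"single-poll spread: {sd_single:.2f}, 10-poll-average spread: {sd_average:.2f}")
```

Note what the sketch does not fix: if every simulated poll shared the same bias (say, all tilted 2 points toward one party), the average would be just as tilted. Averaging cancels random error, not systematic error.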
Polling aggregators — from Bill Schneider and his CNN “poll of polls” in the 1990s, to the RealClearPolitics averages, to sites like Pollster.com and FiveThirtyEight — have succeeded because, on the whole, pre-election polls over the last 15 to 20 years have been largely unbiased. If the polls have remained statistically neutral in recent weeks, then the estimates we have produced will be reasonably accurate.