Polls Are Usually Right, But They Can Go Wrong In Surprising Ways

If it’s an even-numbered year, people are probably obsessing over polls. That’s especially true this year, as a presidential race unlike any in recent memory runs out the clock to November 8th. And for every poll that’s released, there’s somebody insisting it has to be wrong somehow: it’s skewed, it’s biased, the sample is too small; the litany of charges is nearly endless.

And sometimes the polls really do get it wrong. Nor does it help that badly run “instant polls” are often pushed by candidates who prefer their own version of events, to the point where news networks have banned their staff from even referring to them. In truth, though, most polls are fairly accurate, at least as far as they go. So why does polling work, and when does it go off the rails?

Polls, at root, are pretty simple. A polling company assembles a group of people at random, usually by auto-dialing phone numbers until somebody picks up or, more recently, by working from panels assembled online. Then they’ll ask for the demographic they’re looking for, or the person in the house with the most recent birthday, to keep the sample random. Ideally, they end up with a roughly representative sample of the voting public. From there, pollsters extrapolate the electorate with custom statistical models, a process called “weighting,” which attempts to make up for demographic deficits in the sample and better reflect the overall electorate. In theory, at least, the more granular and precise the model is, and the bigger the sample of data you feed into it, the more accurate your poll will be.
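
To make “weighting” concrete, here is a minimal sketch in Python. All of the numbers, demographic groups, and turnout shares below are invented for illustration; real pollsters use far more granular models, often iterative “raking” across many variables at once.

```python
# A toy post-stratification weighting example with made-up numbers.
# Each respondent group is re-weighted so the sample mirrors the
# expected electorate, then candidate support is re-averaged.

# Hypothetical raw sample: share of respondents and support for
# candidate A within each age group
sample_share = {"18-34": 0.20, "35-64": 0.50, "65+": 0.30}
support_for_a = {"18-34": 0.60, "35-64": 0.48, "65+": 0.40}

# Assumed turnout model: what the electorate is expected to look like
electorate_share = {"18-34": 0.30, "35-64": 0.45, "65+": 0.25}

# Weight each group by how under- or over-represented it is
weights = {g: electorate_share[g] / sample_share[g] for g in sample_share}

unweighted = sum(sample_share[g] * support_for_a[g] for g in sample_share)
weighted = sum(sample_share[g] * weights[g] * support_for_a[g] for g in sample_share)

print(f"Unweighted support for A: {unweighted:.1%}")  # 48.0%
print(f"Weighted support for A:   {weighted:.1%}")    # 49.6%
```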

Polling is, however, necessarily an inexact science. Pollsters need to get people to sit down and fill out a survey online, talk on the phone for 15 minutes, or otherwise take a chunk of time out of their day, and people are increasingly unwilling to do so for various reasons. Add to that the fact that human beings are flawed: polls rely heavily on both respondents’ self-perception and on how the questions are worded. Two polls looking at the same thing but asking the questions in different ways can, by total accident, yield completely different results. And finally there’s the question of money, as the bigger a poll is, the more it costs to run.

Similarly, you often find “house effects” in polls, namely a lean towards the Democratic Party or the GOP that pollsters attempt to compensate for. This is where attempts to “unskew” polls often come in, although “unskewed” polls are usually little more than wishful thinking. Every poll, however, also has a margin of error, and that margin is especially important. If a candidate is five points ahead in a poll, but that poll has a margin of error of five points, the race is simply too close to call.
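
For a sense of where that margin of error comes from, here is a hedged sketch of the textbook formula for a simple random sample at the usual 95 percent confidence level. Real polls carry extra error on top of this (weighting, non-response, and so on), so treat these figures as a floor rather than the whole story.

```python
# Margin of error for an estimated proportion from a simple random
# sample, at 95% confidence (z ~= 1.96). Worst case is proportion = 0.5.
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

for n in (400, 1000, 2500):
    print(f"n = {n:>4}: +/- {margin_of_error(n):.1%}")
# n =  400: +/- 4.9%
# n = 1000: +/- 3.1%
# n = 2500: +/- 2.0%
```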

Pollsters, of course, are fully aware of all this, and are constantly working to refine how they collect and analyze data. In fact, the best polls do this in public, which lets you sort a poll worth listening to from a poll you should ignore.

Can I Trust This Poll?

There are untold legions of polls out there, and it can be difficult to sort which polls to trust from which might be simple propaganda. There are, however, a few ways to spot credible pollsters. There’s no such thing as an absolutely unbiased poll, no matter how hard pollsters work and call each other out; we’re still talking about fiddly, complicated statistical analysis. So instead, look for pollsters who are accountable and reveal the details of their work to the public:

  • Do they disclose the size of their sample, and who it’s made up of?
  • Do they disclose how they got their interviews, and how many interviews they conducted?
  • Do they discuss their “house effect,” that is, how their polling model has trended in the past?
  • How high is their margin of error?
  • How recently was the poll conducted?
  • How closely do their results align with other polls being conducted?
  • And if they get it wrong, consistently, do they fix their model?

The last two are particularly important, because pollsters generally check their work against each other. If somebody’s results aren’t lining up, either they have a better methodology, which is possible but unlikely, or something has gone wrong with the poll. Similarly, accountability is often crucial if you’re going to trust a poll. Gallup, for example, disclosed how it was changing its methods after it incorrectly predicted Mitt Romney would win the 2012 election.

But still, for all that, polls can be wrong. 2016 alone saw two major surprises: Bernie Sanders winning, against all odds, the Michigan Democratic primary, and Brexit unexpectedly being approved by a narrow margin of British voters. So what happened? And why were these polls wrong when so many others were right?

How Polls Can Get It Wrong

The Sanders victory stands out in particular because of how badly the polls got it wrong: Clinton supposedly had a twenty-point lead in Michigan, yet Sanders won it, albeit by a narrow 1.5 percent margin. So, what happened? An enormous number of things at once, it turns out.

To start with, pollsters didn’t have a good read on Michigan in the first place. Michigan’s 2008 Democratic primary was a complete disaster; in fact, four candidates, including Obama, withdrew their names from consideration. In previous years the state ran caucuses, not primaries, so pollsters were essentially working blind. Furthermore, the polling samples were out of whack: there were nearly as many younger voters as older ones, and more independent voters than expected, while many polls assumed that older Democratic voters would dominate the turnout. Plus, the Republican primary was going on at the same time, and some Clinton supporters cast their votes there instead.

Many people will point to Michigan when arguing that the polls must have it wrong, but remember, a poll of a single state is different from a national poll. After all, national polls have a long history, are run constantly by any number of pollsters, and draw samples that are generally in line with national demographics. But that still leaves the sticky problem of Brexit.

That British voters wanted to leave the EU was a shock that took even gambling markets and financial experts by surprise. But there was, in fact, one polling company that got it right: Qriously, a company that runs its surveys on mobile phones.

Qriously has drawn a lot of attention not just because they were right about the Brexit vote, but because they’ve been sharply critical of how the British polls were conducted. They have reason to be. In 2015, British pollsters got the general election painfully wrong, expecting a Conservative win but not a majority, when instead the Tories won a slim majority outright. Nor was this the first time it had happened; the same problem struck the polls in 1992.

In that situation, Qriously pointed out that, once again, a number of factors were at work. One was that the polls relied on long, twenty-minute surveys to answer a fairly straightforward question, and such surveys are more likely to be completed by retirees with time on their hands or by people who signed up for online panels to get paid for their time. Qriously, which runs its surveys on mobile phones, had a much larger and more diverse sample, although they’ll be the first to acknowledge that they were four points off in their own Brexit poll.

Another factor Qriously doesn’t note, but that emerged after the fact, is that British voters may not have fully understood what they were voting for: Google found that questions like “What is the EU?” and “What happens if we leave the EU?” spiked as search terms, and at least some voters seem to have treated it as a referendum on immigration rather than a complex policy decision. Furthermore, while young voters were overwhelmingly for Remain, the rising tide of conservatism and nationalism in Europe can’t be overlooked.

To extrapolate this Brexit example to the US, a country with a population that is far more diverse and nearly five times larger than Britain’s, would be a mistake. For one thing, the sole national poll that has shown Trump in the lead lately is a bold experiment by the LA Times and USC to build a more granular poll, and it has been completely thrown out of whack by one young Black man in Illinois who will be voting for Trump (the sketch below shows how a single heavily weighted respondent can do that). But that’s a good reminder that no matter how hard pollsters work, and no matter how much they agree with each other, the only poll that really matters is the one we participate in this November.
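
As a footnote to that LA Times/USC wobble, here is a minimal sketch, with invented numbers, of how one heavily weighted respondent can drag a weighted average around. This is not the actual LA Times/USC model, just the general mechanism.

```python
# 300 hypothetical respondents, 52% of whom support candidate A,
# each initially counted with weight 1.
responses = [1] * 156 + [0] * 144   # 1 = supports A, 0 = does not
weights = [1.0] * 300

def weighted_share(responses, weights):
    return sum(r * w for r, w in zip(responses, weights)) / sum(weights)

print(f"Before: {weighted_share(responses, weights):.1%}")  # 52.0%

# Add one respondent from a rare demographic cell whose weight is
# 30x the average and who does not support candidate A.
responses.append(0)
weights.append(30.0)

print(f"After:  {weighted_share(responses, weights):.1%}")  # ~47.3%
```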
