10 November 2016

Why were the US polls wrong?

Donald Trump’s unexpected win was another blow for the polling industry. So what happened?

By Anthony Wells

Donald Trump has won, so we have another round of stories about polling shortcomings, though thankfully it’s someone else’s country this time round (this is very much a personal take from across an ocean – the YouGov American and British teams are quite separate, so I have no insider angle on the YouGov American polls to offer).

A couple of weeks ago, I wrote about whether there was potential for the US polls to suffer the same sort of polling mishap as Britain had experienced in 2015. It now looks as if they have. The US polling industry actually has a very good record of accuracy – American pollsters obviously have a lot more contests to poll, a lot more information to hand (and probably a lot more money!), but nevertheless, if you put aside the 2000 exit poll, you have to go back to 1948 to find a complete polling catastrophe in the US. That expectation of accuracy means they will probably face a lot of flak in the days ahead.

We in Britain have, shall I say, more recent experience of the art of being wrong, so here’s what insight I can offer. First the Brexit comparison. I fear this will be almost universal over the next few weeks, but when it comes to polling it is questionable:

  • In the case of Brexit, the polling picture was mixed. Put crudely, telephone polls showed a clear lead for Remain, while online polls showed a tight race, with Leave often ahead. Our media expected Remain to win and wrongly focused only on those polls that agreed with them, leading to a false narrative of a clear Remain lead rather than a close-run thing. Some polls were wrong, but the perception that they were all off is mistaken – much of the failure was one of interpretation.
  • In the case of the US, the polling picture was not really mixed. With the exception of the outlying USC Dornsife/LA Times poll, the polls tended to show Clinton leading, backed up by state polls showing Clinton leads consistent with the national polls. People were quite right to interpret the polls as showing Clinton heading towards victory… it was the polls themselves that were wrong.

How wrong were they? As I write, it looks as if Hillary Clinton will actually get the most votes, but lose in the Electoral College. In that sense, the national polls were not wrong when they showed Clinton ahead; she really was. It is one of the most frustrating situations a pollster can be in: statistically you are correct, but your figures have told the wrong narrative, so everyone thinks you are wrong. That doesn't let the American pollsters off the hook, though: the final polls were clustered around a 4-point lead for Clinton, when in reality it looks to be about 1 point. More importantly, the state polls were often way out; polls had Ohio as a tight race when Trump stomped it by 8 points. All the polls in Wisconsin had Clinton clearly ahead; Trump won. Polls in Minnesota were showing Clinton leads of 5-10 points – it ended up on a knife edge. Clearly something went deeply wrong here.

Putting aside exactly how comparable the Brexit polls and the Trump polls are, there are some potential lessons in terms of polling methodology. I am no expert in US polling, so I'll leave it to others more knowledgeable than I to dig through the entrails of the election polls. However, based on my experience of recent mishaps in British polling, there are a couple of places I would certainly start looking.

One is turnout modelling – US pollsters often approach turnout in a very different way to how British pollsters traditionally did it. We’ve always relied on weighting to the profile of the whole population and asking people if they are likely to vote. US pollsters have access to far more information on which people actually do vote, allowing them to weight their samples to the profile of actual voters in a state. This has helped the normally good record of US pollsters…but carries a potential risk if the type of people who vote changes, if there is an unexpected increase in turnout among demographics who don’t usually vote. This was one of the ways British pollsters did get burnt over Brexit. After getting the 2015 election wrong, lots of British companies experimented with a more US-style approach, modelling turnout on the basis of people’s demographics. Those companies then faced problems when there was unexpectedly high turnout from more working-class, less well-educated voters at the referendum. Luckily for US pollsters, the relatively easy availability of data on who voted means they should be able to rule this in or out quite easily.
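To make the contrast concrete, here is a minimal sketch of that kind of demographic turnout modelling. The group names, turnout rates and vote shares below are invented for illustration (this is not any pollster's actual model); the point is simply that weighting a sample to modelled turnout can overstate a lead when a low-propensity group turns out in unexpectedly large numbers.

```python
# Illustrative sketch only: invented numbers, not any pollster's real model.
# Each demographic group has a share of the adult population, a turnout rate
# assumed from past elections, the turnout it actually produced on the day,
# and a hypothetical vote share for Candidate A in a two-way race.
groups = {
    #                 (pop_share, modelled_turnout, actual_turnout, cand_A_share)
    "graduates":      (0.35, 0.80, 0.80, 0.65),
    "non_graduates":  (0.65, 0.55, 0.72, 0.42),  # turnout surge the model missed
}

def lead(turnout_key):
    """Weight each group by population share x turnout, then average A's vote share."""
    idx = {"modelled": 1, "actual": 2}[turnout_key]
    weights = {g: v[0] * v[idx] for g, v in groups.items()}
    total = sum(weights.values())
    share_a = sum(weights[g] * groups[g][3] for g in groups) / total
    return 100 * (2 * share_a - 1)  # A's lead over B in points

print(f"Lead under modelled turnout: {lead('modelled'):+.1f} points")  # about +4
print(f"Lead under actual turnout:   {lead('actual'):+.1f} points")    # about +1
```

With these made-up figures, the model's assumed turnout produces roughly a 4-point lead while the actual turnout produces roughly 1 point; the gap comes entirely from the turnout assumption, not from anyone changing their vote.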

The second is sampling. The inquiry into our general election polling error in 2015 found that unrepresentative samples were the core of the problem, and I can well imagine that this is a problem that risks affecting pollsters anywhere. Across the world, landline penetration is falling, response rates are falling, and it seems likely that the dwindling number of people still willing to take part in polls are ever more unrepresentative. In this country, our samples seemed to be skewed towards people who were more highly educated than the population, paid too much attention to politics, and followed the news agenda and the political media too closely. We under-represented those with little interest in politics, and several UK pollsters have since started sampling and weighting by political attention to try to address the issue. Were the US pollsters to suffer a similar problem, one can easily imagine how it could result in polls under-representing Donald Trump's support. If that does end up being the case, the question will be what US pollsters do to address it.
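As a rough illustration of that kind of skew (the figures below are invented, not real polling data), consider a sample in which politically attentive respondents are over-represented and happen to split differently from less attentive voters. Re-weighting the attention groups back to their share of the electorate shifts the topline:

```python
# Illustrative sketch only: invented figures, not real polling data.
# High-attention voters are assumed to be 30% of the electorate but 60% of
# the sample, and to split differently from low-attention voters.
sample = [
    # (attention_group, sample_share, cand_A_share, population_share)
    ("high_attention", 0.60, 0.54, 0.30),
    ("low_attention",  0.40, 0.46, 0.70),
]

# Unweighted topline: the sample mix exactly as collected.
unweighted = sum(s * a for _, s, a, _ in sample)

# Weighted topline: each attention group scaled back to its population share.
weighted = sum(p * a for _, _, a, p in sample)

print(f"Unweighted Candidate A share: {unweighted:.1%}")  # about 51%, A ahead
print(f"Weighted Candidate A share:   {weighted:.1%}")    # about 48%, A behind
```

In this made-up example the raw sample puts Candidate A narrowly ahead, while weighting by political attention puts A narrowly behind; a pollster who does not measure attention at all cannot make that correction.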

Anthony Wells is research director at YouGov. He tweets @anthonyjwells. This article was originally published on his website, UK Polling Report. It has been republished here with permission.
