22 January 2016

Where do the pollsters go from here?

The polling inquiry has laid out the challenge: now pollsters must meet it.

By Marcus Roberts

Pollsters across Britain have been trying to figure out what went wrong last May. Now the British Polling Council’s independent inquiry, led by the University of Southampton’s Professor Patrick Sturgis, has made it clear: we got our numbers wrong because we got our samples wrong. 

In response, pollsters have a simple choice: pretend the problem never happened, blame the voters, or take responsibility.

YouGov has chosen the last of these, with full and unreserved apologies from the very top for the errors we made in 2015.

So what was the main mistake? Our own internal investigation concurs with the independent inquiry: sample failure. Simply put, our pre-election samples contained too many politically engaged young people and too few quieter pensioners. Since the election, YouGov has spent hundreds of thousands of pounds on panel recruitment to correct this and maximise our accuracy in future.

Getting the makeup of the sample right, difficult and expensive as it is, is the best way to tackle the problem. 
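The arithmetic behind this is straightforward, and a toy calculation makes it concrete. In the sketch below, all figures are hypothetical illustrations (not YouGov's actual panel data): it simply shows how over-recruiting one demographic group shifts a headline vote-share estimate even when every respondent answers honestly.

```python
# Hypothetical illustration: how a skewed sample shifts a poll's headline figure.
# All numbers are invented for the example, not real polling data.

# Suppose the true electorate is 25% under-35s and 75% over-35s,
# but an online panel ends up with 40% under-35s.
true_share = {"under_35": 0.25, "over_35": 0.75}
panel_share = {"under_35": 0.40, "over_35": 0.60}

# Hypothetical Labour support within each age group.
labour_support = {"under_35": 0.55, "over_35": 0.35}

# The population figure weights each group by its true size;
# the raw panel figure weights by who happens to be on the panel.
true_labour = sum(true_share[g] * labour_support[g] for g in true_share)
raw_panel_labour = sum(panel_share[g] * labour_support[g] for g in panel_share)

print(f"true support: {true_labour:.2f}")        # 0.40
print(f"raw panel estimate: {raw_panel_labour:.2f}")  # 0.43

# Over-sampling the group with higher Labour support inflates the estimate
# by three points, which is the kind of gap that decided 2015's verdict.
```

Fixing the recruitment itself, rather than patching the arithmetic afterwards, is the point of the panel investment described above.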


Other solutions we considered and rejected included simply adjusting polling figures before publication to ensure a larger share of Conservative voters, or adopting a probability sampling model for future polling.

The problem with the first ‘solution’ is that while it may answer questions of voting intention more accurately than the 2015 debacle, it does not provide an accurate window into the opinions of those voters. What’s more, it can result in low-income groups being underrepresented. And as psephologist Matt Singh, one of the few to warn of the polling problem in advance of the election, notes (https://www.ncpolitics.uk/2015/11/new-ncp-analysis-where-the-polls-went-wrong.html/), this approach failed to save pollsters from embarrassment in 2015, as the shifts in voter behaviour were too complicated for weighting to represent accurately.

Indeed, the Sturgis inquiry itself has warned of the problems of so-called ‘weighting’ solutions, which may catch some problems but do not maximise the chance of accuracy.
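Why weighting falls short can be shown with another toy calculation (again, all numbers are invented for illustration). Weighting rescales each demographic cell to its correct population size, but it cannot fix who is inside the cell: if a panel's under-35s are mostly the politically engaged kind, weighting the cell up or down still propagates the engaged-heavy answer.

```python
# Hypothetical illustration: demographic weighting fixes group *sizes*,
# not the composition *within* each group. Invented numbers throughout.

# Within under-35s, suppose politically engaged people are 30% of the
# real population but 80% of the panel, and their views differ.
pop_engaged_share = 0.30
panel_engaged_share = 0.80

support_engaged = 0.60  # hypothetical Labour support among engaged under-35s
support_quiet = 0.40    # among less engaged under-35s

# What the under-35 cell should contribute vs. what the panel's cell says.
true_under35 = (pop_engaged_share * support_engaged
                + (1 - pop_engaged_share) * support_quiet)
panel_under35 = (panel_engaged_share * support_engaged
                 + (1 - panel_engaged_share) * support_quiet)

print(f"true under-35 support: {true_under35:.2f}")    # 0.46
print(f"panel's under-35 cell: {panel_under35:.2f}")   # 0.56

# Age weighting can set this cell to exactly 25% of the total, but the
# number being weighted is still 0.56, not 0.46 - the error survives.
```

This is why recruiting quieter, less engaged respondents onto the panel, rather than reweighting the engaged ones, is the remedy the inquiry's diagnosis points to.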

Next is the suggestion, based on Professor Curtice’s warm words for probability sampling, that pollsters should ensure an accurate sample of the population through large-scale, in-depth, face-to-face surveys. But this approach has errors too: the British Election Study (a remarkable piece of scholarly work and a must-read for politicos everywhere) still got the UKIP vote share wrong by several points. What’s more, such surveys can take two months or more to complete and cost hundreds of thousands of pounds per poll. This approach fails a pollster’s responsibility to provide not only accuracy but timeliness and affordability, so that polling does not become the preserve of wealthy clients alone.

The Sturgis inquiry has also raised the question of ‘herding’ in the polling industry. This makes little commercial sense, as any polling company worth its salt desires differentiation in its results rather than similarity, so as to demonstrate superiority. Simply put, we want our results to be different from our competitors’ so that, as we hope, when we are proved right they are proved wrong!

Nevertheless, it is possible that some companies adjusted their methodologies in the final days and weeks of the 2015 campaign in ways that gave results more in keeping with the rest of the industry. This could explain why some pollsters said after the fact that they regretted not publishing more of their polls. But this was categorically not the case with YouGov, as the reams of data and cross-tabs publicly available make clear.

The solution to the polling problem of 2015 is nevertheless hard and expensive: changing the make-up of panels and samples to ensure they accurately reflect the opinions and voting intentions of the British people all year round.

The Sturgis inquiry rightly diagnosed the problem in polling as an inaccurate representation of the electorate. By making sure our samples include more people who pay little attention to politics, and by increasing the proportion of older voters, YouGov is addressing this. Answering the central charge of what we got wrong last May by changing our approach to that very error ensures that the lessons of 2015 are learned. And it is this approach that, we believe, will ensure accuracy in future and, over time, restore trust in our polling.
