21 February 2020

“Better explanations translate into better predictions” – Superforecasting author Philip Tetlock on the value of experts

Dominic Cummings used Tetlock's research to justify the hiring of Andrew Sabisky. What does Tetlock himself make of it? 

By Will Dunn

Dominic Cummings was asked, as he left for work on Tuesday morning, whether he regretted giving the first of his “weirdo” jobs to Andrew Sabisky, a self-proclaimed “superforecaster” whose previous comments to the media included highly controversial views on eugenics and a suggestion that psychoactive drugs be given to all schoolchildren.

“Read Philip Tetlock’s Superforecasters,” responded the Prime Minister’s chief special adviser, “instead of political pundits who don’t know what they’re talking about.”

Philip E Tetlock is a psychologist who specialises in political decision-making, and he is highly respected – Superforecasting carries a cover quote from no less an authority than Daniel Kahneman, the Nobel-winning father of behavioural economics. Tetlock’s work has shown that experts’ ability to make objective predictions is often clouded by the sheer amount of information they hold. His experiments, conducted over decades, have shown that well-informed people predict outcomes, on average, about as well as simple algorithms do.

Cummings appears to extrapolate from this that political commentators “don’t know anything”, but Tetlock has shown that the predictions of experts are clouded precisely because they do have knowledge – they are so well-informed about the trees that they fail to see what’s going on in the wood. Tetlock’s work does not say that pundits are useless.

“Pundits do often make forecasts, though most are implicit,” he observes, but “pundits also do other things. They tell stories, make moral-political value judgments, frame issues, draw lessons from history.”

This is not the first time that people have drawn simplistic conclusions from Tetlock’s work. “Some people think that my earlier research program and academic book, Expert Political Judgment, settled that issue empirically: experts were not much more accurate than dart-tossing chimps”, he says. “But I never saw EPJ as a final answer. True, the data did show that expert performance was on average unimpressive… but there were marked individual differences among experts.”

Furthermore, he acknowledges that the “best pundits” might have had understandable reasons not to test their predictive powers in public. In his experience, he says, “the more famous the pundit, the more reluctant to run the risk of playing in a transparent forecasting tournament”.


It was Tetlock’s Expert Political Judgment that Michael Gove invoked when he famously declared, in response to warnings from economists, businesspeople and fellow politicians that Brexit would do severe and lasting damage to Britain’s economy, that “the people of this country have had enough of experts”. Gove, too, took Tetlock’s findings to imply that because experts cannot predict well, they are useless. But in his introduction to the 2017 edition of the book, Tetlock writes that “the media misinterpreted EPJ to claim that experts know nothing, and know-nothings seized on that claim as proof that knowledge itself is somehow useless”.

In reality, he says, the extent to which expertise and predictive ability can be connected remains highly debatable. He points out that for centuries, astronomers were able to make good predictions about the positions of the stars, but gave religious explanations that seem “absurd” to current science. Conversely, geophysicists today can give sophisticated explanations of plate tectonics, but often fail to predict earthquakes.

“In the long run,” however, he believes that “better explanations do eventually translate into better predictions – and better predictions are crucial for advancing science, technology, the economy [and] public policy”.

Tetlock says that his findings, if properly understood, could even help resolve the current tension between expertise and foresight in government. “The Tories,” he thinks, “may be open to experimentation at this juncture… because they worry that the expert communities in certain policy domains tend to oppose their party’s agenda.”

The forecasting tournaments that Tetlock uses to identify “superforecasters,” he says, “create transparency: they make it easier for consumers of expertise (in this case, politicians) to assess the predictive track records of those on whose advice they are relying. This should be of interest to leaders across a wide span of organizations – and across the ideological spectrum as well. It’s not just a Dom-thing or a Tory-thing; it is a rationality-101-thing. It should not be hard to imagine alternative histories in which a Labour government decides to embrace a methodology, such as forecasting tournaments, to improve signal-to-noise ratios in policymaking.”
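Scoring a track record in such a tournament is a matter of simple arithmetic rather than reputation. As a minimal sketch (the data below are invented, and this uses the common binary form of the Brier score; Tetlock’s tournaments report a two-sided variant that runs from 0 to 2), a forecaster’s record might be assessed like this:

    # Illustrative only: scoring a probability-forecast track record
    # with the (binary) Brier score. Lower scores are better: a perfect
    # forecaster scores 0.0, and always answering 0.5 scores 0.25.

    def brier_score(forecast: float, outcome: int) -> float:
        """Squared error between a probability forecast (0.0 to 1.0)
        and the outcome (1 if the event happened, 0 if it did not)."""
        return (forecast - outcome) ** 2

    # Hypothetical record: (probability given, what actually happened)
    track_record = [(0.8, 1), (0.3, 0), (0.9, 1), (0.6, 0)]

    mean = sum(brier_score(p, o) for p, o in track_record) / len(track_record)
    print(f"Mean Brier score: {mean:.3f}")  # 0.125 here; lower = more accurate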

“It would be unfortunate,” he concludes, “if forecasting tournaments or superforecasting came to be linked in the public mind with a particular political point of view. Regardless of ideology, surely we can all agree that we want our leaders to have ready access to the most accurate possible probability estimates of the consequences of the courses of action they are considering. After all, who wants to be led off a cliff?”
