Chris Dillow has a nice post up, “in praise of imprecision”. His argument is that, in far too many situations, we quarrel over tiny differences in estimates when the overall answer is basically known. What’s GDP growth for last year? It’s basically flat. Yet to judge from the arguments, you would think that the difference between -0.3 and -0.1 per cent – or even between -0.1 and 0.1 – was the difference between life and death.
He illustrates this with a few neat little guesstimates. For instance:
How much does welfare scrounging cost the economy? Guesstimate the number of scroungers. Guesstimate the value-added they’d contribute if they were working. Express as a proportion of GDP. For plausible values, it’s a small number.
Or:
What impact will the small uprating in the minimum wage have on jobs? The adult rate will rise by 1.9%. Economists forecast inflation this year of 2.5%, so this is roughly a 0.6% real fall. Let’s call the price-elasticity of demand for labour 1.5. The Low Pay Commission estimates (pdf) that 5.3% of jobs are around minimum wage ones. Multiply these three numbers together and we get 0.048%. Multiply by the number of jobs in the economy (29.73m) and we have roughly 14,000. That’s roughly one-eleventh of the sampling variability of employment figures.
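To make the second guesstimate concrete, here is a quick sketch of that arithmetic in Python, using only the figures quoted above. The variable names are mine, and the point is the rough order of magnitude rather than any precise forecast.

```python
# Back-of-envelope sketch of the minimum-wage guesstimate above.
# All figures are the ones quoted in the post; the names are illustrative.

nominal_rise = 0.019      # adult rate rises by 1.9%
inflation = 0.025         # forecast inflation of 2.5%
real_change = nominal_rise - inflation   # roughly a 0.6% real fall

elasticity = 1.5          # assumed price-elasticity of demand for labour
min_wage_share = 0.053    # Low Pay Commission: 5.3% of jobs are around the minimum wage
total_jobs = 29.73e6      # jobs in the economy

# On these assumptions, the real fall in the wage shifts employment by roughly:
jobs_affected = abs(real_change) * elasticity * min_wage_share * total_jobs
print(f"{jobs_affected:,.0f} jobs")   # roughly 14,000
```

Tweak the assumptions however you like: the answer stays tiny next to the sampling variability of the employment figures, which is the point.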
It’s worth pointing out that the same idea has been applied pretty consistently to the claim that families with three generations of worklessness are a public policy problem. We don’t know exactly how many such families there are – and nor, we now know, does the government – but study after study has suggested that the number is tiny.
There are only around 15,000 households in which even two generations have never worked – so families with three workless generations must be rarer still – and in a third of those cases it is only because the younger generation left full-time education within the last year. On top of that, less than 1 per cent of young people have never worked by the age of 29, so the younger generation is usually the one most likely to pull a family out of worklessness. Whatever the number is, in other words, it’s really, really small.
But it’s important to note the downside to imprecision. The way common knowledge is disproved is rarely through wholesale upheaval. Instead, it’s a gradual process of refinement: new estimates are put out, slightly lower than the old ones; then lower estimates still; and they get steadily lower, until suddenly you realise that the conventional wisdom was wrong.
It’s a lot harder to turn an estimate of “recession” into an estimate of “growth” through gradual refinement than it is to turn an estimate of “-0.3 per cent” into one of “0.5 per cent”. So if we deal only in broad labels, there’s more of a danger that we’ll be stuck with half-truths.
Even with that danger in mind, the absence of accepted imprecision is still keenly felt in Whitehall. Too frequently, “no statistics” is taken to mean “we have no idea of the magnitude of this problem” – but that’s not true. We actually know quite a lot, albeit imprecisely. The trick is acting on it.