20 July 2021

“It’s the poorest in society who are being surveilled”: the rise of citizen-scoring algorithms

A new report reveals that millions of people across the UK are being profiled by welfare-focused algorithms without their knowledge.

By Oscar Williams

In November 2018, the UN special rapporteur Philip Alston published a damning report on the UK’s poverty crisis. “We are witnessing the gradual disappearance of the postwar British welfare state behind a webpage and an algorithm,” Alston warned. “In its place, a digital welfare state is emerging. The impact on the human rights of the most vulnerable in the UK will be immense.”

Over the subsequent two and a half years, journalists and civil liberties activists have sought to uncover the extent to which algorithmic risk-profiling now underpins the welfare state. A new report, published today by the campaign group Big Brother Watch, reveals that dozens of councils and social housing providers are routinely profiling millions of citizens without their knowledge.

According to the report, which is based on hundreds of freedom of information requests, councils have now carried out fraud risk profiling on more than half a million housing benefit or council tax support applicants. Some 1.6 million social housing residents have been profiled by commercial algorithms to assess their likelihood of paying rent. And more than a quarter of a million people have been subject to data harvesting to identify their risk of being abused, or becoming homeless or unemployed.

“It’s the poorest in society who are being surveilled,” says Jake Hurfurt, head of investigations at Big Brother Watch. “If you look at all these systems, it’s social housing, it’s Universal Credit, it’s trying to target people and profile them for financial vulnerability or vulnerability to other things.” Hurfurt says that the campaign group is “very, very concerned” that the government is “creating essentially a surveillance state, where as a condition for state support, you have to undergo this kind of monitoring and profiling”.

By sowing distrust of welfare claimants and stripping councils of already limited resources, austerity has created the conditions for this technology to flourish. A patchwork of providers has sought to capitalise on councils’ need to cut costs. Technology produced by Xantura, a major provider, can reportedly save a local authority £127,000 over the course of a year.

[See also: How citizen-scoring algorithms are being used by local government in the UK]

In recent years a number of councils have adopted Xantura’s OneView product. The predictive analytics software draws upon a range of structured and unstructured data to assess a household’s risk of debt or homelessness, as well as harm related to Covid-19. “A proposal document from Shropshire Council suggested that even details of sexual habits and anger management could be gleaned in case notes,” Big Brother Watch found. 
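The report does not say how OneView’s scoring actually works, so the following Python sketch is purely illustrative: a household record, a handful of invented features, hand-picked weights, and a logistic function that turns them into a single 0-1 risk score. None of the names or numbers below come from Xantura or from any council’s system.

```python
# Purely hypothetical sketch of a household risk-scoring model of the
# kind described above. Every feature, weight and threshold is invented;
# nothing here reflects Xantura's OneView or any council's real system.
import math
from dataclasses import dataclass

@dataclass
class Household:
    rent_arrears_months: int    # months of rent arrears on record
    missed_payments_12m: int    # missed payments in the last year
    active_benefit_claims: int  # number of current benefit claims
    prior_homelessness: bool    # any previous homelessness episode

def risk_score(h: Household) -> float:
    """Combine the features into a single 0-1 'risk' score using a
    logistic model with illustrative, hand-picked weights."""
    z = (
        -2.0                                  # baseline (intercept)
        + 0.6 * h.rent_arrears_months
        + 0.4 * h.missed_payments_12m
        + 0.3 * h.active_benefit_claims
        + (1.2 if h.prior_homelessness else 0.0)
    )
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    h = Household(rent_arrears_months=3, missed_payments_12m=2,
                  active_benefit_claims=1, prior_homelessness=False)
    score = risk_score(h)
    # A deployment might flag households above some arbitrary cut-off.
    print(f"risk score: {score:.2f}", "flagged" if score > 0.5 else "not flagged")
```

Whatever form the real model takes, the structural point stands: a household either clears the cut-off or it does not, and the people being scored have no visibility into either the features or the threshold.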

But while the tool harvests huge amounts of data, those who are subject to its analysis are unlikely to know about it. A data protection assessment carried out by Thurrock Council revealed that the council had not sought individuals’ consent before analysing their data for this purpose.

Hurfurt says it was difficult to find individuals who knew they had been profiled by local authorities’ risk-scoring algorithms. But one woman, who had received housing support, told Big Brother Watch she was “disgusted” to have learnt that her data had been used in this way.

“I’ve noticed the amount of evidence I’ve been asked for has changed over the years, which makes it really stressful. I’ve been made to go through all my bank statements line by line with an assessor, which made me feel like a criminal. Now I wonder if it’s because a machine decided, for reasons unknown, I could be a fraudster. It feels very unjust for people like me in genuine need, to know I’m being scrutinised and not believed over evidence I provide.”

Even local authorities appear to have an incomplete understanding of how the technology works, having passed some of Big Brother Watch’s questions to providers. Hurfurt says it is “very much happening in the shadows and this is something we’re concerned about. You often only know you’ve been affected by this when it’s too late. The councils need to be much more transparent about what they’re doing with people’s data.”

[See also: How Universal Credit is one of the least generous welfare systems in Europe]

Over the last four years, the number of councils using predictive analytics to assess benefits claimants’ fraud risk has dropped from around 80 to 50, according to the research. One council said the technology had “yet to realise most of its predicted financial efficiencies”; the introduction of Universal Credit has also made the technology less financially appealing. But Hurfurt believes that another factor is at play too: a decision by the government to centralise the analysis. 

“The Department for Work and Pensions is running centralised housing benefit fraud risk-scoring,” Hurfurt said. “So, the risk-based verification stuff is when you apply for benefits; the DWP is now risk-scoring everyone who receives housing benefit, and forcing councils to do full case reviews on the riskiest 400,000 people.”
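What Hurfurt describes is, in effect, a rank-and-cut procedure: compute a fraud risk score for every housing benefit claimant, then refer the highest-scoring slice for full case review. A minimal Python sketch of that logic might look like the following; the scoring rule and toy caseload are invented, and only the 400,000 figure comes from the article.

```python
# Hypothetical sketch of risk-based verification as described above:
# score the whole caseload, then refer only the highest scorers for a
# full case review. The scoring rule and population are invented.
import heapq
import random

def fraud_risk_score(claimant: dict) -> float:
    # Stand-in for a trained model; here the score is just random noise.
    return random.random()

def select_for_review(claimants: list[dict], n: int) -> list[dict]:
    """Return the n claimants with the highest fraud risk scores."""
    return heapq.nlargest(n, claimants, key=fraud_risk_score)

if __name__ == "__main__":
    random.seed(0)
    caseload = [{"id": i} for i in range(10_000)]  # scaled-down toy caseload
    # In the scheme Hurfurt describes, the cut-off is the riskiest
    # 400,000 claimants; scaled down here to the riskiest 4,000.
    flagged = select_for_review(caseload, n=4_000)
    print(f"{len(flagged)} claimants referred for full case review")
```

The sketch illustrates the structure rather than any real model: however the score is produced, everyone above the cut-off receives the intrusive full review and everyone below it does not.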

Despite the drop in fraud risk-scoring at a local level, Hurfurt says that the rising popularity of other forms of predictive analytics, such as assessing individuals’ risk of harm, means that the number of councils using some kind of algorithmic profiling has remained at the same level.

Big Brother Watch is now calling for the Information Commissioner’s Office to introduce a public register of algorithms used in public sector decision-making. “[Councils] wade through all sorts of quite dodgy justifications, whereas if you have to be public about it and have a register, at least, that would be open to question and open to challenge,” says Hurfurt. “At the moment nobody really knows what’s going on.”

[See also: Poverty by algorithm: The Universal Credit design flaw that leaves people penniless]
