15 July 2019, updated 17 Jan 2024, 5:58am

Revealed: how citizen-scoring algorithms are being used by local government in the UK

After nearly a decade of austerity, councils are turning to algorithms to predict the health, income and behaviour of their constituents.

By Hettie O'Brien

Local councils in the UK last year spent £2m on contracts with the credit-scoring company Experian, as part of an effort to categorise, assess and profile segments of the population, the New Statesman has learned.

According to a paper published today by the Data Justice Lab in Cardiff, many councils are now using similar services to apply algorithms to public data in order to segment and “score” citizens and population groups according to their social group or “risk profile”.  

Amid mounting financial pressures and cuts to local authority budgets, councils are using “predictive analytics” systems to algorithmically categorise citizens at both the individual and population level. An array of policies can then be targeted at specific groups or individuals. Researchers found a total of 53 councils purchasing data systems from private companies in order to classify citizens and predict future outcomes, but the actual number could be far higher.

Mosaic, a demographic classification tool produced by Experian, collates thousands of data points from consumers’ online activity and shopping habits. The company’s description of Mosaic suggests that it can provide data about individual households, which councils can combine with their own data to build a clear picture of who lives in a property, their financial status and other details of their lives.

In a brochure published on Experian’s website, the company says Mosaic can “peer inside” to reveal “all the different types of households”, with information including their “various life-stages, marital status, household compositions and financial positions”. It claims to be able to provide a “pin-sharp” picture of the UK consumer, offering 850 million pieces of information across 450 different data points. 

The tool sorts households into a colour-coded grid, with cooler colours indicating better-off households and warmer colours indicating less well-off ones. In the elite categories, Mosaic identifies “metro high-flyers”, “diamond days” and the “uptown elite”. The less well-off categories include “bus-route renters”, “flexible workforce” and “ageing access”. 
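To make the mechanics concrete, here is a minimal Python sketch of how a Mosaic-style segmentation grid might be represented. The segment names and the cooler-to-warmer colour scale come from the descriptions above; the household attributes, rules and thresholds are invented for illustration, not Experian’s actual model.

```python
# A hypothetical, simplified Mosaic-style segmenter. Segment names are
# from the article; attributes, rules and colours are invented.
from dataclasses import dataclass

@dataclass
class Household:
    income_band: int   # hypothetical scale: 1 (lowest) to 10 (highest)
    tenure: str        # e.g. "owner", "private_rent", "social_rent"
    age_of_head: int

# Cooler colours indicate better-off households, warmer colours less
# well-off ones, as the article describes. First matching rule wins.
SEGMENTS = [
    ("uptown elite",       "blue",   lambda h: h.income_band >= 9 and h.tenure == "owner"),
    ("metro high-flyers",  "cyan",   lambda h: h.income_band >= 7),
    ("flexible workforce", "orange", lambda h: h.tenure == "private_rent" and h.age_of_head < 40),
    ("ageing access",      "red",    lambda h: h.age_of_head >= 70),
    ("bus-route renters",  "red",    lambda h: h.tenure != "owner"),
]

def classify(household: Household) -> tuple[str, str]:
    """Return (segment label, colour) for the first rule that matches."""
    for label, colour, rule in SEGMENTS:
        if rule(household):
            return label, colour
    return "unclassified", "grey"

print(classify(Household(income_band=3, tenure="private_rent", age_of_head=29)))
# -> ('flexible workforce', 'orange')
```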

The New Statesman found local councils using Mosaic for a number of purposes. In Kent, Mosaic data is used to assign “propensity scores” covering socio-economic issues to map health inequalities and inform commissioning decisions. Nottingham’s public health department uses Mosaic to map the risk of alcohol harm, while Southend-on-Sea’s borough council, which did not respond to a request for comment, contracted data from Mosaic to help decide where to target resources and public services.
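The area-level use described above can be illustrated with a short sketch: if each segment carries a propensity score for an issue such as alcohol harm, a council can weight those scores by household counts to rank wards. The segment names echo the article; the scores and ward figures below are hypothetical, not Kent’s or Nottingham’s actual data.

```python
from collections import defaultdict

# Hypothetical propensity scores per Mosaic-style segment (0 to 1).
SEGMENT_PROPENSITY = {
    "uptown elite": 0.10,
    "flexible workforce": 0.35,
    "bus-route renters": 0.55,
    "ageing access": 0.40,
}

# Hypothetical ward composition: (ward, segment, number of households).
WARD_HOUSEHOLDS = [
    ("Ward A", "uptown elite", 800),
    ("Ward A", "bus-route renters", 200),
    ("Ward B", "flexible workforce", 500),
    ("Ward B", "bus-route renters", 500),
]

def ward_scores(rows):
    """Household-weighted mean propensity per ward."""
    totals = defaultdict(lambda: [0.0, 0])
    for ward, segment, households in rows:
        totals[ward][0] += SEGMENT_PROPENSITY[segment] * households
        totals[ward][1] += households
    return {ward: weighted / count for ward, (weighted, count) in totals.items()}

print(ward_scores(WARD_HOUSEHOLDS))
# -> {'Ward A': 0.19, 'Ward B': 0.45}
```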

When contacted for information about these contracts, Experian said that its Mosaic system takes information “from a range of publicly and commercially available sources” and helps organisations “understand consumer groups and neighbourhoods according to their likely demographics, lifestyles and purchasing behaviour.”

Experian is not the only company helping councils to profile citizens. The Data Justice Lab has found a number of other private companies supplying predictive algorithms to the public sector, including Xantura, which provides data-sharing services to several councils across the UK and won four deals worth a total of more than £300,000 in 2018, according to analysis shared with the New Statesman by Tussell. In the London boroughs of Hackney, Newham and Tower Hamlets, and in Thurrock, the company has trialled an “early help profiling system” that translates data on families into risk profiles in order to predict which children are at risk of neglect or abuse. That data includes information on school attendance and exclusion, police records on antisocial behaviour and domestic violence, and housing association repairs and arrears.
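The reporting does not disclose how Xantura’s model works internally, but a simple weighted-indicator score conveys the general shape of such a system: each indicator named above feeds a score that is then binned into a risk band. The weights, normalisation and thresholds here are hypothetical, not Xantura’s.

```python
# Hypothetical weights over the indicators named in the article.
INDICATOR_WEIGHTS = {
    "school_absence_rate":  0.30,  # proportion of school sessions missed
    "school_exclusions":    0.20,  # exclusions, crudely capped at 1 below
    "asb_police_records":   0.20,  # antisocial behaviour reports
    "dv_incidents":         0.20,  # domestic violence reports
    "rent_arrears_months":  0.10,  # housing association arrears
}

def risk_score(indicators: dict[str, float]) -> float:
    """Weighted sum of indicators, each capped at 1, giving a score in [0, 1]."""
    return sum(weight * min(indicators.get(name, 0.0), 1.0)
               for name, weight in INDICATOR_WEIGHTS.items())

def risk_band(score: float) -> str:
    """Bin the score into the kind of profile a caseworker might see."""
    if score >= 0.6:
        return "high"
    if score >= 0.3:
        return "moderate"
    return "low"

family = {"school_absence_rate": 0.4, "asb_police_records": 1}
score = risk_score(family)
print(f"{score:.2f} -> {risk_band(score)}")  # 0.32 -> moderate
```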

Xantura can reportedly save a council £127,000 annually by replacing human oversight with an automated system. With councils under increasing pressure to cut costs, some feel that such tools can be a “more efficient way of working and targeting resources” and give councils a more “granular understanding of vulnerability”, Lina Dencik, one of the report’s authors, told the New Statesman.

But the deployment of tools such as Mosaic and Xantura also raises questions about algorithmic stereotyping and the invasion of privacy.

Paul Burnham, an activist with the Defend Council Housing group in Haringey, north London, first encountered Mosaic when he was campaigning against a controversial housing development. The developer, Lendlease, and Haringey Council made use of Mosaic data to profile populations living in the proposed redevelopment area.

“Their description of a council estate was an utter stereotype”, Burnham says over the phone from his tenth-floor council flat. “There’s also a deeper problem that Experian [Mosaic] is basically a marketing tool used to target and segment your market based on who has disposable income – that is then used as part of delivering a public policy that is supposed to be inclusive”. 

Burnham was also frustrated by the “black box” nature of the technology. “If data is being used to shape public policy, it ought to be in the public domain”, he adds.

Algorithmic profiling is most controversial when used in policing. Durham Constabulary used Mosaic data to inform its Harm Assessment Risk Tool (HART), which helped assess whether suspects were at low, moderate or high risk of reoffending. According to the campaign group Big Brother Watch, police data was used to classify people into groups including “disconnected youth”, “Asian heritage” and “dependent greys”. The police force dropped the profiling tool last month after sustained pressure from the campaign group. 
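HART has been reported to be built on a random-forest model. The sketch below shows how such a tool might bin predicted reoffending probabilities into the three bands described; the features, synthetic training data and cut-offs are all hypothetical.

```python
# A hypothetical HART-style three-band classifier using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for custody records: each row is a suspect's
# feature vector (e.g. age, prior offences, demographic codes).
X = rng.random((500, 5))
y = (X[:, 1] + 0.3 * rng.random(500) > 0.8).astype(int)  # 1 = reoffended

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

def band(prob: float) -> str:
    """Bin a predicted reoffending probability into HART-style bands."""
    if prob >= 0.7:
        return "high"
    if prob >= 0.4:
        return "moderate"
    return "low"

suspect = rng.random((1, 5))
prob_reoffend = model.predict_proba(suspect)[0, 1]
print(f"{prob_reoffend:.2f} -> {band(prob_reoffend)}")
```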

Duncan McCann, a researcher at the New Economics Foundation, says this kind of data collection is inherently problematic, and that data is over-collected and under-protected. “On the face of it,” he says, “it could be preferable to entrust this kind of data to the public sector rather than the private sector, but in reality this doesn’t mitigate any of the problems”. 

Previous research from the Data Justice Lab found that Avon & Somerset Police was using an analytics tool called Qlik Sense for predictive policing. The system produces percentage risk scores that estimate the likelihood of an individual offending or becoming the victim of a crime. It was implemented as a cost-saving measure at a time when the force faced £60m in budget cuts. 

The best-known example of citizen “scoring” is in China, where a nationwide “social credit” system is in development. The system will automatically punish citizens for infringements that fall short of criminal activity, such as the failure to repay debts. Punishments include restrictions on travel, exclusion from certain jobs, reduced internet access, and exclusion of offenders’ children from certain schools.  

Many in the West have described China’s social credit-scoring system as totalitarian. But Jeremy Daum, a legal scholar at Yale Law School’s China Centre, has said that coverage of the system is “a way of discussing our own situation from a safe distance”.

Duncan McCann says the parallels with China are now too clear to ignore. “The same infrastructure is being built up. It’s just at that final point of usage that we see slight variations.”
