
Is facial recognition tech in public spaces ever justified?

The food store chain Southern Co-op faces a legal complaint over its use of surveillance technology.

By Sarah Dawood

Facial recognition technology is used willingly by millions of us every day as a form of authentication: to unlock our phones, retrieve passwords we’ve forgotten and pay for things.

But consent is key. When we happily hand our biometric data over to the likes of Apple and Samsung, we do so on the assumption that our data is encrypted, stored safely on our devices and accessible only to us.

The use of facial recognition tech in public places, however, is more contentious. Live facial recognition technology works by scanning a person’s face in real time, analysing it, converting the captured image to data and comparing it against a database of known faces to try to find a match. The technology has been found to be laden with bias: algorithms are more likely to misidentify people from already marginalised groups, such as ethnic minorities, who as a result are more likely to be over-policed.
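To make that matching step concrete, here is a minimal sketch in Python. It assumes each face has already been converted into a numerical vector, or “embedding” (in real systems this is produced by a trained face-recognition model); the function names, the 128-dimension vectors and the 0.6 threshold are illustrative assumptions, not a description of any vendor’s actual system.

```python
import numpy as np
from typing import Optional

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """How alike two face embeddings are (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(live_embedding: np.ndarray,
               watchlist: dict,
               threshold: float = 0.6) -> Optional[str]:
    """Compare one live capture against every entry on a watchlist.

    Returns the best-scoring identity if it clears the threshold,
    otherwise None. The threshold is the crucial tuning knob: set it
    too low and innocent passers-by are flagged (the misidentification
    risk described above); too high and genuine matches are missed.
    """
    best_id, best_score = None, threshold
    for identity, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id

# Illustrative usage, with random vectors standing in for the embeddings
# that a trained model would produce from real face images.
rng = np.random.default_rng(0)
watchlist = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
capture = watchlist["person_a"] + rng.normal(scale=0.1, size=128)  # noisy re-sighting
print(find_match(capture, watchlist))  # -> "person_a"
```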

Whether it is used by the police, in a shop or on public transport, people are generally wary of having their personal data captured and analysed in public. A survey of 4,109 UK adults conducted in 2019 by research organisation the Ada Lovelace Institute found that more than three-quarters of people – 77 per cent – were uncomfortable with facial recognition technology being used in shops to track customers, while more than half – 55 per cent – wanted the government to impose restrictions on the police’s use of it.

People were also particularly opposed to facial recognition tech being used by private companies for commercial benefit, such as in shops to deter thieves. This week, the food store chain Southern Co-op was criticised for its use of facial recognition surveillance in 35 of its branches. The privacy campaign group Big Brother Watch has complained to the Information Commissioner’s Office (ICO), the UK’s information rights body, saying it breaches data privacy laws.

In response, Southern Co-op says it has not breached any laws and only uses facial recognition in a “limited and targeted” way: to deter repeat offenders in branches with “higher levels of crime”, and to protect its staff and customers from “unacceptable violence and abuse”. It also says it displays signs in store to alert customers that surveillance is in use, which it argues implies consent. Via its technology provider Facewatch, it scans customers’ faces against a database of “known offenders”, looking for shoppers who have previously been banned from its stores. Staff are then alerted, so they can ask those customers to leave the shop if necessary.
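Expressed as a sketch, the decision flow the company describes is simple: alert staff on a watchlist match, discard the capture otherwise. The Python below is a hypothetical illustration of that stated policy only; the names and structure are invented, not Facewatch’s actual code.

```python
from typing import Callable, Optional

def handle_capture(matched_identity: Optional[str],
                   alert_staff: Callable[[str], None]) -> None:
    """Hypothetical alert-and-retention flow, per the retailer's account.

    A watchlist match is flagged to staff so they can intervene; a
    non-match is not stored, reflecting the company's claim that images
    are kept only for identified and evidenced offenders.
    """
    if matched_identity is not None:
        alert_staff(matched_identity)
    # No match: the capture is discarded here rather than retained.

# Illustrative usage:
handle_capture("banned_shopper_17", alert_staff=print)  # staff are alerted
handle_capture(None, alert_staff=print)                 # nothing is stored
```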

Nick Fisher, CEO of Facewatch, told Spotlight that the company “fully complies” with the criteria for facial recognition set out by the UK’s Data Protection Act 2018. “Retail clients come to Facewatch because they are experiencing significant levels of assault and abuse of their staff, as well as theft, and all other methods to prevent it have failed,” he said. “Any privacy intrusion is minimal and proportionate.”


The ethics around using surveillance as a preventative measure are murky. Privacy campaigners argue that its use is unjustified, with risks (such as misidentification of innocent people) outweighing the benefits. Silkie Carlo, director of Big Brother Watch, said in a statement that such instances are “dangerously intrusive, privatised spying”, describing the Southern Co-op case as “Orwellian” and “highly likely to be unlawful”.

“The supermarket is adding customers to secret watch lists with no due process, meaning shoppers can be spied on, blacklisted across multiple stores, and denied food shopping despite being entirely innocent,” she said. “This is a deeply unethical and frankly chilling way for any business to behave.”

But Southern Co-op argues that its facial recognition system is “secure” and “compliant” with the General Data Protection Regulation (GDPR), and that customers’ photos are only stored if they are “identified and evidenced as an offender” – otherwise, they are deleted immediately.

The ICO told Spotlight that there is a “high bar” to meet to justify the use of facial recognition technology in public places. The independent body is currently assessing the Southern Co-op case to determine whether it complies with data protection law.

“As with any new technology, it is crucial that people’s privacy is at the heart of any decisions to deploy live facial recognition so public trust and confidence are not lost,” an ICO spokesperson said.

The legislation in this area is confusing and fragmented, however. An independent review commissioned by the Ada Lovelace Institute, published this year, concluded that regulation of biometric data gathering relies on a “patchwork” of laws that have not kept pace with technological advancement, including data protection (such as the UK GDPR), human rights, discrimination, and criminal justice laws.

The review calls for the use of facial recognition technology in public places to be temporarily halted until new legislation is established. This would include specific laws covering all forms of biometric data (such as facial recognition, fingerprints, handprints, eye scans and DNA) and laying out a clear process for organisations to follow before using the technology. The Ada Lovelace Institute has also suggested a new regulatory body and a set of standards that take human rights impacts into account. The institute has now published a report with its full recommendations to government on reforming the governance of biometric technologies such as facial recognition.

Carly Kind, director of the Ada Lovelace Institute, told Spotlight that “urgent regulatory intervention” is necessary, adding that there are “serious gaps” in the UK’s laws and that the Southern Co-op case highlights “concerning uses” of new technology in this space.

“[Our research] found that the majority of the public are uncomfortable with [facial recognition’s] use outside of policing and without strong governance,” she said. “Continued inaction on the part of the government will only lead to more problematic deployments of these technologies, which don’t enjoy public trust and support, and more court cases like this one.”

