Why Ofcom needs clear powers to audit Big Tech’s algorithms

Inspection of algorithms in social media is an essential, but also frequently overlooked, part of the Online Safety Bill.  

By Jenny Brennan

The Online Safety Bill is a landmark piece of national legislation – one that will attempt to lay down in law a set of rules about how online platforms should address online harms. It is the UK’s first major step towards regulating Big Tech platforms, and the product of a question the government has been grappling with for years: how do you prevent, for example, online harassment while also protecting free speech?

This week, the joint parliamentary committee on the Online Safety Bill returned its recommendations, having taken evidence from civil society, academia and high-profile figures such as the Facebook whistleblower Frances Haugen, the Wikipedia founder Jimmy Wales and the footballer Rio Ferdinand. Its recommendations will inform the government as it prepares the final draft of the bill to be taken to parliament – and shape the public discussion around it.

Ofcom’s interim chair, Maggie Carver, recently described online safety as a game of whack-a-mole with “too many moles to whack”. There are many legitimate views on what the bill should focus on – from concerns about online harassment, to child safety, to freedom of speech – and most recently the call from rights groups for it to include protections for women and girls.

But amid the headline-grabbing concerns associated with tackling online harms, one thing we should agree on is that the inspection of social media algorithms is an essential, but frequently overlooked, part of the bill.

Whatever a particular organisation’s or policymaker’s priorities may be in tackling online harms, none of them can be achieved without Ofcom – the regulator given responsibility for online safety – having the appropriate powers to look under the bonnet of the algorithms that power social media and search platforms.

Christopher Wylie, a former employee of Cambridge Analytica, has laid out the problem clearly: “We fail to understand that these are products of engineering. We do not apply to tech companies the existing regulatory principles that apply in other fields of engineering. We do not require safety testing of algorithms.” In her testimony to the US Congress, Haugen compared the current regulatory environment for social media platforms to “the Department of Transportation regulating cars by watching them drive down the highway”.

Regulatory inspection (also known as “auditing”) of social media will involve being able to test the algorithms and processes that social media platforms use to recommend and moderate content – what are they prioritising, hiding or flagging in your news feed? And are they doing what they claim? Independent auditing work by journalists at The Markup has identified that Facebook was still recommending anti-vaccine groups to users, even after it said it had stopped.


Just last week, researchers used a “sock puppet audit” (creating fake accounts – sock puppets – with particular profiles to observe what the algorithm recommends to them) to identify that Instagram recommended accounts full of disturbing images of underweight women to users who showed an interest in getting thin.
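For readers curious what such an audit involves in practice, here is a minimal, purely illustrative sketch in Python. Everything in it is hypothetical: the get_recommendations() function stands in for whatever data-collection method an auditor would actually use (browser automation, a research API or platform-provided access), and the persona and topic labels are invented for the example. The point is simply that you compare what the system serves to differently profiled accounts.

```python
import random
from collections import Counter

# Illustrative sketch of a "sock puppet" audit. The recommender behaviour is
# simulated here rather than observed; in a real audit, get_recommendations()
# would record what the platform actually shows each sock-puppet account.

def get_recommendations(persona_interests, n=20):
    """Simulated recommender: skews its output towards a persona's interests."""
    pool = persona_interests * 3 + ["news", "sport", "cooking", "music"]
    return [random.choice(pool) for _ in range(n)]

def run_sock_puppet_audit(personas, rounds=50):
    """For each sock-puppet persona, collect recommendations repeatedly and
    tally how often each content category is surfaced."""
    results = {}
    for name, interests in personas.items():
        tally = Counter()
        for _ in range(rounds):
            tally.update(get_recommendations(interests))
        results[name] = tally
    return results

if __name__ == "__main__":
    personas = {
        "control": ["music"],
        "weight_loss_interest": ["weight_loss"],  # persona signalling an interest in getting thin
    }
    for name, tally in run_sock_puppet_audit(personas).items():
        share = tally["weight_loss"] / sum(tally.values())
        print(f"{name}: {share:.1%} of recommendations were weight-loss content")
```

In a real audit the hard work lies in the data collection and the statistical comparison, but the structure is the same: controlled personas in, recommendations out, differences measured.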

It is significant, then, that the joint committee’s recommendations explicitly call for the bill to ensure that Ofcom has these auditing powers. Without them, Ofcom won’t know what the platforms are actually doing, or whether they are complying with the law.

The Ada Lovelace Institute’s submission to the committee emphasised this need for Ofcom to be given the power to perform technical audits as part of inspections. When the Online Safety Bill goes to parliament in the new year, we expect to see inspection and auditing included. We also hope the bill will recognise that Ofcom must be supported to use these powers, by ensuring it is equipped to conduct such complex, technical inspections. These will require a mix of skills and tools – from interviewing and reviewing documentation, to examining code and developing software that can recreate and analyse user experiences on social media.

[See also: NYU’s researchers are the latest victims of Big Tech’s war on scrutiny]

However, a strong inspection ecosystem requires more than a regulator acting alone. Journalists, academics and civil society organisations all play a role in sounding the alarm about the harmful practices of social media platforms – sometimes even as those platforms try to block them. These independent actors need support, responsible access for research, and reassurance that they will not risk breaking the law when they hold technology firms accountable for the harms their services may cause or enable.

All of this will matter for more than just online safety. Giving regulators information-gathering powers suited to modern technology, and equipping them to use those powers, will be crucial across domains – from protecting against discriminatory hiring platforms to preventing algorithmic price-fixing. The majority of the UK public think the tech sector is regulated too little, and many see regulators as having the most responsibility for directing the impact of technology on people and society.

If we want people to have power over what they see on the internet, and to be kept safe from harm, we need regulators that are able to uncover the facts about the algorithms controlling social media platforms.

[See also: What’s illegal online must also be illegal offline, says Damian Collins]

The Ada Lovelace Institute is an independent research institute with a mission to ensure data and AI work for people and society.

