Labour: Social media bosses should be held criminally liable for harmful content

The party has called for the scope of the Online Safety Bill to be expanded as it returns to parliament.

By Harry Clarke-Ezzidio

Labour has called for the Online Safety Bill to “go further” in its commitment to holding the bosses of social media companies to account for harmful content.

This week the bill, which aims to make the UK “the safest place in the world to be online while defending free expression”, returned to parliament after a five-month hiatus. MPs scrutinised a number of new amendments, most notably the dropping of the “legal but harmful” aspect of the bill, which would have required Big Tech platforms to stop the proliferation of dangerous content, including posts promoting misogyny, online bullying and eating disorders.

As the bill stands, companies and their directors will only face punishments and fines if they fail to cooperate with investigations by the regulator, Ofcom, into a platform’s failure to protect users. Fines can total up to £18m or 10 per cent of a company’s global turnover, whichever is higher.

But Labour wants to go further, and for the bosses of Big Tech companies to be held personally liable whenever their platforms fail to protect users. In a Commons debate on Monday (5 December) Alex Davies-Jones, shadow digital, culture, media and sport minister, told MPs that the bill’s proposed fines “can be a drop in the ocean” for the biggest platforms “that are often at the centre of the debate around online harm”.

“We’ve seen that with the likes of Google, Elon Musk and Mark Zuckerberg… money doesn’t matter to them. What matters to them is actually being held to account,” the MP for Pontypridd said. “It is vital that people are held responsible for issues that happen on their platforms. And we feel that criminal liability will do that.

“The bill in its current form only places individuals at the top of companies personally liable when a platform fails to supply information to Ofcom, so it misses the point entirely. Directors must be held personally liable when safety duties are breached. It really is quite simple.”

An amendment to the bill, tabled by Paul Scully, an under-secretary in the department, which would allow top executives to be held liable when their platforms do not adequately protect users, will be considered alongside other proposals – including what to do about “legal but harmful” content – as the government plans to recommit certain clauses to a Public Bill Committee. Any subsequent changes will then be further debated in parliament.

[See also: What are the latest updates to the Online Safety Bill?]

Most of the debate is likely to centre on what the government decides to do about the “Harmful Communications Offence” part of the bill – the part often referred to as the “legal but harmful” rule.

It would require “the largest, highest-risk platforms” to prevent all users from being exposed to harmful content relating to, for example, self-harm and eating disorders. It met heavy resistance from a number of pro-free-speech Tory MPs over censorship concerns. Kemi Badenoch, the International Trade Secretary, previously said that the bill was “legislating for hurt feelings”.

“Legal but harmful” has been replaced with what Michelle Donelan, the Culture Secretary, is calling a “triple shield”, under which platforms will be required to: remove illegal content (such as child sexual exploitation and terrorist content); remove material that violates a site’s terms and conditions; and provide users with controls to filter out certain types of content specified by the bill. The bill still requires platforms to protect children from viewing content that risks causing “significant harm”.

Parliamentarians have sought to tighten up the rules on social media moderation in light of a series of deaths linked to harmful online content. Ian Russell, whose daughter, Molly, aged 14, took her own life after being exposed to the “negative effects of online content”, a coroner concluded, recently expressed his disappointment at the bill being “watered down”.

The bill has been in a state of flux since it was first introduced last year, and doubts over whether it would eventually become law heightened during its long hiatus from parliament. But in her first session with the DCMS committee since taking up the post, Donelan promised that the bill will be passed in this parliament.

“That is something that I have personally discussed with the Prime Minister and personally discussed with the chief whip. So I’m comfortable in giving a fully fledged assurance on this because it matters too much,” Donelan said in parliament on Tuesday. “And I too, met with those bereaved families, and I’ve met with some of them before, including Ian Russell, and I can completely understand why they’re at the end of their tether and that they are concerned as to whether this bill will go through.

“But I want to give them… full assurance that it certainly will, because as I say it matters too much – and we’ve got to try and stop other families going through that horrendous pain that they went through.”
