MPs, not tech bosses, will decide what is “legal but harmful” content

Under the revised Online Safety Bill, parliament, rather than social media firms, will have the power to define the term.

By Sarah Dawood

The government will no longer let social media giants decide what constitutes “legal but harmful” content, as it revises the Online Safety Bill.

Ministers first introduced the bill to parliament in May 2021 and it has since been redrafted following recommendations from two parliamentary committees and the Law Commission. It is being reintroduced to parliament today (17 March) and ministers hope the revisions will shore up enough support among MPs for it to pass into law by the end of this year.

The bill seeks to regulate websites where users interact, such as social media platforms, by stopping the spread of illegal content such as child sexual abuse material and terrorist content. The biggest platforms, such as Twitter, Facebook and YouTube, will also have to prevent people from being exposed to “legal but harmful” content, which is yet to be defined but could include material promoting self-harm, eating disorders and misogyny. News publishers and their comment sections will be exempt.

The government says the bill strikes a balance between safety and free speech and will make the UK “the safest place in the world to be online”. Compliance will be overseen by the communications regulator Ofcom, which will have the power to fine companies up to 10 per cent of their global turnover, block access to their websites or prosecute individuals.

Not strong enough, or too far?

A point of contention has been the use of “legal but harmful”, with safety campaigners previously telling Spotlight that the term was “wishy-washy” and would allow social media giants with robust legal teams to avoid retribution. They also said that harms classed as “legal”, such as cyberbullying, can still have devastating effects.

Now, the government has confirmed that it will decide what is “legal but harmful”, and this will be set out in legislation approved by parliament, meaning decisions “are not at the whim of internet executives”.

Social media giants will be obliged to carry out “risk assessments” of what “legal but harmful” content is likely to appear on their platforms, set out in their terms and conditions how they will deal with it, and then enforce those terms. The government says this will help to protect free speech by “removing the threat of social media firms being overzealous”, as they will be obliged to remove only content that breaches their terms rather than anything that offends someone.

However, free speech campaigners argue that the changes still place too much power in the hands of tech companies, and that enshrining “legal but harmful” within legislation will be damaging to human rights. Ruth Smeeth, a former Labour MP and now CEO of free speech organisation Index on Censorship, herself regularly receives misogynistic, racist and threatening abuse.

“Cross-party MPs told the government to remove ‘legal but harmful’ because it would have a chilling effect on free speech, but the government has chosen to ignore them and us,” she says. “As a target of online abuse, I would much rather see additional funding for the police to prosecute illegal abuse, which would make a real difference rather than outsourcing decision-making to tech platforms.”

Jim Killock, executive director of digital freedom organisation Open Rights Group, adds that having government ministers define what is “legal but harmful” is a form of “state-sanctioned censorship”, and could mean that vulnerable people, such as domestic abuse victims, are deterred from sharing their experiences. “Failure to remove it will ban Brits from doing normal things like making jokes, seeking help and engaging in healthy debate online,” he says.

Cyber-flashing will be illegal

Under the revised bill, cyber-flashing – sending unsolicited sexual images or video – will also be made illegal, with perpetrators facing up to two years in jail. Spotlight understands that the government is considering whether to make epilepsy trolling – intentionally sending flashing images to someone with epilepsy – and the promotion of self-harm content illegal.

Three other criminal offences have been added to the bill in recent months: sending threats to rape, kill or inflict violence; sending messages intended to cause psychological or physical harm; and deliberately spreading harmful disinformation, such as fake Covid-19 treatments or false claims about the Russia-Ukraine war.

Paid-for scam advertising, such as ads with fake celebrity endorsement, will now fall under the bill, and users will also have more options to block content from anonymous accounts. This includes being able to stop unverified accounts messaging them on social media. The government has said it will not ban anonymous accounts outright, as this could negatively impact the personal safety of activists, whistle-blowers and domestic abuse victims.

Criminal offences for tech bosses

Tech bosses and senior managers will face up to two years in jail or a fine if they fail to comply with rules on providing information to Ofcom. These include refusing to hand over information; destroying evidence and information; providing false information; failing to attend meetings with the regulator; and stopping Ofcom from entering their offices.

These will come into force two months after the bill becomes law rather than the previously stated two years, making the possibility of prison more immediate. Websites will also now have a duty to report any cases they detect of child sexual exploitation and abuse to the National Crime Agency, replacing the existing “voluntary” reporting regime in the UK.

Child safety campaigners have long called for harsher criminal sanctions for company directors. Andy Burrows, head of child safety online policy at children’s charity the NSPCC, previously told Spotlight that he wants tech bosses to be liable if they fail to stop children being harmed, not just if they fail to provide information to Ofcom, and said the sanctions offered “bark but little bite”.

Concerns have also been raised about how Ofcom will cope with its new responsibilities. Speaking on ITV’s This Morning, Nadine Dorries, the Culture Secretary and the minister responsible for the legislation, said the government has put “huge amounts of money and resources” into the organisation to help it perform its duties.

Others have expressed concerns about free speech. Killock of Open Rights Group has likened taking down websites or imprisoning social media executives to Vladimir Putin’s ban on social media. Dorries said that while blocking platforms was the “ultimate sanction”, it would be a last resort: “We don’t want to get there, because these platforms are also a force for good,” she said.
