
Comment
11 October 2022

The Molly Russell verdict proves it – social media must be regulated

A ruling has found social media companies played a part in the death of a British teenager. Intervention must be dramatic and swift.

By Sarah Manavis

There are many reasons why I’m relieved not to be a teenager anymore – but I’m especially grateful not to be a teenager in 2022. When I was a young adult, social media wasn’t yet omnipresent in my life: I finished school just as Instagram was becoming popular; I was barely allowed on Facebook; I used Twitter largely to retweet jokes; mercifully, TikTok didn’t exist. The few platforms I used were comparatively basic and my usage of them was limited. I narrowly evaded a childhood immersed in social media.

The digital landscape for children today is a horror show. Evidence suggests there are real harms for teens who spend too much time online, from isolation to negative body image, triggering depression and anxiety on a large scale. There are also concerns and ongoing investigations surrounding child predators on social media apps, which is all the more alarming given how easy it is for primary school-aged children to evade platform age restrictions. And though kids’ online safety has arguably never been spoken about more than in the past year – from US congressional hearings to proposed regulatory laws in several countries – platforms have so far managed to deflect accusations of direct responsibility for the danger many believe they put their young users in.

But a landmark ruling in the UK has found social media companies played a part in the death of a British teenager, Molly Russell, who died from an act of self-harm in November 2017, aged just 14. The court case investigated the role Instagram and Pinterest, specifically, played in Russell’s death. The senior coroner concluded that Russell died as a result of “suffering from depression and the negative effects of online content”. Following the ruling, Russell’s father said social media platforms sent his daughter on a “demented trail of life-sucking content” and accused them of “monetising misery”. The verdict comes at a time when the rate of litigation against social media platforms appears to be accelerating. The most prominent lawsuit concerns the blackout challenge, a purported TikTok trend. The suit claims two children died when attempting the challenge after it was promoted to them on the app, and cites several others who died doing the same thing.

The details of Molly Russell’s case are grim and unsettling. Speaking to the Guardian’s Today in Focus podcast, Dan Milmo, the paper’s global technology editor, said that of the roughly 20,000 Instagram posts Russell had viewed, shared and liked in the final six months of her life, 2,100 were related to anxiety, depression, self-harm or suicide – and that Instagram had also pushed 34 specific “depressive accounts” directly to Russell. (Pinterest sent her an email suggesting “depression pins you might like”.) In the same episode, Russell’s father explained that when members of his family reported content Russell had seen, Instagram told them the content wasn’t in breach of its guidelines and would not be taken down. The senior coroner also suggested that some of the online content sought to discourage users from seeking medical attention.

The ruling is a major turning point in how Silicon Valley is scrutinised – there is now legal evidence that social media contributed to a child’s death, or, as the senior coroner put it, that it did so “in a more than minimal way”. But while these details are chilling and depressing, and should result in a dramatic change in how these companies operate, they are unlikely to come as a surprise to anyone who understands how social media recommendation algorithms work.


Social media algorithms are designed to give users more of what they are already consuming in order to keep them interested and logged on to the platform in question. This is, ultimately, how apps become addictive and, therefore, remain profitable. TikTok’s algorithm is notoriously adept at feeding users highly tailored content at a rapid pace – and other platforms have tried to replicate it.

Most of us will recognise the effects of the algorithm. Let’s say you have looked at a number of puppy videos or pictures of holiday destinations: your social media feeds begin to show you more of that content in a bid to keep your attention. However, these algorithms are currently undiscerning, so if a user is looking at something harmful – for example, content that recommends restrictive diets or unsafe fitness tips – the algorithm may continue to push more of that content, suggesting the user follow, say, emaciated fitness influencers or accounts that share riskily low-calorie meal plans, despite the obviously negative impact it will have on the person consuming it. This is a fundamental part of how these platforms function, one that social media companies are unlikely to want to sacrifice – and which is at odds with preventing cases such as Russell’s from happening again.
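To make the mechanism concrete, here is a minimal, purely illustrative sketch of an engagement-driven ranking loop of the kind described above. The function names, data shapes and scoring rule are hypothetical inventions for this example, not a description of any real platform’s system; real recommenders are vastly more complex, but the core loop – rank by predicted engagement, with no notion of whether a topic is harmful – is the point.

```python
from collections import Counter

# Hypothetical illustration: a naive engagement-driven recommender.
# It ranks candidate posts purely by how closely their topics match
# what the user has recently engaged with.
def recommend(engagement_history, candidate_posts, k=5):
    """Return the k candidates most similar to recent engagement."""
    # Count how often each topic appears in the user's recent activity.
    topic_weights = Counter(
        topic for post in engagement_history for topic in post["topics"]
    )

    def predicted_engagement(post):
        # Score = overlap with the user's existing interests.
        return sum(topic_weights[t] for t in post["topics"])

    return sorted(candidate_posts, key=predicted_engagement, reverse=True)[:k]

# Toy usage: a user who has engaged with dieting content gets more of it.
history = [{"topics": ["fitness", "dieting"]}, {"topics": ["dieting"]}]
candidates = [
    {"id": 1, "topics": ["dieting", "extreme-diets"]},
    {"id": 2, "topics": ["puppies"]},
]
print(recommend(history, candidates, k=1))  # the dieting post is ranked first
```

What matters in the sketch is what is absent: nothing in the scoring distinguishes puppy videos from harmful content, so whatever a user engages with is simply amplified, and each recommendation they act on feeds back into the next ranking.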

Teaching algorithms to filter out this dangerous content and hiring large teams of moderators to track which users may be overexposed to it are potential ways to tackle this problem. But we shouldn’t just leave social media companies to self-moderate. Around the world, government regulation of these platforms is nearly non-existent. In the UK, the tragedy of the Russell verdict is made even greater by the fact that the Online Safety Bill – a supposedly crucial intervention in online harms, which has nevertheless received widespread criticism for its light touch – has experienced an unending series of delays (Russell’s father was told it would not be seen until Christmas at best). The bill grows more and more out of date with each passing day.

Perhaps no generation will be failed by the internet as badly as today’s young people. We currently sit at a depressing intersection, where the greatest degree of digital exposure meets what will inevitably be the fewest laws regulating the internet. The Russell verdict demands immediate change. If intervention is not dramatic and swift, there may be more, equally preventable, tragedies to come.

[See also: Time spent on social media isn’t necessarily bad for kids – it’s how they use their screen time]

