Social media companies can no longer leave children to their own devices

Big Tech must share user data so we can truly understand the risks of phone use for young people.

By Bernadka Dubicka

As a consultant child and adolescent psychiatrist, I am seeing more and more children who need specialist support as a result of using social media. In the most extreme cases, children and young people are starving themselves in the pursuit of thinness or presenting in crisis with suicidal ideation as a result of being abused online. 

We at the Royal College of Psychiatrists marked Safer Internet Day this week (6 February), so it’s an apt time to discuss the risks of being online for young people. While we have known for some time that social media can negatively impact children’s mental health, the data we have to prove this is shockingly lacking. What we need is full transparency from social media and gaming companies – but access is still very limited. 

This lack of data also makes it difficult to truly understand the impact of prolonged screen time on children and, consequently, to work out what support they need. Even as we become more aware of the dangers of harmful content, we still don’t know enough about the harms that result from the way young people use the internet.

Social media companies make their profits by keeping us on their sites for as long as possible. Their platforms are designed to promote continued engagement: algorithms manipulate our feeds to keep us glued to our screens, and “like” buttons reward us for posting even more content. Many of us will have experienced how a brief check of X, formerly known as Twitter, or Instagram can lead to hours of endless scrolling.

What may seem harmless at first can lead to overuse and even addictive-type behaviours. Algorithms can be particularly problematic: based on just a couple of clicks, they learn our behaviours and interests and then push material we want to see – or material we did not ask to see, which cannot then be unseen – and they become very difficult to control. This persuasive design deliberately reinforces digital habits, making us subconsciously reach for a device and refresh pages and profiles to check for new content.

Young minds are still developing and are continually being shaped by their environment. The effects of overuse, the potential for addiction and the impact on children’s mental health and development are therefore extremely worrying.

While children are banned from gambling, they can often still buy “loot boxes” – a common feature of video games that allows a player to pay real money for a box containing a random in-game item. This could range from a new outfit for their character to a weapon that makes completing the game easier. Because the player buys the loot box without knowing whether it contains a high-value or low-value item, there is a worrying parallel with gambling. Yet despite these similarities, legislation to ban loot boxes has been held up because the evidence of the harm they cause is deemed “limited”, even though they are already regulated in some European countries.

Social media companies keep data on how we use their sites, but they are reluctant to share it fully with independent researchers. They collect this data for their own knowledge and, ultimately, financial gain, and they have no commercial interest in sharing it. Yet this data would help us better understand the links between social media use and mental health, in children and adults alike.

The available evidence paints an alarming picture. Research from the Royal College of Psychiatrists in 2020 demonstrated links between the time young people spend using digital technology and their weight, sleep, mood and even thoughts of suicide and self-harm. Since our statement was published, a growing body of research has examined the potential harms of social media, excessive gaming and loot boxes.

It is clear that social media companies need to start taking greater responsibility for their products, and a key step is helping researchers understand how their sites affect young people.

The recent Online Safety Act tasks Ofcom with producing a report on how researchers might gain greater access to this information. However, I fear this will be a long, drawn-out process, and that it will take many years to achieve any meaningful change. In the meantime, social media continues to encroach on every aspect of young people’s lives, and, with so little understanding of its effects, we are quite literally leaving children to their own devices.

That is why we urgently need to devise a data-sharing framework that incorporates user consent. The likes of X, Meta and TikTok must be compelled to share representative, large-scale user-level data with researchers who remain independent of these companies and are not influenced by the pressure of profit margins. This is the only way we will be able to truly understand the risks of social media, as well as the benefits.

We cannot allow social media companies to treat children as guinea pigs, or as collateral damage for their bottom line.
