Science & Tech
7 March 2017

Why did Facebook report the BBC to the police for pointing out it hosts sexualised images of children?

The story behind the row between the broadcaster and the social network.

By Amelia Tait

This morning the BBC broke a story about Facebook’s failure to remove “sexualised images of children” from the platform – an investigation sparked by the New Statesman’s coverage of the issue last year. The ensuing headlines, however, have focused on another element of the story: the fact that Facebook allegedly asked BBC journalists to send over examples of the offensive images, and then promptly reported the journalists to the police.

“It is against the law for anyone to distribute images of child exploitation,” Facebook said in a statement.

The BBC’s investigation criticises the social network for failing to remove “more than 80 per cent” of photos that the BBC brought to its attention via its report button. The BBC claims these items included: “images of under-16s in highly sexualised poses, with obscene comments posted beside them”, “groups with names such as ‘hot xxxx schoolgirls’ containing stolen images of real children”, and “an image that appeared to be a still from a video of child abuse, with a request below it to share ‘child pornography’.”

According to the BBC, in order to secure an interview with Facebook’s director of policy, it was asked to provide examples of the offensive material. When the BBC sent these over, it was reported to the UK’s National Crime Agency.

“When the BBC sent us such images we followed our industry’s standard practice and reported them to Ceop [Child Exploitation & Online Protection Centre],” said Facebook.

What’s missing?

As it stands, it is unclear why Facebook requested the images and why the BBC complied in sending them over, given that under the Protection of Children Act it is illegal to distribute indecent images of children.

While Facebook acted lawfully and properly in reporting the images, it isn’t clear why it asked them to be sent over in the first place, and why – after receiving the offending content – it cancelled an interview with the BBC.


So who’s in the wrong?

The chair of the culture committee, Damian Collins, has branded Facebook’s actions “extraordinary”.

“I think it raises the question of how can users make effective complaints to Facebook about content that is disturbing, shouldn’t be on the site, and have confidence that that will be acted upon,” he said.

The BBC also showed its investigation to Anne Longfield, the children’s commissioner for England, who said: “The moderation clearly isn’t being effective.”

A former Facebook executive, Elizabeth Linder, however, told BBC Radio 4 that people would be “quite worried if Facebook turned into an active, police-like state.” When questioned about why Facebook groups with offensive names (such as “hot xxxx schoolgirls”) weren’t immediately filtered out, she said: “It’s a fine balance there between, you know, freedom of speech – a lot of the names that actually seem potentially dangerous are, you know, are not.”

But isn’t Facebook normally fine with censorship?

What makes this story stranger is that Facebook is normally in the news for being over-zealous with its policies. Six days ago, it blocked an art broker from uploading a nude oil painting of two women, and last September it removed the iconic “Napalm girl” Vietnam war photo for featuring a “nude child” (the photo was later reinstated).

Facebook’s actions also don’t sit well with claims the social network made last week that it has artificial intelligence that can identify suicidal users and terrorists.

In its statement to the BBC, Facebook said it has reviewed the content referred to it and has now “removed all items that were illegal or against our standards.”

“This content is no longer on our platform,” the statement reads. “We take this matter extremely seriously and we continue to improve our reporting and take-down measures.” 
