Nicci Astin was scrolling through a Facebook group full of tips about quitting smoking when she spotted something strange. There, among the suggestions and anecdotes, was an image of a child being abused, posted by a male Facebook user.
“I went to look at his profile,” she tells me now, years later, “because I thought he was probably just another troll.” But he wasn’t. The man’s profile held more images of the same kind, and when she clicked through to the profiles on his “friends” list, their pages were filled with the same.
This was how Astin first stumbled into the world of paedophiles and child abusers who swap images, connect, and even groom children on social networks. She isn’t alone: other ordinary people have been sucked in, too. As Katie Ivall, who has searched for paedophiles online since her own daughter was targeted, told the BBC last week: “This is the dark side of the internet”.
Astin’s first port of call was the police, who, she says, “told me: ‘It’s on Facebook, what do you want us to do about it?’” She spoke to others who had found similar material, and together they formed a group with the aim of collecting information about the people posting the pictures, then passing it on to the police. Several members posed as 13- or 14-year-old girls and spoke to men until they revealed a phone number or asked to arrange a place to meet.
This may seem extreme, but it was prompted by the legal and technological tangle faced by those hoping to report criminals who operate online. Facebook’s content policy states that “solicitation of sexual material, any sexual content involving minors, threats to share intimate images and offers of sexual services” are banned from the site. Yet Astin says that known active paedophiles on the site would simply pop up with new accounts, days after their original accounts were deleted.
Oisin Sweeney, who was also a member of the group for a short time, writes in his book about dark internet subcultures that a man named Paolo Ghelardini was a “top priority” for the group. When police arrested him, they found 9,500 photos and 1,000 images of children in his home. He had cycled through at least 19 Facebook accounts between January 2010 and May 2011.
The group also claims a more direct role in the arrests of several individuals who were sharing pictures of children, some of whom were actively abusing children alongside their online activities. It did this by passing information to the Internet Watch Foundation (IWF), an organisation that works directly with Facebook to keep children safe online. Astin’s group has also tipped off the Child Exploitation and Online Protection Centre (CEOP), now a command of the National Crime Agency.
Astin says that several members of the group posed as a teenager online to speak with John Huitema, a Dutch man based in Glasgow, and arranged a meeting. Sweeney then passed on the details to the CEOP. (The CEOP says it welcomes tip-offs from civilians, but cannot comment on individual cases.)
When police arrested him, Huitema was in possession of 7,333 illegal images. As the group suspected, he had also been abusing a two-year-old girl and posting photos of the abuse online. He was sentenced to four and a half years in prison in July 2012, and will be deported to Holland on his release.
The CEOP mostly receives tip-offs about social media paedophilia not from individuals like Sweeney and Astin, but from the National Center for Missing and Exploited Children (NCMEC) in the US, since under US law Facebook and other sites must pass on details of child abuse images. In 2010, the CEOP received 400 referrals a month from the NCMEC; now it receives around 1,800. Improved reporting systems may account for part of that, but a more than four-fold rise suggests the illegal activity itself is growing.
Ordinary people who believe it’s their duty to hunt down paedophiles have become a bit of a cliché, thanks in part to TV shows like NBC’s To Catch a Predator. Yet it’s clear why people like Astin, who has also campaigned on child abuse and, more recently, on the case of Daniel Pelka, who starved to death at the hands of his parents, feel trapped by the situation: they become aware of an account, report it, then watch as it fails to disappear, or as an identical account pops up within days. It’s impossible to look away, especially when it’s possible that children are being actively abused by those posting the images.
***
The IWF works directly with Facebook to try to combat abuse on the site, and said in a statement: “As an IWF member, Facebook has zero tolerance for child sexual abuse… Facebook is one of the leaders in the field who uses new technology to combat the problem.”
A spokesperson for Facebook told me that this technology includes PhotoDNA, which computes a digital fingerprint of each uploaded image and cross-references it against a registry of known child abuse images. This can stop existing images from proliferating across the web, but by design it can’t recognise new ones. Facebook also has a “single point of contact” with law enforcement, which allows the site to pass on illegal content directly or to help with existing investigations.
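To make the registry idea concrete, here is a minimal sketch in Python. It is not PhotoDNA itself, whose perceptual hash is proprietary and tolerates resizing and re-compression; this version uses an ordinary cryptographic hash, so it only catches byte-for-byte identical copies, and the registry contents below are invented placeholders.

```python
import hashlib

# Illustrative stand-in for a registry of fingerprints of known illegal
# images. Real registries are maintained by bodies such as the IWF and
# NCMEC; the value below is a made-up placeholder.
KNOWN_IMAGE_FINGERPRINTS = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a fixed-length fingerprint of the raw image bytes.

    PhotoDNA uses a perceptual hash that survives re-encoding;
    SHA-256 here only matches exact copies.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Screen an upload against the registry before it is published."""
    return fingerprint(image_bytes) in KNOWN_IMAGE_FINGERPRINTS
```

The limitation noted above falls straight out of this design: an image can only be flagged if its fingerprint is already in the registry, so newly created abuse images pass through unrecognised until they are identified and added.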
Both Sweeney and Astin reported instances where they flagged images involving children as offensive, only to receive an automated message informing them that the images “did not breach Facebook’s community guidelines”. In last week’s BBC investigation into paedophiles sharing non-sexualised images alongside sexual comments online, the reporter experienced the same thing.
Part of the problem is that context can strongly affect an image’s meaning: the BBC piece describes a picture of a “girl of 10 or 11 in a vest”, which in most circumstances would not require removal. This is a challenge for Facebook – it would take vast resources for human moderators to review every such post and make a judgement call. However, a Facebook spokesperson tells me that the site is conducting a thorough investigation into this issue, and plans to evolve its approach to fighting child abuse on the site.
Astin and Sweeney both believe that secret groups are a particular problem. In contrast to Facebook’s Open and Closed groups, they aren’t searchable, and you must be invited to join in order to see them or their contents. Astin, Sweeney and the BBC all found that paedophiles were swapping images via this type of group. “The worst things that I’ve ever seen in my life have been within secret groups,” Astin tells me.
The groups are identifiable to users via keywords that have developed in the community. In his book, Sweeney reports that there is an entire “paedophile subculture that has its own words and codes, its own symbols, and its own heroes”. [Images: symbols used within the community, from an FBI document leaked through WikiLeaks.]
“PTHC”, short for “pre-teen hardcore”, was a common acronym, and it continued to be used after Facebook attempted to block it: Astin said users simply “put dots between the letters” to get round the ban. A Facebook spokesperson told me that the site works with agencies to block users who search for known child-exploitation terms, and updates the list of terms regularly.
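The cat-and-mouse dynamic Astin describes is easy to see in code. Below is a minimal sketch, with an invented blocklist, of why a filter that matches literal strings fails against the punctuation trick, and of the obvious countermeasure: normalising a query before checking it. Real systems are considerably more sophisticated.

```python
import re

# Invented, minimal blocklist for illustration; real lists are maintained
# with partner agencies and updated regularly, as Facebook describes.
BLOCKED_TERMS = {"pthc"}

def naive_is_blocked(query: str) -> bool:
    """Literal matching: defeated by inserting dots or dashes."""
    return query.lower() in BLOCKED_TERMS

def normalised_is_blocked(query: str) -> bool:
    """Strip everything that isn't a letter before matching, so dotted
    variants collapse back to the banned term."""
    return re.sub(r"[^a-z]", "", query.lower()) in BLOCKED_TERMS

print(naive_is_blocked("p.t.h.c"))       # False: the dots evade the ban
print(normalised_is_blocked("p.t.h.c"))  # True: normalisation catches it
```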
***
In 2011, the police asked Astin and Sweeney’s group to stop. In trying to catch those sharing the images, the group were accessing illegal material and impersonating children – both of which police would, understandably, rather do in-house. In the same year, a documentary made by Mark Williams-Thomas (the ex-policeman who helped expose Jimmy Savile) revealed that police in anti-child abuse units are now posing as children themselves in order to catch abusers online.
Yet, consistent with the figures provided by the CEOP, Astin says that the problem seems to have worsened since 2011. “I obviously can’t pose as fake accounts anymore, but I can look with my own account, and as soon as you find one person you find another and another. There’s thousands on there, it’s absolutely horrific.”
So what could help? The Metropolitan Police has stated that “the distribution network for child abuse imagery must be closed if the production of material which sexually exploits children is to be effectively controlled”. This is a worthy aim, but it gets far more complicated when you realise that in a larger sense, this refers to the networks we all use: Facebook, Twitter, even Instagram.
Astin says she believes Facebook should take “more responsibility” for what individuals post on its site. One way to do this would be to strictly enforce a real-name policy, linking people’s accounts to their real identities, but that would arguably just push criminals to other sites without such safeguards. The only way forward is for sites to work closely with police to catch perpetrators, rather than simply deleting their online identities.
Thanks to changing policies within social networks, this seems to be happening now on a scale never seen before, which perhaps means the civilian paedophile hunters can leave law enforcement to do its job. Part of the problem with reporting on or finding these accounts is that, under UK law, viewing them on a computer screen counts as “reproducing” or “making” indecent images of children. Engaging directly with the paedophiles themselves is equally risky from a legal point of view.
I ask Astin if this bothered her at all while she was still hunting paedophiles online. “We’d give all the fake accounts and passwords to the police so they understand for themselves what we were doing.” But if they didn’t? “I’d rather get in trouble than have a child being abused. There are people sharing pictures, but then there are people asking for new pictures. You know a child is being abused there and then. I’d spend my day in court if it meant I could stop that from happening.”