Trolls are white men alone in their bedrooms, right? We all know that. We also know that women are generally targeted, harassed and trolled online far more than men.
But a new study by Demos, released to coincide with the launch of Yvette Cooper’s Recl@im the Internet campaign against online misogyny, suggests otherwise. Based on 10,000 tweets sent by 6,500 users in the UK over three weeks, an algorithm judged that around 50 per cent of those sending aggressive tweets containing the words “slut” or “whore” were, in fact, women.
This was met with incredulity by women who had experienced abuse online themselves, mainly from men, and who doubted whether an algorithm could really distinguish abuse from non-abuse – or tell whether users were male or female.
I really think Demos is brilliant and interesting but this research barely scratches the surface of online misogyny https://t.co/OD69pitzIC
— Sophie Warnes (@SophieWarnes) May 26, 2016
This is misleading. Yes, women used those words just as much, but it’s not women sending rape threats, is it? https://t.co/DbEV2vr18r
— Holly Brockwell (@holly) May 26, 2016
I spoke to Alex Krasodomski-Jones, a researcher in the Centre for the Analysis of Social Media at Demos who worked on the study, to get to the bottom of whether women really are sending as many misogynistic tweets as men.
Tweets containing “Slut” and “Whore” don’t represent all online misogyny
Krasodomski-Jones says that these words were chosen as a result of a 2014 Demos study into misogyny online, which found that slut and whore were “by far the most commonly used” words in these types of posts. As a result, the researchers analysed tweets containing them to “scratch the surface” of online misogynistic abuse.
“This was a very short piece of work for Yvette Cooper,” Krasodomski-Jones says, “And is an incredibly limited view of misogyny online. This is by no means a complete picture.”
The algorithms were almost as good at identifying misogynistic abuse as humans are
The researchers used four algorithms. One identified (and discounted) tweets to do with porn, while another filtered out tweets where the terms were self-directed: “I’m such a slut”, for example.
Then, the researchers themselves marked up around 300 tweets by gender and aggressiveness, and then ran the algorithms over them as a kind of “exam”, Krasodomski-Jones tells me. Compared to human judgement, the algorithm for finding aggressive tweets was 82 per cent effective, while the one for gender was 85 per cent effective.
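The “exam” described here is essentially a measure of agreement between the algorithm’s labels and human judgement. As a rough sketch only – the article doesn’t say what tools Demos actually used, and the tweets and labels below are invented for illustration – that check might look like this:

```python
# Hypothetical sketch of scoring a classifier against human-labelled tweets,
# in the spirit of the "exam" Demos describes. All data here is invented;
# this is not Demos' actual method or code.

def accuracy(predicted, human):
    """Share of items where the algorithm agrees with the human label."""
    matches = sum(p == h for p, h in zip(predicted, human))
    return matches / len(human)

# Toy labels for ten tweets: True = "aggressive", False = "not aggressive"
human_labels     = [True, True, False, False, True, False, True, False, False, True]
algorithm_labels = [True, False, False, False, True, False, True, True, False, True]

print(f"{accuracy(algorithm_labels, human_labels):.0%} agreement")  # prints "80% agreement"
```

An 82 per cent score on the aggression classifier, then, simply means the algorithm matched the human markers’ verdicts on roughly four tweets in every five.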
There’s an important caveat here, though: victims of online abuse often suspect that some of the “women” abusing them are in fact men operating fake accounts, which may well fool humans and algorithms alike.
Context is key
Knowledge of two major Twitter events that took place during the 23 April – 15 May window of the study throws a lot of light on the unexpected gender split.
Krasodomski-Jones says that reactions to Azealia Banks’ abuse of Zayn Malik were the “top hit” in the study, because “all these girls who love Zayn jumped onto Twitter to call her a slut and a whore”. The same happened when Beyoncé fans believed they’d identified “Becky”, the woman with whom Jay-Z cheats in the artist’s Lemonade album. Krasodomski-Jones describes this as a form of “protectiveness” on the part of fans.
What this suggests is that women are abusing each other online, especially around key events like these, and in quite a different way from the way men abuse women. Our pop culture writer Anna Leszkiewicz has written in the past about small groups of apparently female fans who troll and harass each other, and stars, online – accounts like these may be contributing to the statistics.
Whether you identify this as misogyny specifically is another question: it’s certainly misogynistic language, but the study may shed more light on the rise of slut-shaming insults than on how hatred of women actually manifests online. Rape and death threats from men are at the sharp end of online misogyny, and this study does little to illuminate that much scarier (and illegal) form of abuse.
The takeaway? We are all potential trolls
Krasodomski-Jones is keen to highlight that he knows the “cabal of angry white men hidden behind a computer”, the troll stereotype, does exist, but says that this study didn’t manage to zero in on that group in particular. What it did indicate is that many of the people sending this specific type of abuse online don’t quite fit the media narrative of, say, men abusing female journalists or prominent feminists.
If Cooper’s Recl@im the Internet campaign is hoping to tackle all misogynistic abuse, as well as harshly punishing its illegal forms, this isn’t a bad takeaway: as with rapists, it’s unhelpful to see all “trolls” as a specific, evil group who are very different from “ordinary” people. As the study shows, far more ordinary people than we realise, of all genders, are willing to abuse others online with little provocation. If we can figure out a way to combat this effect, then perhaps we’d be able to reduce all types of misogyny online.