The GamerGate “scandal” continues to rumble onwards, with furious video game players continuing to protest that their misogynistic, abusive, death threat-generating consumer protest movement is actually about “ethics in video game journalism”. Kyle Wagner at Deadspin, in his story about the latest woman to be driven from her home by death threats, is correct to note that this is a kind of rebirth of America’s post-Reagan Culture Wars – a reactionary cultural group, threatened by the suggestion that maybe the things it defines itself by shouldn’t be centred entirely on its own needs and demands, is lashing out with conspiracy theories and hate.
For those who write about GamerGate – be it on a website or on social media – it’s clear that among this small, loud group of (almost entirely white and male) people, there are sub-groups which approach the issue with different tactics. There are those who make up the dark, cold star at the centre of this mess, inventing lies and generating the abuse; and then there are those orbiting on the icy edge of this system, who often sincerely believe that they are part of a consumer boycott movement, and who see no contradiction in condemning the hatred they see while putting forward the false arguments that are used to justify the abuse in the first place. Some might call them useful idiots. (And sometimes, it’s possible to feel sorry for them. Rarely, but it is.)
Tweet anything critical of the larger movement with the hashtag “#GamerGate” and, very quickly, you will find yourself hit with a torrent of defenders arguing their case, armed with myriad videos and screenshots as evidence. Trying to engage with any of this group is infuriating – break your silence to debate one point by demonstrating that their position rests on misunderstanding or ignorance, and they switch to a different issue. Challenge that one and they move to another, and then another, and then they might even circle back to the first point, phrased slightly differently. It’s tedious and tiring, and it wastes an enormous amount of time.
The natural human instinct, when faced with something that’s a massive time waster, is to automate it. Thanks to a chatbot called Eliza, that’s what happened yesterday:
(link)
Eliza (named for the character from Pygmalion) is an example of a Twitter bot, a very primitive form of artificial intelligence plugged into a social network and programmed to perform a handful of tasks. Mostly, these bots are run by spammers – they constantly search for tweets that mention certain words or phrases, or which use certain hashtags, and then reply out of nowhere with a link and something to tempt the user into clicking it. Some are jokey, though, like @RedScareBot, which wears an avatar of Senator Joe McCarthy and tweets anti-communist condemnation at users whose tweets include words like “socialism”.
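For the curious, the search-and-respond pattern these bots rely on really is only a few lines of code. Here’s a rough sketch in Python using the tweepy library – the credentials, search term and reply text are placeholders, and the calls shown match tweepy’s older (pre-4.0) interface, which has since renamed some of them (api.search became api.search_tweets, for instance):

```python
# A minimal sketch of a search-and-respond Twitter bot, in the spirit of the
# spam and joke bots described above. All credentials and the reply text are
# placeholders; the calls match tweepy's pre-4.0 interface.
import time
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

replied_to = set()  # remember which tweets we have already answered

while True:
    # Look for recent tweets containing the trigger word or hashtag.
    for tweet in api.search(q="socialism", count=20):
        if tweet.id in replied_to:
            continue
        # Reply "out of nowhere" to the author of the matching tweet.
        api.update_status(
            status="@{} Comrade detected.".format(tweet.user.screen_name),
            in_reply_to_status_id=tweet.id,
        )
        replied_to.add(tweet.id)
    time.sleep(60)  # be gentle with the rate limits
```

That loop – search, reply, sleep, repeat – is essentially all a spam bot or @RedScareBot needs.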
While most bots are relatively easy to code and rely on little more than search-and-respond for instructions, Eliza’s a bit more complex. It (or she?) was first written by MIT computer scientist Joseph Weizenbaum in the mid-1960s, and it deliberately models psychotherapy sessions – Eliza will ask the user what’s wrong, and will interpret and respond to what they say by matching their answers against a set of keyword patterns and stock responses held in a script. You can try it out for yourself. Eliza is the grandmother of every customer service online help box with a robot on the other end, and, now, the perfect foil for the robotic repetition of GamerGate talking points by its activist army, finding those using the #GamerGate hashtag and asking them for more information:
(link)
(link)
(link)
(link)
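To give a sense of what Eliza is doing under the hood, here’s a toy sketch of the script-matching idea in Python. It is nothing like Weizenbaum’s full program, and it isn’t the code behind the Twitter bot above (which I haven’t seen) – the patterns and responses are invented purely for illustration:

```python
# A toy illustration of Eliza's approach: match the user's words against a
# small script of keyword patterns, then reflect their own phrasing back as
# a question. Weizenbaum's original script was far richer than this.
import random
import re

SCRIPT = [
    (r"I need (.*)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"I am (.*)", ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"because (.*)", ["Is that the real reason?", "What other reasons come to mind?"]),
    (r"(.*)", ["Please tell me more.", "How does that make you feel?"]),  # catch-all
]

def eliza_reply(message):
    for pattern, responses in SCRIPT:
        match = re.search(pattern, message, re.IGNORECASE)
        if match:
            # Echo the matched fragment back at the speaker, Eliza-style.
            return random.choice(responses).format(*match.groups())

print(eliza_reply("I need everyone to know it's about ethics in games journalism"))
# e.g. "Why do you need everyone to know it's about ethics in games journalism?"
```

The open-ended questions do the real work: the bot never has to understand anything, it just has to keep the other side talking.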
More than a day later, this is still going on – wasting their time, and giving those who have had to constantly defend themselves some breathing space. It’s wonderful.
Alan Turing proposed that an artificial intelligence could qualify as capable of thought if a human subject, in conversation with it and another human, could not tell the two apart; the strange thing about the Eliza Twitter bot is that it comes across as no more machine-like than those who keep repeating their points over and over and over, ad nauseam. It’s difficult to decide who’s failed the Turing test here.