31 January 2017 (updated 1 February 2017, 3:21pm)

Why we should stop using the phrase “lone wolf”

It is time the definition of "online radicalisation" was broadened to include the indoctrination of lonely, young white men. 

By Amelia Tait

Within a day of the fatal shooting of six people at a Quebec City mosque, Canadian public safety minister Ralph Goodale had described the suspect, 27-year-old Alexandre Bissonnette, as a “lone wolf”.

Although the term ostensibly refers to an individual acting without help from a group, it is now often used to downplay acts of terrorism committed by white, non-Muslim perpetrators. Anders Breivik, the Norwegian right-wing white supremacist who killed 77 people in 2011, was consistently referred to as a “lone wolf” in the media. Mohamed Lahouaiej-Bouhlel, the Tunisian who killed 86 people when he drove a truck into a crowd in Nice last July, was not.

Used directly after an attack, the phrase “lone wolf” is also laden with other meanings. It means: this man should not be treated as a terrorist. It means: we have found and isolated the problem. Most importantly it means: you are safe. He was alone. There is no one else.

More often than not, however, this is simply untrue.

Claims that Bissonnette, who has not entered a plea, was a right-wing “troll” are now gathering media attention. Francois Deschamps, the owner of a “Welcome to Refugees” Facebook page, claimed the student used to leave hateful posts on the site. Similarly, Breivik’s use of the internet has been studied extensively, and there are suggestions that he was radicalised online. The disenfranchised young men who fit the profile of terrorist shooters worldwide are no longer as isolated as they used to be. Thanks to internet radicalisation, lone wolves have found a pack.

“When used in common parlance ‘online radicalisation’ tends to focus on Islamic communities,” says Alex Krasodomski-Jones, a researcher at the Centre for the Analysis of Social Media at the think tank Demos. “But absolutely right-wing communities have existed on the internet almost since its conception. They operate extremely effectively online and I don’t think you would be hard-pressed to describe them as places where people could radicalise.”

Last November, the writer and TED speaker Siyanda Mohutsiwa pointed out – in a series of viral tweets – that we need to reconsider what we think of as “online radicalisation”.


For many years, lonely young men have vented their romantic and sexual frustrations online, which has led them to form extreme anti-feminist and misogynistic groups such as Reddit’s r/TheRedPill and r/Incels (short for “involuntarily celibate”). Similarly, white supremacist websites such as Stormfront have long been a home for extreme right-wing views. Within the last year, the two communities have bled into each other after finding a common hero: the 45th President of the United States, Donald Trump.

If you find it hard to comprehend how someone can go online with one extreme view and log off with another, an ex-user of the notoriously politically incorrect forum 4Chan describes it best. Under the heading “A warning for young isolated men like myself”, the man known only by his Reddit username, 500ooo, writes:  

“I used to browse 4chan (for about 4 years) back when I was a shutin at college and at home… I was definitely addicted to 4chan … I browsed some of the worst boards there, too … despite conscious efforts, a lot of the garbage did seep its way into my mind. Now, I never believed them that much, but they would still be there, likely influencing me.

“Given enough time in those places, and a lack of sensible human beings to socialize with, one would experience the same thing I did. Slowly your views on things, like women, race, whatever, will start to change.”

The user goes on to claim that white supremacy websites actively “game” subreddits for “loser/virgin/lonely/angry” young men. “The very idea of some naive, confused kid being brainwashed by these people brings my blood to a boil,” he writes. “Being in shitty communities stunts critical thinking and any kind of growth, creating a negative feedback loop.”

This is exactly what Cass Sunstein, the author of #Republic: Divided Democracy in the Age of Social Media, thinks is happening. “A cybercascade occurs when someone says something, and then someone else repeats it, and then someone else does the same, and pretty soon we have a cascade effect, when lots of people think something is true, simply because so many people seem to think it is true – and it might be false,” he says. “It happens every month and probably every week.”

Sunstein’s comments illustrate that online radicalisation is a two-way street. Those who become extremists on little-known forums may have been pushed there first by traditional social media. The brevity and urgency of sites such as Twitter mean that all our views can become extreme caricatures, forcing others to define themselves in opposition. Sunstein argues this has fuelled political polarisation in the US and the UK.

“Members of all demographic groups are vulnerable to radicalisation via the internet,” he says. “If you’re in an echo chamber, or listening to people who agree with you, you’re likely to get more extreme. With echo chambers, self-governance becomes much harder because people end up living in different political universes.”

There are now countless examples of such echo chambers leading to terrorist attacks. Elliot Rodger, who killed six people near the University of California, Santa Barbara, in May 2014, was a regular poster on the woman-hating forum PUAhate, and his manifesto now inspires a new generation of “involuntary celibates” on Reddit. Efforts to tackle this content should be given the same resources as government attempts to curb Islamic radicalisation online.

“Applying the same thresholds to far-right extremist content as Islamist extremist online content is vital in order to stem the flow of far-right propaganda currently surfacing on public platforms,” says Melanie Smith, a researcher at the Institute for Strategic Dialogue, a think tank which works to tackle rising extremism. “I think raising awareness around far-right radicalisation seems like an easy win – the use of appropriate labelling and terminology would be an important first step.”

At the time of writing, there is no conclusive evidence that Alexandre Bissonnette was radicalised online. Many media outlets have run with the fact he “Likes” Donald Trump and Marine Le Pen on Facebook, but he also “Likes” John McCain, “Kindness Matters”, and “International Tom Hanks Day”. “Liking” something doesn’t always mean you actually like it, as many people use the button to follow pages they don’t agree with in order to keep up with the latest news.

Bissonnette’s Facebook “Likes” via Facebook

The way the media has plundered Bissonnette’s Facebook profile – posting pictures of his Halloween costume and trying to glean information from his status updates – could itself fuel online extremism. We create search-engine-optimised articles titled “Who was Bissonnette/Breivik/Rodger?” and construct a simplified narrative of a lone wolf. We then pilfer shooters’ social media pictures and posts, creating instant heroes for anyone with similar views.

Bissonnette now faces six counts of first-degree murder and five of attempted murder, but no terrorism-related charges. 
