Facebook’s biggest UK political party is no more. The social media site has banned Britain First, the fringe far-right political party, which, despite having no elected MPs, MEPs or even councillors, amassed more than two million Likes on its page.
The ban is the most visible sign to date that social networks are keen to be seen taking action against extremist content, amid a political backlash against the tech giants from countries across Europe and the US itself.
It follows a similar ban of the party’s leaders from Twitter earlier this year, after President Donald Trump retweeted anti-Muslim videos posted by Britain First’s deputy leader.
Facebook’s move takes out one of the most powerful distribution channels for anti-Muslim content online. The page used quite sophisticated social media strategies to spread its message, posting inoffensive patriotic imagery – support our armed forces; oppose animal cruelty – to reach a wide audience, while throwing more explicit anti-Muslim posts into the mix.
This blend of content was itself dangerous, serving to normalise anti-Muslim views among a huge audience of casual Facebook users, many of whom were older adults. Last year, we analysed more than one million Likes on Britain First posts – about six weeks’ worth – for BuzzFeed News, finding that, while relying on a hardcore of several hundred users, the page worked successfully to reach a large pool of casual viewers, some of whom would likely be unaware of the group’s motivations.
This made the public Britain First page a powerful tool not only for reaching potentially sympathetic would-be recruits, but also for generating an active core membership – a power Facebook clearly recognised with its decision to ban the group.
But we shouldn’t be fooled into thinking we can tackle the rise of populism with a scattergun technological fix. The social pressures behind the popularity of such groups won’t change, and so, without clear policies, Facebook risks political incoherence and accusations of censorship.
Britain First and the material it posts have been extensively covered in the mainstream media for the past 18 months, yet the party was allowed to keep posting. Facebook must explain why such posts were considered acceptable over this period, before suddenly becoming unacceptable now. The far right is talented at exploiting “censorship” to its own advantage, claiming it is speaking the truths that those in power do not want to hear.
That doesn’t mean the group should have been allowed to continue on Facebook, but it does mean the limits on speech on each social network should be set out clearly and in detail.
This is particularly important because Britain First’s Facebook presence was just the most visible part of a far-right Facebook ecosystem – the nastiest content is much harder to see, hidden away in closed groups which admit new users by invitation only.
Because such groups – which often go by names such as “NO SHARIA LAW” or similar – are hidden, it is much harder to track their activity and their membership, but they number in the hundreds and some have thousands or hundreds of thousands of members. While Britain First might be the visible portion of anti-Muslim Facebook content, it’s these groups that likely pose the larger challenge, especially as their content is not in the open where it can be challenged.
Going further, tackling the public groups helps disrupt the feeds of users who could be radicalised into becoming active members of the far right, but could serve to further radicalise those already within the private groups. There is a delicate balancing act to be struck, and one which serves to show how important Facebook now is in public policy debates: in practical terms, a US technology company is now more influential than government policy when it comes to online extremism.
It will be a relief to many that Britain First content won’t pollute their feeds any longer – but the ban also highlights how much power we have delegated, how much Facebook can shape our rules, and how far tech is running ahead of our laws and our own social decisions.
Banning Britain First from Facebook might be a move many of us like – but we shouldn’t rely on big tech to save us from populism, and its accompanying tide of racism. These are conversations we should be having – and battles we should be fighting – as a society.