
Science & Tech
30 March 2021, updated 21 Sep 2021, 6:13am

Are just 12 people responsible for spreading more than half of all Covid-19 vaccine disinformation?

The “Disinformation Dozen” are prolific in the online anti-vaxx movement – but Facebook and Twitter don’t appear to be trying to suppress them.

By Sarah Manavis

When we think about disinformation – the deliberate dissemination of false or misleading information – we tend to view it as a large-scale problem. We believe it must be an insidious movement involving millions of people, all inventing falsehoods, rather than something spread by a small number of committed individuals. 

Disinformation – and the misinformation that follows it, false claims shared by people who do not realise they are false – is often regarded as a problem that is out of control because it is so widespread. That problem became yet more apparent and unavoidable during the pandemic, when anti-vaxx misinformation about Covid-19 vaccines turned mainstream overnight.

But a new report from the Centre for Countering Digital Hate (CCDH) suggests the source of vaccine disinformation is much more contained than we might assume. After analysing 812,000 anti-vaxx posts on Facebook and Twitter in February and March this year, the CCDH found that 65 per cent came from the same 12 accounts, dubbed the “Disinformation Dozen”: a group of highly influential anti-vaxxers operating predominantly on Facebook, Twitter and Instagram.

Most of the 12 accounts have long been involved in peddling pseudoscience, and have built whole business empires around alternative medicine. They include Joseph Mercola, an “anti-vaccine entrepreneur” who markets dietary supplements and medical equipment and has more than 1.7 million followers on Facebook. He has received numerous warnings from the US Food and Drug Administration (FDA) for falsely advertising that his products “are intended for use in the cure, mitigation, treatment, or prevention of diseases”. Another is Robert F Kennedy Jr, a nephew of John F Kennedy, who has a well-documented history of pushing anti-vaxx theories.

[see also: How anti-vaxxers capitalised on coronavirus conspiracy theories]

The other accounts include long-standing conspiracy theorists and anti-vaxx-focused physicians. Some are in personal and professional partnerships with each other, and have connected social media profiles. Others, such as Rizza Islam – who encourages vaccine hesitancy among an African-American audience – adopt a more targeted approach.

These influencers are popular and well known in the anti-vaccination world, but few have been removed from Facebook, Instagram or Twitter. Only three have been permanently banned from any of the platforms, and in each case from just one site, meaning they can still operate freely on the other two.


Why, then, have Facebook and Twitter not removed this small but powerful handful of easily identifiable individuals?

The two platforms say they are committed to removing coronavirus misinformation from their sites and have adopted formal policies of permanently banning accounts that repeatedly share this kind of content. And yet, over the past year, the Disinformation Dozen have only seen their online followings grow. The CCDH also found that both Facebook and Twitter failed to take down around 95 per cent of the coronavirus misinformation reported to them.

“Facebook, Google and Twitter have put policies in place to prevent the spread of vaccine misinformation yet, to date, all have failed to satisfactorily enforce those policies,” Imran Ahmed, the CEO of the CCDH, said. “All have been particularly ineffective at removing harmful and dangerous misinformation about coronavirus vaccines, though the scale of misinformation on Facebook, and thus the impact of their failure, is larger.”

Some platforms are more culpable than others. On Facebook, the CCDH found, the Disinformation Dozen accounted for 73 per cent of all coronavirus misinformation; on Twitter, the figure was 17 per cent. (This is still a significant amount of influence for just 12 people.) The CCDH also noted that Instagram not only failed to remove most anti-vaxx content, but in some cases actively recommended further vaccine misinformation to users.

With more and more studies highlighting the efficacy of coronavirus vaccines in preventing severe illness and death, Ahmed emphasised the critical importance of addressing vaccine misinformation and disinformation. “Our recovery depends on the public’s willingness to receive a vaccine,” he said. “Misinformation disseminated via social media is [increasingly linked to] vaccine hesitancy, which will ultimately cause unnecessary deaths.”

[see also: How vaccine hesitancy could undermine the UK’s success]

