
The past year has been a wake-up call about the prevalence and sophistication of deepfakes. Be it the fake porn created using Taylor Swift's likeness that spread across social media, or the deepfake audio of Sadiq Khan speaking about the war in Gaza, AI-generated content is becoming more convincing, and more dangerous. With elections looming in both the US and the UK, the threat such content poses to our democracy feels more tangible than ever (deepfakes of Joe Biden and Donald Trump are everywhere, and both Rishi Sunak and Keir Starmer have already been targeted).
Politicians and global celebrities are the people we most often warn are at risk from deepfakes. But another demographic is being targeted more than any other: social media influencers, particularly women. When the social media agency Twicsy surveyed more than 22,000 influencer accounts on Twitch, TikTok, Instagram, YouTube and Twitter/X in March, it found that 84 per cent had been the victim of deepfake pornography at least once (89 per cent of the deepfakes found were of female influencers). These weren't small accounts: each had a five-figure follower count. And in the space of just one month, some of these deepfakes had received more than 100 million views.