
Science & Tech
2 February 2018 (updated 24 Jun 2021, 12:26pm)

Welcome to the age of ‘deepfake’ porn: Your starring role in a sex film is just a few selfies away

In the future everyone will be an involuntary AI-enabled porn star for 15 minutes.

By Mic Wright

Remember that time you filmed a gang bang scene? Or the time you had a threesome with two well-endowed men who, as the porn parlance has it, ran a train on you? You don’t? It doesn’t matter. You don’t need to physically be present anymore to be shanghaied into an involuntary career as a porn star.

Thanks to an easy-to-use face-mapping app called FakeApp, Reddit and the rest of the internet are awash with clips of practically any (usually female) celebrity you can think of, engaged in a cornucopia of curious sex acts. What’s particularly unsettling here — beyond the obvious complete lack of consent — is the popularity of female celebrities who first found fame as children. Emma Watson and Maisie Williams are “faves” on the most popular subreddits dedicated to the ‘pretend’ porn.

The celebrity angle is tailor-made to have screenshots of these videos plastered all over the websites of The Sun and MailOnline, all dressed up as public interest (aka the public is interested in masturbating over celebrities they otherwise wouldn’t see having sex). But the bigger issue will come as these technologies are picked up by the kind of people for whom revenge porn has long been an attractive weapon.

For you to be turned into a machine-learning-enabled porn performer, starring in clips posted to every porn site on the web, an ex won’t need images of you naked. Every selfie you’ve ever posted will be enough. A few hundred photos and any one of us could be convincingly cast in truly unsettling and upsetting scenes. How do you explain to your employer that you didn’t film a sex tape when there’s a clip that convincingly shows your face mapped on to the body of a porn star who bears at least a reasonable resemblance to you?

The law has only just caught up with selfie culture and the pervasive nature of sexting, revenge porn and smartphones in every single person’s hand. Now, legislators will need to get their heads around the new implications for image rights. Your hypothetical disgruntled ex will own the copyright in photos of you that they took, but you own your image rights.


The difference, though, between you and me and the celebrities who will now be spinning up their legal teams to issue takedown notices against “deepfake” dirty videos of them, is that we don’t have those resources. It’s extremely difficult to get videos pulled down, and they spread across sites with frightening speed.

It’s probably time for the latest update to Andy Warhol’s famous aphorism: in the future — the very near future — everyone will be an involuntary AI-enabled porn star for 15 minutes. But the trouble is, those faked clips will hang around for far longer than 15 minutes, and it’ll take a lot, lot longer for you to explain that while it looks like you and sounds like you, it really isn’t. And people wonder why I can’t be bothered to binge on Black Mirror…
