21 Mar 2024

Exclusive: Hundreds of British celebrities victims of deepfake porn


Hundreds of female British actors, TV stars, musicians, YouTubers and journalists are victims of deepfake pornography, a Channel 4 News investigation being shown tonight has found.

At least 250 British celebrities appear in the deepfake videos, in which their faces are superimposed onto pornography using artificial intelligence.

Channel 4 News is not naming those affected. Channel 4 News contacted more than 40 celebrities, all of whom were unwilling to comment publicly.

Channel 4 News presenter Cathy Newman is among the victims. In her report, Cathy responded to the deepfake video of herself: “It feels like a violation. It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.”

“You can’t unsee that. That’s something that I’ll keep returning to. And just the idea that thousands of women have been manipulated in this way. It feels like an absolutely gross intrusion and violation.

“It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”

The growth of deepfake pornography has been exponential, driven in part by advances in AI technology and by easy-to-access apps available online.

In 2016, researchers identified just a single deepfake porn video online. In the first three quarters of 2023 alone, 143,733 new deepfake porn videos were uploaded online – more than in all previous years combined. It means there are millions of victims worldwide.

The videos are attracting large volumes of views. Independent research by an industry analyst, shared with Channel 4 News, found the 40 most visited deepfake pornography sites received a combined total of 4.2 billion views.

A Channel 4 News analysis of the most visited deepfake websites found almost 4,000 famous individuals were listed – of which 255 were British.

More than 70% of visitors to the top five deepfake porn sites arrived via search engines, such as Google.

Earlier this year explicit deepfake images of Taylor Swift were posted on X, formerly Twitter. They were viewed around 45 million times before the platform took them down.

Facebook and Instagram also reportedly ran ads showing blurred deepfake sexual images of the actress Jenna Ortega when she was just 16. Meta has since removed them.

Elena Michael, a campaigner from the group NotYourPorn, told Channel 4 News: “Platforms are profiting off this kind of content. And not just porn companies, not just deepfake porn companies, social media sites as well. It pushes traffic to their site. It boosts advertising.

“There’s lots of different ways, even having users on platforms that, you know, they may play a different role and they may buy products on their site, but they’re still perpetrating abuse in another part of that role that they play on whatever social media site that’s profiting from that, and that shouldn’t be acceptable.”

Despite the proliferation of deepfake videos targeting celebrities, most of the women targeted are private individuals. Independent research shared with Channel 4 News found hundreds of thousands of images and videos of non-famous people posted on 10 websites in 2023.

Most image creation is done using apps, with the number of so-called ‘undressing’ or ‘nudifying’ apps soaring to more than 200.

Channel 4 News interviewed 31-year-old Sophie Parrish, a mum of two from Merseyside, who discovered deepfake nude images of her had been posted online. She described the impact they had on her life and her family:

“It’s just very violent, very degrading. It’s like women don’t mean anything, we’re just worthless, we’re just a piece of meat. Men can do what they like. I trusted everybody before this. My wall was always down but now I don’t trust anybody.

“My eldest, he’ll say, ‘What did the nasty man do to upset you, mummy? Will you ever tell me?’ Because overnight he watched his mum go from being a happy person to this person who cried most days and got angry very quickly and was just a complete shell of the person she was before.”

Since 31 January this year, sharing deepfake pornography without consent has been an offence under the Online Safety Act, but creating the content is not illegal.

Online safety regulation has been placed in the hands of broadcasting watchdog Ofcom, but consultation is still ongoing as to how the new legislation will be enforced and applied.

Campaigners and legal experts speaking to Channel 4 News criticised the watchdog’s draft guidance as weak because it doesn’t put pressure on big tech platforms that facilitate the hosting and dissemination of deepfake porn.

An Ofcom spokesperson told Channel 4 News: “Illegal deepfake material is deeply disturbing and damaging. Under the Online Safety Act, firms will have to assess the risk of content like this circulating on their services, take steps to stop it appearing and act quickly to remove it when they become aware. Although the rules aren’t yet in force, we are encouraging companies to implement these measures and protect their users now.”

Google, Meta and X declined to be interviewed. In a statement, a Google spokesperson told Channel 4 News: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.

“Under our policies, people can have pages that feature this content and include their likeness removed from Search. And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”

Ryan Daniels from Meta said: “Meta strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images. While this app remains widely available on various app stores, we’ve removed these ads and the accounts behind them.”

The Department for Science, Innovation and Technology told Channel 4 News: “We are protecting women and girls by cracking down on abusers who share, or threaten to share, manipulated or manufactured intimate photos or videos without consent. This includes giving police and prosecutors the powers they need to bring cowards who share these photos and videos to justice with penalties of up to two years’ imprisonment.

“The Online Safety Act has also made sharing deepfake intimate images of another person without consent illegal. Once implemented, the Act will place groundbreaking new duties on social media platforms to stop illegal content being shared on their sites, or they risk facing fines that could reach billions of pounds.”

Got a story? C4Nstories@itn.co.uk