Generative AI technology has been used for the creation of deepfake pornography, which accounted for a whopping 96 percent of all deepfake videos online in 2019.
Deepfakes are generated using a form of artificial intelligence known as deep learning. Deep learning algorithms can swap one person’s face for another’s in a video or an image. Advances in AI have made it harder to distinguish real images from deepfakes, further blurring the line between fact and fiction.
Services offering to alter images of women into nude photos have proliferated. Public figures and ordinary individuals alike have been targeted by fake pornography campaigns. One such instance involved the generative AI app Lensa, which was criticized for allowing its system to create fully nude images from users’ profile headshots. Any woman can be a victim of this synthetic stripping and have her “nude” images shared across multiple platforms online.
A recent investigation by Kat Tenbarge at NBC News shows just how disturbingly pervasive and accessible nonconsensual deepfake porn has become.
You don’t need to go to the dark web or be particularly computer savvy to find deepfake porn. As NBC News found, two of the largest websites that host this content are easily accessible through Google. The website creators use the online chat platform Discord to advertise their wares, and people can pay for it with Visa and Mastercard. Business is booming so much that “two popular deepfake creators are advertising for paid positions to help them create content”.
It comes as no surprise that it’s women who are largely affected by the rise of deepfake porn. A 2019 report by Sensity, a company which detects and monitors deepfakes, found that 96% of deepfakes were non-consensual sexual deepfakes, and of those, 99% featured women. “A creator offered on Discord to make a five-minute deepfake of a ‘personal girl,’ meaning anyone with fewer than 2 million Instagram followers, for $65,” NBC reports.
As far back as 2017, stars such as Scarlett Johansson and Taylor Swift were consulting lawyers when they realised their faces were being digitally morphed onto the bodies of porn stars.
The rise of deepfake porn is already having a massive impact on ordinary people’s lives.
One of the women allegedly targeted by deepfake porn, streamer QTCinderella, spoke out about the toll it had taken on her mental health. “This is what it looks like to feel violated, this is what it looks like to feel taken advantage of,” she said in a 30 January live stream. “This is what it looks like to see yourself naked against your will being spread all over the internet.”
Increasingly, this is the reality of being a woman who is in the public eye even the smallest bit. The rise of AI-generated imagery has taken that harassment to hideous new heights. This isn’t just porn, it’s terrorism; it’s meant to punish and silence women.
A BBC documentary, Deepfake Porn, examined how souped-up versions of the same technology were now being used to turn ordinary women into porn stars. The programme highlighted a growing band of enthusiasts (nearly always men) whose 'hobby' was uploading images of work colleagues, even family members, for digital manipulation into porn.
The documentary featured Dina, who lives in Brighton and previously worked in the video gaming industry. She was sent extremely explicit pornographic movies in which she seemed to be the 'star'.
'Who would do this, and why? Does someone hate me? Or think this is funny?' she asks.
The content had been created by a work colleague, someone she regarded as a friend. The discovery left her shellshocked.
Earlier this year, in a controversy that caught global attention, a Twitch streamer called Brandon “Atrioc” Ewing admitted to buying and watching deepfake porn of his female colleagues.
After the Twitch controversy, for example, Google searches for deepfake porn boomed. The commercial application of synthetic media is likely to keep growing, given its popularity on social media and across the internet.
Lawmakers and technology companies should start treating nonconsensual deepfake porn as the emergency it is and hold its creators and facilitators to account. Otherwise, a lot of lives are going to be ruined.