Anyone who’s spent a moment observing US political chatter in recent months has probably encountered the following prediction: 2024 will yield the world’s first deepfake election. Rapidly evolving generative AI video and audio tools have already been used by both the Donald Trump and Ron DeSantis presidential campaigns to smear each other, and fakes of current President Joe Biden seem to proliferate on a regular basis. Nervous lawmakers, possibly worried their faces may also soon wind up sucked into the AI-generated quagmire, have rushed to propose more than a dozen bills trying to rein in deepfakes at the state and federal levels.


But fingernail-chewing lawmakers are late to the party. Deepfakes targeting politicians may seem new, but AI-generated pornography, which still makes up the overwhelming majority of nonconsensual deepfakes, has tormented thousands of women for over half a decade, their stories often buried beneath the surface of mainstream concerns. A group of deepfake victims are attempting to lift that veil by recounting their trauma, and the steps they’ve taken to fight back against their aggressors, in a shocking new documentary called Another Body.

One of the women targeted, Gibi, a popular ASMR streamer with 280,000 followers on Twitch, spoke with Gizmodo about her decision to publicly acknowledge the sexual deepfakes made of her. She hopes her platform can help shine a spotlight on the too-often overlooked issue.

“I think we spend most of our time convincing ourselves that it’s not a big deal and that there are worse things in the world,” Gibi said in an interview with Gizmodo. “That’s kind of how I get by with my day, is just you can’t let it bother you, so you convince yourself that it’s not that bad.”

“Hearing from other people that it is that bad is a mix of emotions,” she added. “It’s a little bit relieving and also a little bit scary.”

Gibi is one of several women in the film who recount their experiences after finding deepfakes of themselves. The documentary, directed by filmmakers Sophie Compton and Reuben Hamlyn, largely follows the life of an engineering college student named Taylor who discovered deepfake pornography of herself circulating online. Taylor isn’t the student’s real name. In fact, all appearances of Taylor and another deepfake victim in the documentary are themselves deepfake videos, created to conceal the women’s true identities.

Image: Another Body

The 22-year-old student discovers the deepfake after receiving a chilling Facebook message from a friend who says, “I’m really sorry but I think you need to see this.” A PornHub link follows.

Taylor doesn’t believe the message at first and wonders if her friend’s account was hacked. She ultimately decides to click the link and is confronted with her own face, grafted into hardcore pornography, staring back at her. Taylor later learns that someone pulled images of her face from her social media accounts and ran them through an AI model to make her appear in six deepfaked sex videos. Making matters worse, the culprit behind the videos posted them to a PornHub profile impersonating her, with her real name, college, and hometown listed.

The film, devastatingly horrific at times, lays bare the trauma and helplessness victims of deepfakes are forced to endure when confronted with sexualized depictions of themselves. While most conversations and media coverage of deepfakes focus on celebrities or high-profile individuals in the public eye, Another Body illustrates a troubling reality: deepfake technology powered by increasingly capable and easy-to-access generative AI models means everyone’s face is up for grabs, regardless of fame.

Rather than conclude on a grim note, the film spends the majority of its time following Taylor as she unravels clues about her deepfakes. Eventually, she learns of another girl at her school targeted by similar nonconsensual deepfakes. The two then dive deep into 4chan and other hotbeds of deepfake depravity to find any clues they can to unmask their tormentor. It’s during that descent into the depths of the deepfake underground that Taylor stumbles across faked images of Gibi, the Twitch streamer.

Twitch streamer speaks out

Gibi, speaking with Gizmodo, said she’s been on the receiving end of so many deepfake videos at this point she can’t even recall when she spotted the first one.

“It all just melds together,” she said.

As a streamer, Gibi has long faced a slew of harassment beginning with sexualized text messages and the more-than-occasional dick pic. Deepfakes, she said, were gradually added into the mix as the technology evolved.

In the beginning, she says, the fakes weren’t all that sophisticated but the quality quickly evolved and “started looking more and more real.”

But even obviously faked videos still manage to fool some. Gibi says she was amazed to hear of people she knew falling for the crude, hastily thrown-together early images of her. In some cases, the streamer says advertisers have severed ties with other creators altogether because they believed the creators were appearing in pornography when they weren’t.

“She was like, ‘That’s not me,’” Gibi said of her friend who lost advertiser support due to a deepfake.

Gibi says her interactions with Taylor partially inspired her to release a YouTube video titled “Speaking out against deep fakes,” in which she opened up about her experiences on the receiving end of AI-manipulated media. The video, posted last year, has since attracted nearly half a million views.

“Talking about it just meant that it was going to be more eyes on it and be giving it a bigger audience,” Gibi said. “I knew that my strength lay more in the public sphere, posting online and talking about difficult topics and being truthful.”

When Gibi decided to open up about the issue, she says she initially avoided reading the comments, not knowing how people would react. Thankfully, the responses were overwhelmingly positive. Now, she hopes her involvement in the documentary can draw even more eyeballs to potential legislative solutions to prevent or punish sexual deepfakes, an issue that’s taken a backseat to political deepfake legislation in recent months. Speaking with Gizmodo, Gibi said she was optimistic about the public’s renewed interest in deepfakes but expressed some annoyance that the brightened spotlight only arrived after the issue started impacting more male-dominated areas.

“Men are both the offenders and the consumers and then also the people that we feel like we have to appeal to change anything,” Gibi said. “So that’s frustrating.”

Those frustrations were echoed by EndTAB founder Adam Dodge, who also makes several appearances in Another Body. An attorney who has worked in gender-based violence for 15 years, Dodge said he founded EndTAB to empower victim service providers and educate leaders about the threats posed by technology used to carry out harassment. Taylor, the college student featured in the film, reached out to Dodge for advice after she discovered her own deepfakes.

Speaking with Gizmodo, Dodge said it’s important to acknowledge that online harassment isn’t really new. AI and other emerging technologies are simply amplifying an existing problem.

“People have been using nude images of victims to harass or exert power and control over them or humiliate them for a long time,” Dodge said. “This is just a new way that people are able to do it.”

Deepfakes have altered the equation, Dodge notes, in one crucial way: victims no longer need to have intimate images of themselves online to be targeted. Simply having publicly available photos on Instagram or a college website is enough.

“We’re all potential victims now because all they need is a picture of our face,” Dodge said.

Even though his organization is primarily intended for training purposes, Dodge says victims have sought him out for help because he was one of the few people trying to raise awareness of the harms early on. That’s how he met Taylor.

Speaking with Gizmodo, Dodge expressed similar frustrations with the scope of some emerging deepfake legislation. Even though the overwhelming majority of deepfakes posted online involve nonconsensual pornography of women, Dodge estimates around half of the bills he’s seen proposed focus instead on election integrity.

“I think that’s because violence against women is an issue that is never given proper attention, is consistently subverted in favor of other narratives, and legislators and politicians have been focused on deepfake misinformation that would target the political sphere because it is an issue that affects them personally,” he said. “Really, what we’re talking about is a privilege issue.”

Deepfakes are consuming the internet

Sexual deepfakes are proliferating at an astounding clip. An independent researcher speaking with Wired this week estimates some 244,625 videos have been uploaded to the top 35 deepfake porn websites over the past seven years. Nearly half of those (113,000) were uploaded during the first nine months of this year. Driving home the point, the researcher estimates more deepfaked videos will be uploaded by the end of 2023 than in all previous years combined. And that doesn’t even include deepfakes that may exist on social media or in a creator’s personal collection.

“There has been significant growth in the availability of AI tools for creating deepfake nonconsensual pornographic imagery, and an increase in demand for this type of content on pornography platforms and illicit online networks,” Monash University Associate Professor Asher Flynn said in an interview with Wired. “This is only likely to increase with new generative AI tools.”

Disheartening as all of that may sound, lawmakers are actively working on potential solutions. Around half a dozen states have already passed legislation criminalizing the creation and sharing of sexualized deepfakes without an individual’s consent. In New York, a recently passed law making it illegal to disseminate or circulate sexually explicit images of someone generated by artificial intelligence takes effect in December. Violators could face up to a year in prison.

“My bill sends a strong message that New York won’t tolerate this form of abuse,” state senator Michelle Hinchey, the bill’s author, recently told Hudson Valley One. “Victims will rightfully get their day in court.”

Elsewhere, lawmakers on the federal level are pressuring AI companies to create digital watermarks that will clearly disclose to the public when media has been altered using their programs. Some major companies involved in the AI race, like OpenAI, Microsoft, and Google, have voluntarily agreed to work towards a clear watermarking system. Still, Dodge says detection efforts and watermarking only go so far. Pornographic deepfakes, he notes, are devastatingly harmful and create lasting trauma even when everyone knows they are fake.

Even with nonconsensual deepfakes poised to skyrocket in the near future, Dodge remains shockingly and reassuringly optimistic. Lawmakers, he said, seem willing to learn from their past mistakes.

“I still think we’re very early and we’re seeing it get legislated. We’re seeing people talk about it,” Dodge said. “Unlike with social media being around for a decade and [lawmakers] not really doing enough to protect people from harassment and abuse on their platform, this is an area where people are pretty interested in addressing it across all platforms, whether legislative, law enforcement, tech, [or] society at large.”
