
‘You Feel So Violated’: Streamer QTCinderella Is Speaking Out Against Deepfake Porn Harassment

Image courtesy QTCinderella

The last thing popular Twitch streamer QTCinderella wants to be doing right now, she told me during a call, is giving interviews about her ongoing sexual harassment.

Two weeks ago, Twitch streamer Brandon Ewing, who goes by Atrioc online, inadvertently showed his open browser tabs while doing a live stream in front of some of his 318,000 followers, revealing that he visited a website selling deepfakes—AI-generated, non-consensual pornographic videos—of fellow streamers. Screenshots then circulated, spreading the name of the website, the images, and the names of more than a dozen other female streamers whose likenesses appeared on the site, including QTCinderella.


On January 30, he made a tearful apology live on his Twitch stream, admitting that he’d followed a Pornhub advertisement for deepfake porn to a website where he bought access to those images, and on February 1, he posted a longer apology to Twitter, writing, “My actions have taken me from someone I was proud of, trying to make a positive impact in my community, to a ‘deepfake porn guy.’ The scar of that is felt deeply on my heart.” The creator of the deepfakes, responding to the apology and ensuing discourse, wiped their presence from the web and promised not to return.

It was too late, though—the discourse had reached even right-wing pundits like Ben Shapiro, and for many of the women whose names and images were leaked and spread all over the internet, the repercussions have been horrific. “I think last week was the hardest,” QTCinderella, who is also known as Blaire, told Motherboard. “I’m not fine, whatever fine means. I’m still sad. I think it’s still hard to take in.”

After Ewing’s slip on the livestream, people immediately started sending her non-consensual pornographic images of herself. “I was already getting DMs of the photos and replies in my tweets before I even had a full grasp of what the hell was going on,” she said. “I think a big issue is that he apologized to the stream before he apologized to any of the women. So we were fucking blindsided.”

The harassment has been relentless, she said. “You go to my YouTube comments, you search my name, you do anything. It’s just there.” Many of the other women whose names and images were shared from the deepfake website Ewing visited, including Pokimane and Maya Higa, are also enduring a deluge of harassment and unwanted attention; last week Pokimane spoke about it on stream, addressing comments that she somehow deserved to appear in deepfake pornography because she posts selfies. “It makes no difference what you post or what you do,” she said.

Not only is speaking out about online harassment exhausting, but speaking out about image-based abuse virtually guarantees that the harassment will escalate.

The now-ubiquitous AI-generated porn phenomenon started in late 2017, when Motherboard found an anonymous Redditor’s programming hobby that was quickly going viral: a user named deepfakes posted algorithmically generated images that swapped celebrity women’s faces onto porn performers’ bodies, making it seem like they were nude or appearing in sex scenes that never happened. Since then—and despite still-unfounded fears of political disinformation and media manipulation—deepfakes have primarily been used to harass women online.

Streamers and cosplayers are already frequent targets of misogyny online, but AI-generated sexual harassment takes it to another level. It’s used to slut-shame and smear them, to blackmail them, and it can upend their lives within their own families and real-life communities. Streamers like QTCinderella and others who were targeted in last week’s leak are accustomed to this behavior from people online, but that doesn’t make it easier to cope with.

“It’s just another reminder of the retribution you have to pay in order to be open online,” QTCinderella said. “And the sad thing is, I’ve never been a bitter person until I started streaming and slowly, it consumed me, because it’s just exhausting.” Since seeing the AI-generated images, she’s dealt with body dysmorphia, a resurgence of an eating disorder, and resurfaced trauma from her past, she told me.

“You feel so violated…I was sexually assaulted as a child, and it was the same feeling,” she said. “Like, where you feel guilty, you feel dirty, you feel like, ‘what just happened?’ And it’s bizarre that it makes that resurface. I genuinely didn’t realize it would.”

Targets of sexually explicit, non-consensual deepfakes say that although they know the images are the product of an algorithm, the psychological effect can be as real as if an actual video of them had been leaked. “It is so convincingly my body, but not my body, and holy shit, my body will never be as perfect as that girl’s body,” QTCinderella said. “My body will never be as skinny, I’ll never be as perky as that body. At least in my eyes.”

Her family has also seen the images, she said, but she hasn’t talked to them about it yet. “For my 65-year-old dad, it’d be a hard time explaining to him that that’s not real… I’ve raised tons of money for charity, made community events and tried to highlight people that maybe haven’t had the opportunity to be highlighted. I’ve done so much. But this is what my family now knows as my job.”

Part of the aftermath of this becoming a viral news story is how some commentators have twisted the narrative, she said. Right-wing pundit Ben Shapiro featured QTCinderella’s reaction stream on his Twitch show, using her pain to advance his opinion that all pornography is exploitative. QTCinderella strongly disagrees.

“I am not opposed to sex work. I just don’t want to be a sex worker. That’s it. I think sex work is great. I support sex work, and it’s been sad to see some of these narratives shift… The problem is consent. I don’t want to be the face of anti sex work because I’m very pro sex work. It’s just fucking miserable that people can take whatever they want from you and turn it into whatever they want.”

In a Twitch stream on the same day Ewing apologized, QTCinderella vowed to sue the creator of the deepfakes. Shortly after the content went viral, the person took all of it down from their own page and posted a lengthy apology note to the page where they were selling the content, saying that “after seeing the situation of that couple apologizing and a few streamers’ reactions who thought [I] ‘did not care’, I feel like the total piece of shit I am.”

But the damage was already done; once the images and videos were online, people sharing and reposting them beyond their original source means they will likely remain online forever.

QTCinderella told me that she’s talked to several lawyers about legal recourse, and the consensus is that there’s little to be done at this point. Several states, including California, Virginia, and Texas, have laws against creating and disseminating malicious deepfakes, and most states have penalties for spreading non-consensual pornography. But the costly, time-consuming burden of pursuing legal action against a harasser falls on the victims, and because most of the people sharing abusive images online are doing so anonymously, it can be extremely difficult to get justice.

“The goal, legally, would be to get the website taken down and we’ve already got it taken down,” she said. “It feels like swimming upstream with a harpoon already in you.”

Speaking up about harassment and industry abuses is a double-edged sword for marginalized people online—which is why many women, including some of the most popular online personalities in the world, choose not to comment on the ways harassment infiltrates their daily lives. The spectacle of Ewing’s actions, his apology, and its aftermath has forced the conversation into the open. The shock of it coming from a person who has proclaimed support for women online, and is close friends and colleagues with many of the women affected, has added to the intrigue.

QTCinderella doesn’t want this to be part of her narrative. “But it’s the only option I have to hopefully do something about in the future,” she said. “I really don’t want to talk about it. However, I hope that in 10 years when my nieces are more on the internet, she doesn’t have to deal with something like this just for existing as a woman on the internet. It shouldn’t be a price you pay. It shouldn’t be the fee that you have whenever you log in. That’s not fair.”