Reddit delivered a serious blow to creators of deepfakes with a ban on the primary subreddit for AI-assisted face-swapped videos and images — a community over 90,000 users strong at the time of its demise. In that regard, Reddit is late to the party. Platforms such as Discord, Gfycat, Pornhub, and even Twitter have already made their anti-face-swap porn policies clear. But while Reddit has finally removed the offending content, the damage is done. Pandora’s box is already open, and there’s no stuffing this technology back in.
The morality and the legality of deepfakes are murky issues. Swapping one person’s face onto another’s body is not inherently malicious, but the practice isn’t just used for creating digital stunt doubles. Most deepfakes are pornographic in nature, with users replacing the faces of porn stars with their favorite celebrities. But the tech also enables potential abuse for anyone who puts their face online, who could then end up appearing to star in porn against their will.
Despite having created a new genre of non-consensual fake porn, however, many deepfakes creators believe that their videos aren’t harmful to the people they portray. This curious cognitive dissonance runs rampant throughout the community. While some users on Reddit have vehemently disavowed the practice, likening the pornographic insertion of people’s faces to something like a digital assault, others drew the line at the use of popular political figures like Michelle Obama. Still others argued that celebrities are fair game, while smaller YouTubers and civilians are not. A post on the now-deleted subreddit that debated whether or not it’s okay to deepfake your crush was met with significant pushback; another post compared the technology to an episode of Black Mirror. “The general consensus in the comments was that they had some qualms with it, but let their sex drive take over their ethical questions,” the original poster later told The Verge over DM.
A post in the now-deleted subreddit from user Gravity_Horse, addressed to anyone who “opposes” deepfakes and the community at large, acknowledges that the practice is morally questionable. “To those who condemn the practices of this community, we sympathize with you,” Gravity_Horse wrote. “What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on.”
But in an email with The Verge, Gravity_Horse adds that the goal of most deepfakes creators is not to harm or defame anyone. “I’m sorry for those who are hurt or betrayed by these creations. But there’s something important to be said here,” Gravity_Horse says. “The technology is only becoming more and more advanced … People are going to be scared. And I genuinely sympathize with them. But since the technology can’t be uninvented, we have to advance with it. I’m far more a proponent of the deepfakes algorithm itself and its potential rather than what it’s currently being used for. But then again, welcome to the internet.”
But the argument that fake porn is not harmful is a flawed one, and claiming that it’s benign because it’s “not real” is a glib self-justification. Sociologist Katherine Cross, who has contributed to The Verge previously, points to the community’s lack of consensus on what is fair to fake — Emma Watson and Michelle Obama, versus your crush down the street — as proof that the practice is more than just an exaggerated fantasy.
“[Deepfake users] understand, intuitively, that this is more real than they want to admit,” she tells The Verge via email. “If it’s all totally harmless and essentially unreal, they wouldn’t mind putting together deepfake porn of people they know. But of course, they do, and it’s because they understand the symbolism of all this. What are the semiotics of a woman, in a pornographic frame, on her knees [giving] a blowjob and why does it make you so uncomfortable to put your mother or sister in that role? Why, then, do it to a woman you don’t know?”
The problem for Cross isn’t with porn itself, but the consequences of forcing someone into such a scenario, authentic or not. “It’s a sort of cheat code for getting around your own morality,” she says. “[They say,] ‘I can hurt people because it’s just a game, and it’s not real hurt.’ But of course, at the same time, you really want to do the thing that’s hurting people because it’s real enough to give you pleasure. Pornographic deepfakes don’t merely exist as abstract art projects; they’re masturbatory aids, to put it bluntly. If they’re real enough for these men to get off, they’re real enough for the people they depict to lodge objections. They have, in a real way, been dragooned into something they didn’t want to do, or wouldn’t do.”