Read the full article by Jackie Snow at FastCompany.com

AI-powered face-swapping technology made waves last year when it was used to insert the faces of actresses like Daisy Ridley and Gal Gadot into pornographic clips. Experts have called out these so-called “deepfake” videos, and others like them, as non-consensual and an invasion of privacy, and have raised concerns about a new era of revenge porn targeting exes or classmates. Now an adult film company wants to use similar tech to offer custom, consensual deepfakes.

The tool can do more than blend faces. Individuals or couples could be inserted into a scene with their favorite actress or actor. People with physical limitations could place themselves in sexual situations that would be impossible in real life. While some porn companies have come out against non-consensual deepfakes, Andreas Hronopoulos, the CEO of Naughty America, said the tech has a lot of potential for the industry and shouldn’t be discounted, provided the clips are used for personal satisfaction, not someone else’s humiliation.

“Deepfakes don’t hurt people, people using deepfakes hurt people,” Hronopoulos said.

The world is still figuring out what to do about the risks that come with deepfakes. Sites like Reddit and Twitter have banned face-swapped porn. The Defense Advanced Research Projects Agency (DARPA) is working on a way to detect manipulated video. Adult industry lawyers could work with adult sites to take down deepfake clips on copyright grounds, as they already do in an ongoing battle against pirated clips. (Naughty America says it will pursue takedown requests on deepfakes that use its intellectual property.) But with the technology only getting better and no surefire way to automatically spot it so far, the dire predictions of a future with no privacy and faked videos undermining democracy make Naughty America’s product seem downright wholesome in comparison.
Jackie Snow is a multimedia journalist whose work has appeared in National Geographic, Vanity Fair, The Atlantic, Quartz, the New York Observer, and more.