The Deep, Dark World Of Fake Porn (Huffington Post)


Read the full article by Madeline Buxton at huffingtonpost.com

Which is worse: having nude or intimate photos of yourself hacked and shared online, or having your face digitally placed on a porn star’s body so it looks as if you’re performing X-rated acts? That’s a rhetorical question, because in both cases, if you haven’t consented, it’s a lose-lose situation. Right now, both of these unsavory scenarios are affecting women in Hollywood, exposing a dark side of the web and the nightmarish ways people are putting new technologies to harmful ends.

In early December, Vice’s Motherboard first reported on a specific kind of porn video distributed on Reddit by a user who went by the name “deepfakes.” Using open-source machine learning algorithms (ones that are freely available to the public), deepfakes created realistic-looking fake porn that placed celebrities’ faces on porn stars’ bodies. Gal Gadot, Daisy Ridley and Margot Robbie were among the most high-profile actresses whose images were used in the films, which continued to proliferate in a subreddit named after the original creator.

The nonconsensual spread of deepfake videos has been compared to 2014’s “The Fappening,” the major hacking and release of celebrities’ nude photos. Besides the medium, there’s one obvious difference: deepfake videos are not actually real (though it can be disturbingly hard to tell), and because they splice together two forms of media (footage of celebrities and footage from real porn videos), their harm extends beyond Hollywood and has the potential to affect any woman with a large number of high-quality photos publicly available online.
