Deepfakes Were Created As a Way to Own Women’s Bodies—We Can’t Forget That

Illustration: Frau mit Maus

This story appears in VICE magazine and Broadly's 2018 Privacy and Perception Photo Issue.

When we first found the Reddit user "Deepfakes" (a portmanteau of "deep learning," a branch of artificial intelligence, and "fakes"), he was in a celebrity doppelgänger porn forum, quietly going about his hobby: creating the videos that would eventually take on his name and become a symbol of truth's deterioration.


There he was, out in the virtual wide open, diligently cranking out fictional footage of women through a machine-learning meat grinder and posting it to public porn forums. And holy shit, did it look believable.

For the blissfully uninitiated, deepfakes are AI-generated videos in which one person's face is attached to someone else's body. They began as Frankenporn: mash-ups of celebrity faces and porn performers' bodies, yielding fairly realistic videos of sex that never happened.
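The face swap behind those videos is widely reported to rest on a simple autoencoder trick: one shared encoder learns a representation common to both faces, and each person gets their own decoder. Encode a frame of person A, decode it with person B's decoder, and out comes B's face wearing A's pose and expression. Below is a minimal, illustrative PyTorch sketch of that architecture; the layer sizes, 64×64 crop resolution, and hyperparameters are placeholders of my own, not FakeApp's actual configuration.

```python
# Minimal sketch of the shared-encoder face-swap autoencoder.
# Assumes aligned 64x64 face crops; all sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(64, 128, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
            nn.Conv2d(128, 256, 5, stride=2, padding=2), nn.LeakyReLU(0.1),
        )
    def forward(self, x):          # x: (N, 3, 64, 64) face crops
        return self.net(x)         # -> (N, 256, 8, 8) latent

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.1),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Sigmoid(),
        )
    def forward(self, z):          # latent -> (N, 3, 64, 64) image
        return self.net(z)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity

loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=5e-5,
)

def train_step(faces_a, faces_b):
    # Each decoder reconstructs its own identity through the *shared*
    # encoder, nudging the latent space to encode pose and expression
    # rather than identity.
    opt.zero_grad()
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()
    return loss.item()

# The swap: encode a frame of person A, decode with B's decoder.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)        # stand-in for a real crop
    fake_b = decoder_b(encoder(frame_a))
```

In a real pipeline, each swapped crop would then be blended back into the source video frame by frame, which is what made the results so believable in motion.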

Deepfakes first posted his creations to Reddit toward the end of 2017, and by December, my editors at Motherboard, VICE's science and tech outlet, and I had tracked him down and published the first in-depth article on the phenomenon. By mid-January 2018, people in deepfakes-focused subreddits had built a user-friendly application called FakeApp for running the algorithm the process requires. Soon it seemed as if every fake-porn enthusiast was making their own AI-generated sex tapes, with varying success.
