You can buy an image of an algorithmically generated nude photo of a woman, standing like she’s posing for a mugshot, for $1. If that’s your thing.
A startup called These Nudes Do Not Exist thinks this could be a groundbreaking evolution in the world of porn. It offers images of women who, like these cats, these people, these feet, and these diverse stock models, do not exist in the real world.
“We knew that we had the technical expertise and self-funding capabilities to succeed in bringing GAN technology to nude content, and wouldn’t find ourselves in a crowded space with a lot of competition, and as such the opportunity was pretty attractive,” one of the founders of These Nudes Do Not Exist (TNDNE), who requested to remain anonymous because he and his partner didn’t want to be publicly associated with their own creation, told me.
To buy an image of one of these algorithmically generated women, you can smash a button that says “Generate New Girl” and cycle through a series of bodies that all look strikingly similar: white, with bland smiles, average-sized breasts, and slim builds. When you land on one you like, you add her to your cart and are issued a “seed” number, which TNDNE says is your proof that you own that specific model.
“I think this is probably the first chance that anyone in the world has ever had to buy AI generated pornographic content, so in a sense each customer gets to be a part of porn and AI history,” the co-founder said. “Long term however, the goal is to have complete 3D rendered AI generated models capable of creating custom photo and video content.”
It’s worth questioning why a still image of a shirtless woman from the torso up is considered “pornographic.” TNDNE says it plans to expand to include more poses and video options, eventually, and that the pictures currently for sale “mostly serve as a novelty.”
To create the nudes, the website uses Generative Adversarial Networks (GANs), a machine learning technique in which two neural networks are trained against each other on lots and lots of images—in this case, nudes—so that the generator network learns to produce new, novel versions of what it sees as “nude women.”
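For a rough sense of how that technique works, here is a minimal, purely illustrative sketch in PyTorch: a generator that turns a random latent vector into an image, and a discriminator that tries to tell generated images from real training images. This is not TNDNE’s code or architecture, and the seed value, layer sizes, and image dimensions below are assumptions for the sketch; the “seed” number the site issues most plausibly just pins down the random vector fed to a generator like this.

```python
import torch
import torch.nn as nn

LATENT_DIM = 128          # size of the random input vector (assumed for this sketch)
IMG_PIXELS = 64 * 64 * 3  # toy 64x64 RGB output, flattened

# The generator maps random noise to an image-shaped tensor.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 512),
    nn.ReLU(),
    nn.Linear(512, IMG_PIXELS),
    nn.Tanh(),  # pixel values in [-1, 1]
)

# The discriminator scores how "real" an image looks.
discriminator = nn.Sequential(
    nn.Linear(IMG_PIXELS, 512),
    nn.LeakyReLU(0.2),
    nn.Linear(512, 1),
    nn.Sigmoid(),  # estimated probability the input came from the training set
)

# During training the two networks play a minimax game: the discriminator
# learns to separate real training images from fakes, and the generator
# learns to fool it.

# A fixed seed makes the latent vector, and therefore the generated image,
# reproducible. A hypothetical seed value stands in here for whatever
# number a buyer is actually issued.
torch.manual_seed(42)
z = torch.randn(1, LATENT_DIM)          # random latent vector
fake_image = generator(z)               # untrained here, so the output is noise
realness = discriminator(fake_image)    # discriminator's guess that it's real
print(fake_image.shape, realness.item())
```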
TNDNE’s co-founder wouldn’t tell me what specific datasets the algorithm is trained on, but did say that the current database is entirely women, mostly 20-40 years old, and white.
“That wasn’t because of any choice on our part so much as it was just because that’s how the well classified datasets ended up shaking out,” he said. “We were very careful to use only public domain or purchased data sources from reputable providers. While we will add men in the future, the truth is there’s not a lot of demand for male nude pictures.”
Not seeing high demand for male photos is the same reasoning the creator of DeepNude, an app that algorithmically undressed images of women, gave for only generating non-consensual nudes of women’s bodies.
Many of the nude and pornographic images and datasets found online, even those marked as public domain, are stolen from actual sex workers. People steal sex workers’ content all the time, posting it to tube sites for free or dumping it into database links. Even big tech companies struggle with this: IBM got into trouble for scraping people’s personal Flickr images marked for Creative Commons use, and Microsoft took down MS Celeb, the world’s largest public facial recognition dataset, after reports revealed it consisted of photos of people used without their consent.
For nude images in particular, machine learning engineers struggle to find datasets for training. Some scrape Reddit or Pornhub to get the images, but NSFW images on both sites frequently include non-consensual imagery, even when it’s posted from an account made to look like the person depicted is sharing it themselves. In short, this is a really, really tough undertaking—one that TNDNE seems to think it can tackle with some reverse-image searching.
“The verification process for public domain [images] centers around running public domain data through reverse image searches,” the co-founder said. “If we notice that the results are from paywalled/monetized websites, revenge porn websites, online forums, or behind paywalls, we err on the side of caution and exclude that data since it may not have been gathered ethically.”
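The company didn’t describe any actual tooling, but the check it describes amounts to a domain filter on reverse-image-search results. Here is a hypothetical sketch of what that could look like, assuming a reverse_image_search() helper that returns the URLs where a photo appears online; no such standard library exists, and the domain list and function names below are invented for illustration. Commercial services such as TinEye or Google’s Vision API are the kind of thing that would fill that role.

```python
from urllib.parse import urlparse

# Domains that suggest an image may have been monetized or posted
# non-consensually. These entries are hypothetical placeholders;
# a real list would be far longer and curated.
SUSPECT_DOMAINS = {
    "examplepaywallsite.com",    # stand-in for a paywalled/monetized site
    "examplerevengeporn.net",    # stand-in for a revenge porn site
    "exampleforum.org",          # stand-in for an online forum
}

def reverse_image_search(image_path: str) -> list[str]:
    """Stand-in for a commercial reverse-image-search service.

    Returns the URLs where the image was found online. Purely
    illustrative; TNDNE did not describe its actual tooling.
    """
    raise NotImplementedError("plug in a real reverse-image-search service here")

def looks_ethically_sourced(image_path: str) -> bool:
    """Err on the side of caution: exclude any image whose reverse-search
    results point at paywalled, monetized, forum, or revenge-porn domains."""
    for url in reverse_image_search(image_path):
        host = urlparse(url).netloc.lower()
        if any(host.endswith(domain) for domain in SUSPECT_DOMAINS):
            return False
    return True
```

Even under that generous sketch, a domain filter only tells you where an image has been reposted, not who put it there or why.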
This would mean individually searching and examining the source of every photo—and still not being sure if the images are from someone who is who they say they are, and is giving consent for the photo to be online, let alone to be included in a dataset to train a computer to churn out uncanny-valley algorithmic women.
At best, TNDNE says it guarantees that every woman in the set is uniquely not a woman who exists. At worst, we’re continuing down the seemingly-limitless path of men experimenting on women’s bodies as startup fodder.
These nudes may not exist, but I’m still not sure why this startup needs to exist, either. More algorithmic humans populating the internet can’t solve issues of diversity or non-consensual porn. If anything, the site invites the same criticism as Melody, the 3D-generated hentai cam girl: If you want nudes, why not pay for a quality custom photo or clip from a real, human model?
Update 8:55 a.m.: While thesenudesdonotexist.com was generating different women when this piece was written, as of publication the site doesn’t generate more than one person per session for purchase.