
Man Arrested for Uncensoring Japanese Porn With AI in First Deepfake Case

Deepfake technology can effectively reverse the pixelation in Japanese adult videos, raising legal and ethical questions.
The man reportedly made about $96,000 by selling over 10,000 manipulated videos.

Photo: Shutterstock

Japanese police on Monday arrested a 43-year-old man for using artificial intelligence to effectively unblur pixelated porn videos, in the first criminal case in the country involving the exploitative use of the powerful technology.

Masayuki Nakamoto, who runs his own website in the southern prefecture of Hyogo, lifted images of porn stars from Japanese adult videos and doctored them with the same method used to create realistic face swaps in deepfake videos. 


But instead of swapping faces, Nakamoto used machine-learning software to reconstruct the blurred parts of the videos from a large set of uncensored nude images, then sold the content online. Penises and vaginas are pixelated in Japanese porn because an obscenity law forbids explicit depictions of genitalia.
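For readers curious about the mechanics: what the article describes amounts to image-to-image reconstruction, in which a model is trained on pairs of intact and artificially pixelated pictures and learns to guess what the mosaic hides. The sketch below is a generic, minimal illustration of that idea in PyTorch, using random tensors in place of any real dataset; the network, loss, and training loop are assumptions for demonstration, not the suspect's actual pipeline.

```python
# Minimal sketch: training an image-to-image model to invert mosaic pixelation.
# Illustrative toy only; dataset, network size, and settings are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

def pixelate(images: torch.Tensor, block: int = 8) -> torch.Tensor:
    """Apply a mosaic by downsampling, then upsampling with nearest neighbor."""
    _, _, h, w = images.shape
    small = F.interpolate(images, size=(h // block, w // block), mode="area")
    return F.interpolate(small, size=(h, w), mode="nearest")

class DepixelNet(nn.Module):
    """Tiny convolutional network mapping pixelated images to guesses of the originals."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

model = DepixelNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in training data: random tensors in place of a real image dataset.
for step in range(100):
    originals = torch.rand(16, 3, 64, 64)       # ground-truth images
    mosaicked = pixelate(originals)              # censored versions
    reconstructed = model(mosaicked)
    loss = F.l1_loss(reconstructed, originals)   # pixel-wise reconstruction error
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The crucial caveat is that pixelation destroys the underlying information, so any "reconstruction" the model produces is a statistical guess based on its training data, which is why the output is a deepfake rather than a restoration.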

Nakamoto reportedly made about 11 million yen ($96,000) by selling over 10,000 manipulated videos, though he was arrested specifically for selling 10 fake photos at about 2,300 yen ($20) each.

Nakamoto pleaded guilty to charges of copyright violation and displaying obscene images and said he did it for money, according to NHK. He was caught when police conducted a “cyber patrol,” the Japanese broadcaster reported.

Photo-realistic images created using AI are increasingly common and have raised many legal and ethical questions concerning privacy, sexual exploitation, copyright, and artistic expression.

“This is the first case in Japan where police have caught an AI user,” Daisuke Sueyoshi, a lawyer who’s tried cybercrime cases, told VICE World News. “At the moment, there’s no law criminalizing the use of AI to make such images.”

For example, Nakamoto was not charged with any offenses for violating the privacy of the actors in the videos.

Globally, victims of doctored videos, often women, and governments are grappling with a proliferation of deepfakes. In Taiwan, a man was also arrested on Monday for selling deepfake porn on a Telegram group with some 6,000 members. Taiwanese President Tsai Ing-wen called the crime "online sex violence" and said she would consider legislation against it.


“These victims could be people you and I care about. They could be our families and friends. We can’t sit on the sidelines,” Tsai said in a Facebook post.

Tsai also linked the technology to the threat that fake videos and disinformation pose to democracy.

The potential of using deepfakes to sow mistrust and manipulate public opinion was demonstrated as early as 2018, when a viral video showed former U.S. President Barack Obama calling his successor Donald Trump a “total and complete dipshit.” 

The next year, California banned political deepfakes within 60 days of an election to combat potential campaign misinformation.

So far, as in Nakamoto's case, deepfake technology has been used overwhelmingly to create fake pornographic videos.

According to Sensity, a startup that offers fake video detection services, 96 percent of deepfake videos depicted nonconsensual pornography in 2019. In fact, the Reddit users credited with propelling deepfakes into the mainstream used this technology to swap female celebrities’ faces into porn videos. Such use has led to numerous cases of victims fighting to remove compromising fake videos from the internet.

In India, a gang allegedly blackmailed people by threatening to send deepfake videos of them to their families. 

Sueyoshi said criminalizing the use of deepfake software or similar technology is not the right answer to the problem, as the tools themselves could be used for legitimate purposes.

“Using AI to lift mosaics isn’t what’s wrong. It’s how the suspect Nakamoto used AI,” he said. 

But given cases of copyright infringement and violations of personal privacy, Sueyoshi said it was necessary to introduce laws that restrict how, not whether, deepfake technology is used.

Follow Hanako Montgomery on Twitter and Instagram.