Apple’s flawed adult content filters are blocking innocuous sites while letting actual pornography through.
iPhone and Mac users have discovered that a feature allowing parents to protect their children from adult content behaves erratically, despite years of repeated complaints to Apple.
A Twitter user who goes by Steven Shen found that, after he turned on his iPhone’s adult content filters, sites featuring the word “Asian” were inexplicably blocked.
VICE World News tested a bunch of other words and found that the filter frequently misfires, in an apparent pattern.
The software tends to block sites and searches featuring words including “ebony,” “daddy,” “massage,” “babe,” “hardcore,” “teen,” and “amateur”: innocent terms that take on sexual meanings only in the context of pornography. They are among Pornhub’s most searched-for terms.
At the same time, the filter fails to catch some of the most popular sources of porn, including the adult sections of Reddit and popular regional porn sites like ThisAV, which is frequented by users in Japan, Taiwan, and Hong Kong, according to the web ranking service Alexa.
OnlyFans, a social media platform where people sell photos of themselves, often nudes, also slips through, although it requires a subscription to view content.
Some words that are more closely associated with sex are not restricted, such as “sadomasochism,” “threesome,” “jerk off,” and “wank.” And Apple’s interpretation of “adult content” doesn’t seem to include violent imagery. Searches for “how to make a bomb,” “snuff film,” and “murder footage” are all unrestricted under the content restriction mode on iPhones and Macs.
The flaws suggest that Apple is not applying the same technology to parental controls that it uses in many other applications. Such technology, often based on artificial intelligence and machine learning, allows a program to detect objects in images and analyze language in context. Companies such as Google and Facebook use similar tools to block nudity on their sites.
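For comparison, here is a minimal sketch of the kind of learned text classifier such companies are said to use, built with the open-source scikit-learn library. The training pages and labels are invented for illustration and bear no relation to any real system; with so few samples the output is only illustrative.

```python
# Minimal sketch of a learned text classifier for page filtering,
# in the spirit of the machine-learning approaches described above.
# The training data and labels here are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled samples: 1 = adult content, 0 = safe.
pages = [
    "free hardcore adult videos, explicit scenes",         # adult
    "asian fusion restaurant menu and opening hours",      # safe
    "teen study tips for high school exams",               # safe
    "amateur photography club weekly meetup",              # safe
    "live cams, explicit nude performers, 18+ only",       # adult
    "sports massage therapy clinic, book an appointment",  # safe
]
labels = [1, 0, 0, 0, 1, 0]

# TF-IDF features plus logistic regression: the model weighs words
# in context rather than banning any single keyword outright.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(pages, labels)

# A page mentioning "asian" or "massage" in a benign context should
# score low, because the surrounding words pull the prediction
# toward the safe class.
print(model.predict_proba(["asian massage therapy for athletes"])[:, 1])
```

The point of the design is that no single word decides the outcome; the classifier weighs all the words on a page together, which is what lets benign uses of “asian” or “massage” pass.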
Apple did not respond to requests for comment.
Li Xiaofan, assistant professor of information systems and analytics at the National University of Singapore, told VICE World News that Apple’s filtering method is ineffectual and unsophisticated.
“My guess is that they trained an algorithm by providing it samples of pages [that] should be censored and samples of pages [that] should not be censored. Then the algorithm learned the keywords associated with pages [that] should be censored. However, the sample size may not be large or representative enough and the algorithm itself could be ill-designed,” Li said.
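A minimal sketch of the naive keyword matching Li describes, with an invented blocklist standing in for whatever Apple actually learned, shows how such a filter misfires in both directions:

```python
# A deliberately naive keyword filter of the kind Li describes: it
# blocks any page whose text contains a blocklisted keyword,
# regardless of context. The blocklist below is invented for
# illustration and is not Apple's actual list.
BLOCKLIST = {"asian", "teen", "amateur", "massage", "hardcore", "babe"}

def is_blocked(page_text: str) -> bool:
    """Return True if any blocklisted keyword appears in the page text."""
    words = page_text.lower().split()
    return any(word.strip(".,!?") in BLOCKLIST for word in words)

# Innocuous pages misfire because the keywords carry no context:
print(is_blocked("Asian American history month events"))  # True (wrongly blocked)
print(is_blocked("Sports massage for marathon runners"))  # True (wrongly blocked)

# Meanwhile an adult site that avoids those exact words slips through:
print(is_blocked("Explicit videos, subscribers only"))    # False (wrongly allowed)
```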
The feature has not noticeably improved since users began complaining about it as early as 2014. In 2018, the sex education site O.school reported on its inconsistency and listed a number of glaring omissions; the same sites are still either wrongly blocked or let through.
Chris McKenna, founder of the internet safety advocacy group Protect Young Eyes, said that the Apple feature is sorely lacking.
“I would say that most parental control solutions have certain things that they struggle with,” McKenna told VICE World News. “And iOS is the perfect example where Apple has essentially made it difficult for parental control solutions to do the job that parents want them to do.”
But McKenna said that no parental control features can replace actual parenting.
“There is a definite responsibility that parents have to protect their children online. But this is a shared responsibility with organizations that have endless resources and are not fulfilling their half of this bargain,” he said.
“All children need to know that they can land safely and softly with you as a parent. And then you use parental controls in full transparency and honesty with your kids… That digital trust gets built when you have that good balance between relational and technical solutions,” he said.