TikTok is one of the most popular iOS apps in the world right now. Hundreds of millions of users, many of them teenagers or children in the United States, use it to upload and browse lip sync videos and memes. TikTok has repeatedly been held up as a space somehow free from the vitriol and danger of other social media networks. This week, the New York Times described it as “the only good app,” it was named one of the Google Play Store’s most “fun” apps, and The Verge recently called it “joyful.”
But Motherboard has found a vibrant community of users on TikTok who appear to be soliciting explicit images of boys and girls, and some young users have complained on the platform about other people repeatedly asking them for nudes.
“(guy) looking for vids and nudes, and add me to groups I share vids,” the bio of one TikTok user found by Motherboard reads.
Chinese-made TikTok lets users record a video of themselves lip syncing along to music or movies. The app itself works in a similar way to Instagram or the late Vine, with a timeline that users flick through, a search function, and a list of popular hashtags that users can follow.
In some videos posted to the platform and seen by Motherboard, users who appear to be children mention that people continually ask them for nude images. The creator of one of those videos implies they are 13 years old.
In a statement, a TikTok spokesperson told Motherboard “such behavior is not only abhorrent, it is prohibited on TikTok. We have a number of measures in place today to protect against misuse. TikTok doesn’t permit images or videos to be sent in comments or messages, and users can make their account private, block another user, report an account or content, and disable the ability to receive messages. In addition, our moderation team removes inappropriate content and terminates accounts that violate our Terms of Service and Community Guidelines.”
Got a tip? You can contact Joseph Cox securely on Signal on +44 20 8133 5190, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.
While a number of hashtags that have been associated with trading nude images on other social media platforms return no content on TikTok, suggesting the company may be curbing their use, it has failed to moderate others. One of those, ‘#tradefortrade’, led to a profile with a video of a man stroking himself suggestively. A comment on that post led to another profile which, judging by its username, appears to be focused primarily on trading nude images. One of that user’s videos asks for nudes, and also includes audio of what appears to be someone masturbating. In turn, the accounts this user follows reveal a larger community of accounts centered around sharing nude images.
“Girl trade pics (boy or girl) dm me 🍆 💦” the bio of one profile reads.
TikTok does not allow users to send videos or images via direct message, so people soliciting nudes will likely move to another app after making contact.
“If you have a 🍆 trade with me on snap chat dm me and i go live everyday,” the bio of another account named “lets trade boys” reads.
These accounts are not isolated. Some have hundreds of fans, TikTok’s equivalent of followers. Some of the accounts that explicitly ask for nude images also follow a slew of normal accounts that appear to be run by young children or teenagers.
It is not possible to tell whether these accounts are run by underage users or adults. Other usernames imply the users could be young, such as “16tradehmu,” but Motherboard was unable to confirm this; it could be someone pretending to be younger in order to solicit images. Because some of these people could be sharing illegal images of child pornography with other users, Motherboard did not engage with them.
Technically, TikTok does not allow people under 13 years of age to use its platform, according to the company’s Terms of Service, but many users appearing in videos are clearly younger.
TikTok was previously known as Musical.ly. In June of last year, a Fresno man was charged with sexual exploitation of children through Musical.ly and a number of other apps.
Motherboard also found examples of doxing and harassment of high-profile users on the platform.
ByteDance, the company that owns TikTok, recently said it would increase its number of content moderators from 6,000 to 10,000.