TikTok, like most online platforms, has long had issues with content moderation. From violent extremism to hate speech to deepfakes that place prominent figures in defamatory scenarios, it hasn’t been all sunshine and roses for the 21st century’s latest major social network behemoth.
In 2020, the platform fell into hot water after its attempts to moderate cyberbullying briefly restricted LGBTQ+ content. Worse still, reporting from The Intercept found that human moderators filtered out videos featuring people considered “ugly”, “poor” or disabled in order to create an “aspirational” feel on the For You page.
Now, TikTok is facing its latest moderation challenge: anonymous accounts promising easy access to disposable vapes, as a “black market” burgeons on the platform.
On TikTok, accounts using the names of prominent manufacturers like Shenzhen Hanqingda Technology Co – which produces the branded vape “HQD” – can be seen uploading videos shot in factories: flavourings being inserted into tubes, batteries installed, branded labels stuck on. On one profile, video descriptions link to external websites that let users bulk-buy up to 300 vapes at a time, promising to ship anywhere in the world.
TikTok’s drug policy bans the depiction, promotion or trade of drugs and other controlled substances on the platform, but the social media giant is struggling to enforce it. The various accounts have hefty followings in the hundreds of thousands and often go viral. Multiple back-up accounts exist, too, so that when one is taken down the others live on.
Though it’s hard to pin down the actual number of vape-advertising videos that may exist (searching any term related to “vape” is now blocked), VICE counted dozens of HQD-branded sub-accounts with followings ranging from zero to the hundreds of thousands.
According to experts, part of the reason TikTok struggles to curb the promotion of vapes on its platform is that it relies so heavily on users reporting each other’s content instead of taking a proactive approach itself.
“TikTok emphasises automated methods, stating [in their policy] that it then ‘enables our team to focus more time on reviewing contextual or nuanced content, such as hate speech, bullying and harassment, and misinformation’,” Andrew Childs, a criminology and criminal justice lecturer at Griffith University, told VICE.
On top of that, users can easily work around the platform’s policy by disguising their content as educational and omitting text that might point to direct advertising.
“When we’re discussing whether or not content breaches community guidelines it’s also important to recognise that some content will forever exist in an ambiguous ‘grey-zone’,” said Childs.
“When thinking about vaping content on TikTok, for instance, there is a wide variety of content including #vapetricks, where users demonstrate a smoking trick. Some content creators might also use #vape because it is part of a skit.”
Disposable vapes themselves have been banned in Australia since October 1, 2021, after they were classified as a Schedule 7 Dangerous Poison under the national Poisons Standard. Despite a thriving market in corner stores across the country, it is technically illegal for vapes to be imported, bought or sold without a doctor’s prescription.
While police have signalled a crackdown on the illegal sale of vapes by Australian retailers over the last year, platforms like TikTok and Instagram – each largely dependent on AI and machine learning to find breaches of their terms of use – have proved fertile ground for a number of black markets.
“Social media by its nature involves huge amounts of user-generated content,” James Martin, a senior criminology lecturer at Deakin University, told VICE.
“While there may be human moderators, they can’t review everything that’s posted, so you rely increasingly on AI and those automated technologies to detect content that may violate those policies, and of course those tools aren’t foolproof.”
Martin points to the growing savviness of users, who constantly navigate the platforms with “work-arounds”, as a significant barrier.
“It’s a constant game of cat and mouse,” he said.
Similar issues have emerged in the US and UK markets. For an international platform like TikTok, adopted by young people around the world, this isn’t surprising. In the US, the Food and Drug Administration (FDA) cracked down on the illegal online sale of e-cigarettes, specifically Juuls, in 2020. But other products, like Puff Bar, have been widely adopted in their place.
It’s these workarounds, Childs says, that will make it hard for platforms to ever successfully moderate their content.
“Every technology platform struggles in this regard, and often there are really only Band-Aid solutions applied,” he said.
“For example, in the case of this vape distributor’s account, it may well be the case that it is removed next week, but in the meantime, they’ve generated sales by advertising via TikTok (and if they’re successful vendors they’ll have repeat customers just go straight to the website), and they could easily create a new account again to gain followers.”
While TikTok’s policy is definitive when it comes to the depiction of vaping, a TikTok spokesperson told VICE that the company will continue to invest at scale to detect and remove content that violates its community guidelines.
Follow Julie Fenwick on Twitter and Instagram.
Read more from VICE Australia and subscribe to our weekly newsletter, This Week Online.