An investigation by VICE News has uncovered a coordinated campaign to pay Russian TikTok influencers to post videos pushing pro-Kremlin narratives about the war in Ukraine.
Numerous campaigns have been coordinated through a secret Telegram channel that directs the influencers on what to say, where to film their videos, which hashtags to use, and exactly when to post.
These campaigns were launched at the beginning of the invasion and have involved a number of the highest-profile influencers on TikTok, some of whom have over a million followers.
And even though TikTok has banned new uploads from users located inside Russia, the campaigns have not stopped.
The Telegram channel is run by an anonymous administrator who recruits social media influencers and told VICE News he was a journalist. He lays out the requirements for each campaign, such as a minimum number of views and the date and time the video needs to be posted, and he asks potential recruits to name how much money they want per post. It remains unknown who is paying for the campaigns.
On Wednesday evening, the administrator advertised a new campaign seeking TikTok users to post videos calling for national unity, set to an audio track in which Putin urges all ethnic groups in Russia to unite in this time of conflict.
The message on Telegram told creators what audio track to use, which emojis to use, and what text to post on their videos.
The channel’s administrator also gave potential contributors a step-by-step guide on how to circumvent TikTok’s ban on uploads from Russian accounts.
The details about the campaign, as well as all other content from the channel, were deleted Wednesday night after VICE News contacted the administrator. However, the job was advertised long enough for a TikTok influencer called d00zenn, who has 480,000 followers, to post a video that matched the requirements perfectly. It’s unclear how much, if anything, d00zenn was paid for taking part in the campaign, and he didn’t respond to VICE News’ request for comment.
The Telegram channel that was coordinating the campaigns was set up late last year, and had amassed over 500 members before it was abruptly shut down on Wednesday.
The channel’s administrator has, in recent months, sought TikTok and Instagram influencers to star in campaigns for betting companies, smoking rooms, and student financial assistance companies, according to a review of the jobs posted to the account.
There have also been a number of campaigns that appear to align with Russian government objectives, including campaigns supporting COVID-19 vaccines, the Russian economy, and Russian Winter Olympians.
But in recent weeks, the channel has offered several campaigns designed to promote a pro-Russian message about Putin’s invasion of Ukraine.
One of these campaigns, first spotted by Ukrainian photographer Christina Magonova and documented in a video posted on her Instagram channel, featured a series of prominent influencers all reading the same script. The script attempts to excuse the war in Ukraine by promoting the falsehood that Ukraine perpetrated a genocide against Russian speakers in the Donbas region over the last eight years.
This aligns with the narrative being pushed by the Kremlin that Kyiv was carrying out a “genocide” against the Russian-speaking population in Donbas, which Putin used as a justification for the invasion of Ukraine.
When contacted about her video, Magonova spoke briefly to VICE News via online messaging, but said that she was on the Ukrainian border, having just left her home in Dnipro “because Ukraine is not safe anymore” and she would no longer have internet access.
The campaign used hashtags like “Come on World” and “We don’t abandon our people,” and among the influencers who took part were Viktoriya Fomina (1.8 million followers), Fentazi90 (1.2 million followers), and Roldozzer and Kirill Felix (1.4 million followers each). Again, it is unclear how much, if anything, these creators were paid, and none of them responded to VICE News’ requests for comment via email or their social media accounts.
Most of the videos have now been taken down—something the people orchestrating the campaign asked the influencers to do—but many still remain on the platform and have racked up hundreds of thousands of views.
Another campaign saw TikTok stars asked to post videos using a “mirror reflection” effect, where one side is labeled as Russia and the other as Donbas. The videos are set to the soundtrack of a song called “Brother for Brother.” The directions posted in the Telegram channel told the influencers to beat their chests with their fists and lip-sync the line from the song: “We don’t leave our own.”
As Russia has struggled to win the information war against Ukraine’s social media-savvy President Volodymyr Zelenskyy, it has turned to new avenues to disseminate its disinformation, including pretending to be a left-leaning progressive media outlet based in Germany, producing fake fact-checks of non-existent Ukrainian disinformation, and coordinated TikTok campaigns.
But it is impossible to tell if the campaign VICE News discovered is the tip of the iceberg, or a one-off. The scale and the success of any disinformation campaign being waged on TikTok by the Kremlin—or anyone else—is unknown, because the company will not give journalists or researchers the type of access they need in order to fully assess how widespread the problem is.
“Doing such research would be quite difficult on TikTok as such research is often based on identifying patterns of behavior among large clusters of accounts, as opposed to standalone accounts, and this would only really be feasible or scalable with an [application programming interface],” Ciarán O’Connor, a disinformation researcher at the Institute for Strategic Dialogue told VICE News. “Compared to other social media platforms, TikTok’s API does not offer researchers much assistance or capability.”
While TikTok makes it very difficult to track coordinated campaigns, one way this can be done is by seeing all the videos that have used the same song. Days after the “Brother for Brother” campaign launched, over 1,000 videos featuring the song appeared on TikTok, according to a post on Russian blogging site TJournal.
TikTok did not comment on the coordinated campaigns on its platform when asked by VICE News, and because of a lack of transparency on the network, it’s difficult to assess how successful these campaigns were.
But it’s clear that the campaigns were unsophisticated compared to the types of campaigns we’ve seen on Twitter and Facebook in recent years.
“If it’s an organized disinfo campaign, it’s a bad one,” Abbie Richards, a misinformation researcher who studies TikTok, told VICE News. “Anyone with any experience should know better than to give a bunch of people with a lot of followers the exact same script, you’re gonna have overlap.”
And the campaign has been noticed among other Russian TikTok users, a number of whom have posted videos using the same hashtags or audio tracks to criticize the influencers who they believe have sold out. One Russian TikTok user blasted those who posted the videos, saying: “These people, taking into account the rise in prices, sold themselves for a loaf of bread.”
The amount of money being paid to the influencers is not revealed in the Telegram channel. In fact, when applying to take part in a campaign, users are asked to name their price, and only the “most profitable” are picked.
Estimates from other Russian influencers put the payments at anywhere from 2,000 rubles to 20,000 rubles, which at current exchange rates works out to as little as $17 per post. “As far as I know, they paid a little, up to 20,000 rubles,” one Russian TikTok creator, whose identity is not being published to protect them, told VICE News. “The condition was to quickly post the video in one day. This task was thrown into the group in the Telegram.”
The anonymous administrator of the channel denied to VICE News that they had any link to the channel, even though their Telegram username was repeatedly referenced in posts there. They asked how VICE News got access to the channel, and when VICE News would not reveal the source, the administrator stopped communicating and deleted all content from the channel.
TikTok’s community guidelines specifically say that users should not “engage in coordinated inauthentic behavior such as the use of multiple accounts to exert influence and sway public opinion while misleading individuals, our community, or our systems about the account’s identity, location, relationships, popularity, or purpose.”
When asked specific questions about the campaigns above, TikTok did not respond, but sent a boilerplate statement about how the platform has continued to “respond to the war in Ukraine with increased safety and security resources to detect emerging threats and remove harmful misinformation.”
The company also did not respond to questions about how much war-related content it had removed since Russia invaded Ukraine on Feb. 24.
But it’s clear, both from the huge amount of misinformation circulating on the platform—some of it with millions of views—and the lack of transparency about its moderation process, that TikTok was caught completely off guard by Russia’s war on Ukraine.
The company has finally gotten around to labeling some of the prominent Russian state media accounts, though those labels only show up on the mobile app, not on the desktop. TikTok told VICE News it would be rolling out those labels on desktop shortly.
But research O’Connor conducted into those labels found large gaps in TikTok’s labeling, including the failure to label Margarita Simonyan, the editor in chief of Russia’s state-linked news broadcaster RT. O’Connor found that 12 videos Simonyan posted, in which she promoted Kremlin propaganda and claimed Ukraine was the aggressor, were viewed 21.3 million times as of March 8.
The Wall Street Journal reported Wednesday that content moderators at TikTok were left without detailed instructions on how to deal with war-related content, and low-level managers had to improvise on the fly. The result was that similar content was treated differently depending on which content moderator reviewed it.
And TikTok’s design leaves it open to unique types of misinformation being shared widely on the app. According to Richards’ own research, users have been using audio from old events and matching it with video footage from Ukraine to create a hybrid that tricks viewers into thinking it’s real footage from the current war.
One video matched shaky footage of someone running away from their balcony with audio from the 2020 Beirut port explosion and presented it as something that had just happened in Ukraine. It racked up almost six million views in 12 hours and TikTok did nothing to stop it from going viral. The video has now been removed but it’s unclear if the creator deleted it or if TikTok took it down.
Another inherent problem for the platform, and for the researchers trying to track disinformation spreading across it, is that many of the accounts posting footage about the war are completely anonymous.
“There’s no finding out who’s behind that and all you can do is guess their intent,” Richards said. “Any information about the account is completely up to them as far as what they volunteer and so much of these videos are just coming from anonymous accounts.”
TikTok did not respond to a question about whether it was planning to open up its API, as Twitter and Facebook have, to allow researchers to track larger groups of accounts to spot coordinated campaigns.
As a result, researchers focusing on TikTok are effectively working in the dark, making discoveries by manually scrolling through videos, trying to track trending topics, and grinding through hundreds of videos to try and spot patterns on the platform.
“To some extent, TikTok is playing catch-up. Facebook and Twitter have had far more years of hosting political content or seeing conflict in other countries but at this point, [TikTok] is a huge, huge company that’s worth a lot of money and they could absolutely be doing a better job,” Richards said. “The fact that it’s getting this out of hand … speaks to the fact that they aren’t doing a good enough job of containing it. They have a lot of room to improve.”
Greg Walters contributed to this report.