Over the past 24 hours, an outpouring of viral news clips, YouTube videos, and social media posts has claimed that Osama bin Laden’s 2002 “Letter to America” is going viral with America’s youth. And the blame is being laid squarely on TikTok, which is owned by Chinese company ByteDance. “This is an onslaught, this is all very intentional, and this is designed to destroy our country from within, and they are winning,” said Fox News contributor Charlie Hurt.
It seems, on its surface, like a shocking true story that Says Something About Society Today. The conversation now goes like this: Has the education system failed? Obviously. Is China trying to undermine America? Goes without saying. Should the U.S. government unilaterally ban the most popular social media app in the country, where millions of people say and do just as many diverse things every day—some of them unwise—in the name of national security? Well, if they don’t, then terrorism wins. In a seeming panic, The Guardian deleted its version of the letter, which had been hosted online by the news organization for 20 years.
There’s just one problem with all of this: the trend didn’t actually go viral, but the commentary claiming it had certainly did. The extreme response to this incident and others—such as claims that TikTok is intentionally promoting pro-Palestinian sentiments—shows that we’re now truly in the fevered depths of a TikTok moral panic, based on a century-old and long-debunked theory about how people interact with media, mixed with political opportunism. We desperately need a balm.
Early reports, such as those by 404 Media and The New Yorker, indicated that the actual number of videos on TikTok discussing bin Laden’s letter was quite small. The U.K.-based Institute for Strategic Dialogue (ISD) published an analysis this week that said the group’s researchers had found 41 “Letter to America” videos on TikTok, which collectively gained just under 7 million views. This is an unimpressive number of views for even a single viral TikTok video, let alone dozens. A viral TikTok from this week of a travel mug allegedly surviving a car fire, which many people reading this likely haven’t seen, has over 30 million views.
A hurricane of discourse nonetheless followed, and spread more widely on other platforms. Posts mentioning bin Laden gained 719 million impressions on X (formerly Twitter), and the full text of the letter was found in posts there as well as on Facebook. The ISD wrote that ultimately “researchers found that the platforms were ill equipped to deal with a surge of content violating their own policies.”
The panic over the bin Laden letter closely aligns with the furor over how pro-Palestine content has spread on TikTok in recent weeks. Content that supports Palestine is far more popular on the platform, compared to videos that support Israel. In similar fashion to this week’s fracas, pundits and various groups have claimed that TikTok is purposefully spreading pro-Palestinian content in order to undermine the U.S. government, which unequivocally supports Israel. Calls to ban TikTok have been made as a result, including by U.S. lawmakers. Republican Senator Josh Hawley referred to “TikTok’s power to radically distort the world picture that America’s young people encounter.” Republican Rep. Mike Gallagher claimed in an op-ed that China was “brainwashing” Gen Z.
Common to all of these claims is a reliance on a century-old media theory called the Hypodermic Needle Theory. This theory gained traction in the late 1920s and through the 1930s, and claimed that propaganda, and media in general, is an all-powerful force that can simply inject a passive audience with a message and have them act on it. This may well have been appealing around the time when mass media was first entering the scene—the first commercial radio broadcast occurred in 1920—but by the 1940s it had already been debunked by other social scientists, who conducted studies and realized that, actually, different people have different reactions to mass media, even propaganda. Often, it’s simply not effective. This is really basic stuff—I learned about it in a first-year media studies course in university.
Still, we can see echoes of the Hypodermic Needle Theory in claims that effectively say China is beaming terrorism directly into the brains of America’s newly zombified youth. As a serious idea it’s not up to the task, but when paired with political opportunism to take another shot at America’s supposed enemies or shore up its geopolitical interests, it can still be a decent enough cudgel.
TikTok itself has taken strides to try to debunk the recent claims, and indirectly challenge this outdated idea of how people consume media. In a blog post this week, the company stated that “Attitudes among young people skewed toward Palestine long before TikTok existed,” and pointed to polls that it said proved its point (the data for Gen Z is less clear than that for millennials). It also stated that its algorithm does not push one side or another, but rather is designed to promote content that people are already engaging with. Moreover, it has millions of users in the Middle East and Southeast Asia, which the company said accounted for a large number of views on pro-Palestinian hashtags. After the claims that “Letter to America” was going viral on its platform, TikTok pointed out that the actual number of videos was small and that discussion of the letter was not unique to its platform.
Regardless, TikTok began heavy-handedly moderating and censoring videos that mention bin Laden or the letter in any context, whether they praise it or simply discuss the supposed trend. And that is all we seem to have gotten from this recent panic: Censorship. No greater understanding of America’s youth, China, Israel-Palestine, TikTok, or the internet. Just censorship.
This does not mean that there aren’t issues worth discussing and addressing with regards to social media platforms’ algorithms, and what kinds of content they reward. Companies profit from the content that goes viral on their platforms, and the public should rightly scrutinize those mechanisms. Indeed, companies have responded to this scrutiny with internal systems to de-amplify or demonetize certain content, such as extremism.
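TikTok has never published its ranking system, so any concrete example here is purely illustrative. But the general shape of the mechanism described above—promote whatever users already engage with, and de-amplify flagged content rather than removing it—can be sketched in a few lines of Python. All of the weights, field names, and the de-amplification factor below are hypothetical, invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    likes: int
    shares: int
    watch_time_sec: float  # average watch time per view
    flagged: bool = False  # e.g. marked as borderline content

def score(v: Video) -> float:
    """Toy engagement score: a weighted sum of interaction signals.
    Note that the topic of the video never enters the calculation."""
    s = v.likes * 1.0 + v.shares * 3.0 + v.watch_time_sec * 0.5
    if v.flagged:
        s *= 0.1  # de-amplify flagged content instead of removing it
    return s

def rank(candidates: list[Video]) -> list[Video]:
    """Order a candidate feed by engagement, most engaging first."""
    return sorted(candidates, key=score, reverse=True)

feed = rank([
    Video("mug survives car fire", likes=900_000, shares=40_000, watch_time_sec=25),
    Video("news explainer", likes=50_000, shares=2_000, watch_time_sec=40),
    Video("flagged clip", likes=800_000, shares=60_000, watch_time_sec=30, flagged=True),
])
print([v.title for v in feed])
```

The point of the sketch is that a system like this has no lever marked “push this viewpoint”: it only sees engagement numbers, which is consistent with—though it does not prove—TikTok’s claim that its algorithm amplifies whatever people are already interacting with.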
But engaging with media is never a one-way street—or a hypodermic needle. Pervasive moderation systems have resulted in YouTubers and TikTokers speaking in their own bespoke language designed to evade automated, keyword-based tools. And with or without algorithms, people will hold their beliefs and promote them with whatever methods are available. Some of those views may be odious, even dangerous. They may be liberatory and good. They may be earnest, or disingenuous. They may have come from a book, a friend, or a video on the internet. They may be held by a small number of fanatics, or the majority of a population. But they exist, and an algorithm didn’t create them.