
Viral Vigilantes or Snitches? 2021 Was the Year of Mob Moderation on TikTok

Failings in justice systems and online moderation have allowed content creators to take things into their own hands in front of millions. And sometimes they get things wrong.

When Danesh, who lives on the East Coast, came across a video on TikTok of a woman joking about hitting her children with a belt, he had never made a video tracking someone down online before – in fact all of his videos up to this point were about politics. 

“It was so absurd it was doing so well,” he recalls. “I looked her up, and I called in child protective services. And I thought: ‘I’ll make a TikTok about it’”. 


Since then, Danesh, or @thatdaneshguy, has gained hundreds of thousands of new followers by doing what he says is holding people accountable online for their behaviour, and then TikTokking about it, usually revealing or highlighting the offending user’s identity. His videos have made the news, like when he called out a Georgia police officer who joked about refusing to help a prisoner give birth. And Danesh’s content spurs other creators on – some of whom praise him wholeheartedly, and some of whom tell him he is a doxxer, a threat and, most commonly of all, a snitch.

His origin story is not unique. The Great Londini, who brands his movement as “the protectors of the innocent; a positive movement seeking to end racism, bullying, and scamming on social media,” began his account when a friend asked for help tracking down his son’s bully, after the 14-year-old died by suicide. He also has as many critics as he does fans; in fact, neither Danesh nor The Great Londini shares his full name online, because both receive regular death threats.


But what is unique is how popular these accounts have become, how loyal their followings are, and how they have normalised sharing the identity of people they disagree with online, with the expectation of retribution for their wrongdoing. This normalisation has come at a cost: as their popularity has grown, so has the risk of misidentifying someone – and so has the chance of bad actors using the same open source skillset to mobilise against their targets. One person’s doxxer is another person’s freedom fighter, moderator, and even hero.

TikTok’s own community guidelines seem clear, at first glance: “We define doxxing as the act of collecting and publishing personal data or personally identifiable information (PII) for malicious purposes. We consider these online behaviors as forms of abuse and do not allow them on our platform.” But that ‘malicious’ bit jars with many creators, who don’t see sharing personal info as a malevolent act, but a judicial one, holding a perpetrator to account.    

These accounts really are exploding in popularity: Danesh had only 80,000 followers at the beginning of the year, and is now about to hit 700,000. The Great Londini had already reached 2 million followers this summer, after repeated bans for various guideline violations, including harassment, in which he lost hundreds of thousands of followers; now he’s at 6.9 million. Other creators in this space – mostly Americans – have millions of followers too: @tizzyent, who recently called out a woman for featuring a Nazi flag in a TikTok, has 3.9 million, and @auntkaren0, who uses her account to identify racists, posting publicly available information about them, has 1.5 million followers.


The format of their videos is generally similar: creators react to videos they deem unacceptable, which they’ve encountered either on TikTok itself or on another platform.

They’ll then share a process with their followers in which they search for the individual’s identity by sourcing their publicly available information, and attempt to seek out some kind of justice – which can vary wildly from intimidating the user into not harming someone again, to getting them punished at their school or fired from their work. 

In several instances, the offender at first appears to be anonymous. But through basic open source investigative skills, the creators showcase how easy it often is to find such users online.

That’s why, for all the vigilante videos you might see, there’s an awfully high number of videos you never see, because of all the trails that have run dry – all of the people who are truly anonymous online, whom people like Danesh fail to find and therefore never end up making a video about.

Dominic DiFranzo, an assistant professor at Lehigh University in Pennsylvania, sees TikTok’s vigilante videos as the latest iteration of an old phenomenon. “Back in the early 2000s you had Chinese citizens doing online investigations, on Reddit you’ve had different types of witch hunts. [There was] Gamergate on Twitter ten years ago, when a lot of angry video game enthusiasts under the guise of journalism and integrity created entire hate campaigns against people.


“TikTok now is finding themselves inside of it. Another generation finding the same tools. It’s about how you organise collective outrage and upset. There is a certain enjoyment you get from seeing someone get brought down a peg. If you don’t have other means to hold them accountable, like legal channels, this is a mechanism by which you can do that.” 

Cybersecurity expert Julia Slupska agrees that these TikTok creators are the latest in a long line of internet sleuths, but caveats that TikTok has turned it into “a spectator sport.”

“There’s a lot of community moderation at the subreddit level, but where is it on TikTok?” they ask. But maybe these TikTok creators are TikTok’s community moderators, the difference being that while Redditors are often faceless and working tirelessly behind the scenes, TikTokkers record their processes for public consumption and seek to become recognisable faces.

“Any time that the authoritative way for resolving conflict isn’t functioning, people will find alternative ways to do it,” Slupska said. “One example that came to mind is a research project which created a tool for outing people doing sexual harassment on Facebook messenger, looking at shaming as a mechanism.”

The experiment found that this shame-based model of gender justice often left the victim and supporters “entangled with mob-sentiment.” Slupska said: “Justice is subjective. Vigilante justice always has the risk that people take things too far. There’s been cases of people mobilising online in India to track down child sex abusers, but they’ve also channeled Islamophobia or prejudice against different groups.”


This is true of TikTok’s movement, too. Several creators in this space have been criticised – sometimes simply for the act of snitching alone, sometimes because they have misidentified somebody or instigated an internet pile-on when perhaps all the facts weren’t straight.

Danesh thinks that his possible role in community governance could stop immediately if he got something wrong. “If I say or do something wrong or overstep, my community isn’t necessarily going to have my back on that. Neither are other accounts that do this sort of thing. We aren’t modding one particular sub – when people just see something that’s unjust, they want to do something about it.”


Researchers have understood for years that communities build themselves around shared social norms, and retributive harassment can become one such norm. In one 2018 paper, researchers found that “a danger of retributive harassment, and its widespread use, is that marginalised voices will be silenced while socially dominant perspectives are amplified,” and suggested that restorative rather than retributive justice, which gives a voice to both victim and offender, would introduce “mediation, reconciliation, and proportionality” to promote “civil and inclusive participation online by enabling reconciliation at scale.”


Creators like Danesh, who use humour to deride and their online followings to mass report or seek justice, don’t necessarily leave much room for reconciliation.

Danesh agrees with Slupska that creators like him all come to this with their own political values, and that this affects who they crack down on. “You can try to make things as non-political as possible, but it does spill in.” Those he frequently targets tend to sit on the conservative side of the spectrum, especially anti-vaxxers, just as those whom The Great Londini targets have often specifically abused veterans or soldiers, The Great Londini being a veteran himself. This led to his movement going after one anti-military user in particular, because The Great Londini had misidentified her as being employed by an organisation working with veterans – subjecting her to a bombardment of online harassment that she says profoundly affected her happiness. His followers had found her former employer online, mistook it for her current one, and tried to get her fired from it.

Apart from the risk of misidentification, the same open source investigative skillset that affords TikTok creators viral fame has allowed harassers to weaponise those tools to do their own version of moderating.

As VICE World News uncovered last month, misogynists went viral with the #cleaningthestreets trend, in which they sought to rid the app of women they deemed ‘hoes’ and screen-recorded themselves sending women’s videos to their parents. The trend scored millions of views before some of the most popular videos were taken down by TikTok.

That’s why DiFranzo is wary of two things he sees as likely to become more common as online hate campaigns are further normalised. One is that ordinary people, rather than public figures, will be targeted more often – people who do not have the publicists or profile to maintain lines of income, or indeed sanity, throughout the experience. “We start to not see the difference between being mad at JK Rowling for her views and the person down the street who’s said something, and we’ve condensed an entire person’s life and essence into 280 characters that they wrote one time.”

Viral videos like those on TikTok are created quickly and simply with the app’s in-built editing tools, and it is creating video producers faster than any app before it. It’s not just that its algorithm is unpredictable and volatile – it’s that literally anyone can make something good for that algorithm in the first place.

Then they are watermarked with the TikTok logo and reshared everywhere else on the internet – content that can target, and tarnish, in one fell swoop.

DiFranzo warns: “I can see more cross-platform campaigns. I hope we never see giant Gamergates. But I see it being used and weaponised by those who have power, or want power. They’ve seen how to tap into these things for political purposes and that isn’t going away. There are incredible connections – these type of campaigns allow communities that are possibly very separated to unite around hate.”