
YouTube is Working With Met Police to Take Down Rap and Drill Videos


For London’s rap and drill artists, releasing music happens under the watchful eye of the Metropolitan Police.

According to new data obtained via Freedom of Information (FOI) laws, the London-based force referred 510 music videos to YouTube for removal in 2021. In 96.7 percent of cases, the clips were taken down.


This marks a significant increase on previous years. In 2020, 125 referrals were made, resulting in 124 removals; the year before that, 110 videos were referred and 107 removed. The 2021 figures represent a year-on-year increase of almost 300 percent. (A separate report from the New York Times puts the 2020 removal figure at 319, which still amounts to a significant 60 percent year-on-year increase.)

In the past, it was the punitive conditions imposed on individual artists that were most likely to make headlines: Skengdo & AM were given suspended sentences for performing “Attempted 1.0” in 2018, while Digga D has to supply police with lyrics and visuals before putting out new material.

But the FOI figures illustrate the Met doubling down on its fixation with linking rap videos and violent crime – and reveal the willingness of YouTube to work with the police in taking down videos. 

The Met has been actively going after drill rappers since September 2015, when it launched Operation Domain – the name given to the team responsible for monitoring YouTube for what the force described as “videos that incite violence”.

The relationship between the police and YouTube is described by the Met as a “collaboration” and “enhanced partnership working”. Since 2018, when the Met began working directly with YouTube, the number of referrals and removals has increased sharply.

A YouTube spokesperson told VICE: “At YouTube we are deeply committed to helping music of all genres grow and thrive. While YouTube is a platform for free and creative expression, we strictly prohibit videos that are abusive or that promote violence. We work closely with organisations like the Metropolitan police and National Crime Agency to understand local context. We’re committed to continuing and improving our work on this issue to make sure YouTube is not a place for those who seek to do harm.”

The speed with which videos are removed – often within a matter of hours – suggests not only how closely online activity is being monitored, but how responsive YouTube is to the Met’s requests.

Operation Domain’s footprint increased just as the UK’s homegrown rap scenes cemented themselves on YouTube – a longtime incubator and distribution hub for UK rap artists – and in the charts too, where artists like Headie One, Tion Wayne, Digga D and Central Cee make regular appearances in the Top 10.

In June 2019, Operation Domain was folded into a new online surveillance initiative called Project Alpha. With £4.8m in funding from the Home Office to date, Project Alpha maintains and monitors a surveillance database spanning 34 categories across a range of social media platforms. Since 2020, 1,006 rap videos have been included in the database.

The Met says its Project Alpha officers have been given “trusted flagger” status by YouTube to “[ensure] that harmful material is removed quickly from the platform”.

These officers, the Met told VICE, are “street-wise” and “have previous experience of working in gang units across the capital” and possess “extensive insight into gangs, understand the slang and colloquial language used and can spot emerging threats”.

The force is less clear on the relationship between music and gangs, saying only that “the Project Alpha team continues to work to understand the reality of the links between online activity and ‘real world’ offline offending” and that “it does not seek to suppress freedom of expression through any kind of music”.

Woman walking into Newham Council building in London
In the run-up to the 2012 London Olympics, Newham Council ordered the removal of 76 videos from YouTube. Photo: Simon Dawson/Bloomberg via Getty Images

This isn’t the first time that London authorities have tried to conflate music with gang activity. Under a 2012 move called Operation New Hampshire, Newham Council ordered the removal of 76 videos from YouTube – including WoodGrange E7’s “Who’s That Click”, a flip of Eve’s “Who’s That Girl?” – in an effort to burnish the host borough’s reputation in the run-up to that year’s Olympics.

According to Newham Council, in a response obtained via Freedom of Information, there was no projected end point for the 2012 work. No metrics were used to evaluate the benefits of the project. “The successful outcome of removal of the videos is self-evident,” the council said at the time.

When a music video is taken down at the Met’s request, the artist in question typically has little say in what happens. According to music industry professionals familiar with the procedure, it goes something like this: the uploader, often a third-party channel, will receive an email from the Met informing them of the force’s intention to refer a video to YouTube for takedown. The email will include some detail as to why the video is being flagged. At this point, the uploader will temporarily remove the video by changing its listing to private.

The email is then forwarded to the artist or their management. They are free to contest the Met’s decision, but many say the situation is stacked against them. The police are unlikely to back down, and if the request is ignored, it can then be forwarded to YouTube – at which point a “strike” will be registered against the channel.

One strike incurs certain limits on the channel, such as barring new uploads or live streams for a week. A second strike within the same 90-day period sees those limits extended to two weeks. Three strikes in that period, and you’re out: YouTube deletes the channel. For uploaders with established audiences stretching into the millions, in some cases providing significant income, this is a serious threat.

What’s unclear is whether all uploaders receive direct correspondence from the Met, or whether YouTube’s strike notifications indicate that a video has been targeted at the force’s request.

Initially, YouTube would simply issue strikes. After channel owners lobbied the platform to alter its practices, the Met instead began providing YouTube with lists of videos to take down – the volume of which meant popular channels were constantly on the brink of a three-strike deletion. YouTube then went back to the strike system, before channel owners pointed out that the company was at risk of driving out some of its most popular users on behalf of the police. Now the Met sends emails to some channels – but not all of them, according to some artists who have simply received strike notices.

Artists whose videos are taken down will sometimes respond by uploading more heavily censored versions, but this can be a case of guesswork. Plus, “by the time you re-upload it, it’s not going to have as much views or streams as the last time round,” says Toby Egekwu, better known as TK. He co-founded Finesse Foreva, the record label and management agency that represents Skengdo & AM, and works closely with young and unsigned drill and rap artists.

Drill musicians Skengdo & AM in grey suits at an industry event.
Finesse Foreva founders TK (right) and SK with Skengdo & AM (centre). Photo: Tabatha Fireman/Getty Images for YouTube

These takedowns have a chilling effect on musical output, TK explains. Since Operation Domain activity ramped up in 2018, popular channels have responded by advising artists, managers and labels to obscure or censor references to names, places or content that could be deemed offensive.

Censorship, both self-imposed and otherwise, is leaving its mark on the music itself, increasingly woven into the genre’s sonic landscape. Dense slang and sound effects, like the Sonic the Hedgehog rings on Peckham drill crew Zone 2’s “No Censor” or the skiddy vocal reverses that pepper tracks in PressPlay’s popular Plugged In freestyle series, have become symbols of creative resilience admired by music insiders like TK.

Another independent label rep, speaking on condition of anonymity for fear of reprisals from the Met, says that music video takedowns can have something of a Streisand Effect – when attempts to hide or censor something end up attracting more attention than the thing might otherwise have received. Banned videos being re-uploaded to Pornhub, for instance, becomes a story in itself. 

The label rep suggests that some artists are now using the inevitability of a video takedown to drive more listeners to Spotify, where they receive a marginally better payday for their streams. (It’s not clear if Project Alpha also extends to streaming platforms like Spotify.) For most, though, having a video taken down represents a significant financial hole – one that upcoming artists in particular struggle to pull themselves out of.

At the same time, drill lyrics and music videos have become an increasing presence in UK courtrooms. The circumstances under which they arrive there, however, have been questioned by legal experts.

A February 2021 report by the human rights charity JUSTICE described “the misuse of drill music to secure convictions” as “one of the most profound examples” of systemic racism in the UK.

A recent analysis of more than 30 appeal cases in which rap materials had been admitted as evidence, carried out by an LSE Associate Professor of Law, revealed how prosecutors can lean on narrow tropes and assumptions about the music to “help build a gang narrative”.

These arguments, its author Dr Abenaa Owusu-Bempah notes, are often assisted by police officers presenting themselves as experts on rap lyrics and music videos. Both their expertise and their neutrality in such cases have been called into question.

“In the dozens of police reports I’ve seen, I have yet to come across any that really recognise that rap is a creative or enterprising activity, nor that it’s fictional and follows formulas. It’s always the criminological lens and the literal reading,” says Eithne Quinn, a professor at the University of Manchester and co-founder of the Prosecuting Rap research project. “Such a framing is crude but – because of powerful, pre-existing stereotypes – it’s effective.”

Keir Monteith QC, a leading barrister with the Garden Court Chambers Crime Team, has also highlighted the limited expertise of police officers when it comes to analysing rap lyrics – concluding that “directly or indirectly, the state relies on racist stereotypes” in its pursuit of convictions.

For TK, it’s absurd that YouTube, let alone judges and juries, would be reliant on a police officer’s rap analysis. “It’s tapped,” he says. “How can they decipher what that art means? Even me, sometimes I have to ask the artist what this new word or that phrase means.”

Increased awareness of this issue has pushed defence teams into action. Professor Quinn has been coordinating a network of artists, music industry professionals, academics and journalists (full disclosure: this includes myself) to respond to defence lawyers’ requests for expert witnesses.

“Most of the recent cases involving rap evidence are being prosecuted as gang-related. This is a way of sweeping young people on the periphery of violent incidents into charging decisions for the most serious crimes,” says Quinn, adding that investigators “trawl widely for rap evidence from the digital lives of young suspects”.

Countering these claims can feel like an uphill struggle. The more success prosecutors have with entering lyrics and music videos into evidence, and obtaining convictions, the more they pursue this course of action. It would appear that the same holds true for the Met’s cosy – and opaque – relationship with YouTube.

@wf_pritchard