DUBLIN — When Valera Zaicev began working in Dublin as one of Facebook’s moderators a couple of years ago, he knew he’d be looking at some of the most graphic and violent content on the internet.
What he didn’t know was that Facebook would be counting the seconds of his bathroom breaks.
“People have to clock in and clock out even when going to the toilet and explain the reason why they were delayed, which is embarrassing and humiliating,” Zaicev told VICE News.
Facebook, which outsources the majority of its content moderation to over 15,000 third-party contractors, didn’t always keep those employees on such a tight leash. When Zaicev, 33, joined Facebook’s moderation army in July 2016, he found a professional workplace where he felt he received in-depth training and excellent treatment.
But that all soon changed. As the number of moderators in the Dublin offices exploded — rising from 120 to over 800 in two years — the conditions deteriorated and training for moderators all but evaporated.
By 2018, the number of content moderators worldwide continued to grow into the tens of thousands, and Facebook began testing a “time management system” designed to monitor every single minute of their day — including lunch breaks, training, “wellness breaks” for counseling or yoga, and even the time they spent on the toilet, according to Zaicev and one current contracted employee, who did not want to be identified.
In the past few years, Facebook has tried to fight back against criticism over how it deals with horrific content on the platform. The company has spent at least half a billion dollars hiring human moderators, in addition to the algorithms that already police its pages.
“People have to clock in and clock out even when going to the toilet.”
With governments around the world stepping up regulatory oversight, the work these moderators do is essential. They’re the company’s first line of defense against horrific and illegal material, yet they say Facebook pays them poorly and fails to provide adequate support. The result has been a slew of mental health problems and, last month, the first of many expected lawsuits accusing the company of failing to provide a safe workplace.
Increasingly, Facebook moderators say their every move is monitored. When making decisions about content, for example, moderators have to follow an ever-changing document they call “the bible.” And each moderator is given a “quality score.”
“You’re allowed four or five mistakes a month — a 2% failure rate, 98% quality score,” said Chris Gray, a former Facebook moderator who worked at the company for 11 months. “So if you come in, and it’s Tuesday or Wednesday, and you’ve got five mistakes, you are fucked for the month, and all you can think about is how to get the point back.”
Gray is suffering from PTSD as a result of his time working as a Facebook moderator, and last month he filed a lawsuit in the Irish courts against the social network and CPL, one of a number of companies Facebook works with that employ thousands of human content reviewers in the U.S., Europe, and Asia. The lawsuit accuses the companies of causing “psychological trauma” through poor working conditions and exposure to graphic material.
Dozens if not hundreds of moderators are expected to file similar lawsuits. A source at Gray’s law firm, Coleman Legal Partners, told VICE News that new documents would be submitted to the High Court in Dublin this month.
CPL did not respond to VICE News’ questions about the claims made by current or former moderators, and Facebook declined to comment on making its moderators log every minute of their day.
“The well-being of those that review content for Facebook is and will remain our top priority,” Drew Pusateri, a Facebook spokesman, told VICE News.
But moderators said that’s simply not the case. While Facebook has made counseling sessions available, many moderators feel they simply can’t take advantage of them, because of the constant monitoring, ever-changing policies, and unrelenting pressure to meet quality standards.
“There are thousands of moderators right across the EU, and all of them are working in conditions that are unsafe for their mental health and, in our view, unlawful,” said Cori Crider, the director of Foxglove, a U.K.-based advocacy group assisting in the lawsuits. “European laws protecting workers are strong, and we believe Facebook and the rest of the social media firms need to do far more to create a safe workplace for moderators.”
Following “the bible”
Facebook moderators review and remove content flagged by billions of Facebook users around the globe as well as the company’s artificial intelligence algorithms. The vast majority of the content is relatively mundane, but some is deeply disturbing, like sex abuse, bestiality, and brutal violence.
Every day when moderators clock in, they’re given what’s called a game plan: a schedule that lays out how many pieces of content they have to address, and from which queues, including hate speech, harassment, and even threats of self-harm.
Gray said CPL also gives moderators an average handling time, or the amount of time they’re allowed to spend on each piece of content. The time differs depending on the type of content being reviewed, but Gray said it was typically under one minute.
While some pieces of content are obvious violations and demand little mental energy, others aren’t so clear-cut and require consulting Facebook’s Implementation Standards, what some moderators refer to as “the bible.”
The 10,000-word document has 24 different categories, broken down into three groups: harmful behavior, sensitive content, and legal violations, according to a copy published in March 2018 and reviewed by VICE News.
According to the rules, moderators can choose to either ignore the content or delete it. If they choose to delete it, they have to describe the content in granular detail and tag it in multiple categories and sub-categories. The information is then fed back into Facebook’s algorithms. That means the moderators are, in effect, training a piece of software that could someday replace them.
While “the bible” is a helpful guide, it’s updated every two weeks and written in vague generalizations so it can cover the wide gamut of content posted on Facebook.
“You’ve got to have generic rules that can be applied easily by everybody in all kinds of situations,” Gray said.
In a lot of situations, moderators simply don’t know what action they should take.
“I’ve had days when there are 10 of us standing around looking at a screen, and someone has got the policy documents open on another screen, and we’re looking and we’re arguing about how to apply the policy to this video,” Gray said.
And in most cases, moderators don’t have the option to escalate the problem to a more senior Facebook employee. No matter what, they have to make a decision, and if it’s wrong, that impacts their overall quality score.
Unrealistic quality standards
A team of auditors, who review a sample of each moderator’s decisions every month, determines whether or not a moderator got a call wrong. But the auditors are just other moderators who happen to have above-average quality scores.
Moderators get one chance to appeal an auditor’s decision, to the auditor who made it, and the appeal has to be filed within 48 hours. All the moderators VICE News spoke to said they have to appeal decisions to maintain their 98% quality score.
Auditors mark moderators down for wrong decisions: leaving content on Facebook that should have been deleted, or deleting content that should have stayed up. But the process also allows auditors to penalize moderators who have taken the right action for the wrong reasons.
“You’ve got five mistakes, you are fucked for the month, and all you can think about is how to get the point back.”
But auditors are far from infallible and are often unable to articulate why they chose the outcome they did. When determining whether a post that advised a user to “probably kill yourself” should remain online or not, one auditor couldn’t give a definitive answer, according to screenshots of a discussion with a moderator seen by VICE News.
“[Auditors] know nothing about your market since they are from different countries and speak different languages,” Zaicev said.
But Facebook isn’t just failing to employ adequately skilled staff for the moderation process. The company has also allowed moderators’ personal details to fall into the wrong hands.
Zaicev was among more than 1,000 Facebook moderators whose identities were accidentally revealed to the people whose accounts they were blocking. In Zaicev’s case, his information was revealed to members of the Donetsk People’s Republic (DPR) and Luhansk People’s Republic (LPR), two pro-Russian separatist groups operating in eastern Ukraine.
Facebook apologized to Zaicev and promised to better protect its employees — just the latest promise of many the social network has recently made in the wake of multiple scandals.
Facebook policies
Despite Facebook’s insistence that it has improved conditions for workers, the new time management policy and the ongoing pressure to meet the quality score have further eroded the time employees can use to de-stress after viewing traumatizing content.
Facebook’s new time management tool, which forces moderators to log every minute of their shifts, has only added to the already stressful environment. The tool, which was rolled out to all contractors this year, is strict enough that it logs moderators out whenever they step away from their workstations. They then have to explain that gap in production to their managers.
The time management system also makes it hard for moderators to actually use the wellness programs that might offset the traumatic content they have to see on a daily basis. In many cases, instead of going to counseling or yoga, moderators end up spending that time arguing with auditors about overturning a decision or reviewing the latest version of “the bible.”
One current Facebook moderator said those who cover a busy market, such as the English-language region, are “not exactly encouraged” to take advantage of the wellness options, so it can be tough to get through the day without feeling mentally exhausted.
And the workload varies widely by market. Moderators in countries like the Philippines and Thailand, for example, have said they review as many as 1,000 pieces of content a day, while a moderator in Europe might see fewer than 150 in more difficult queues, such as child exploitation.
Managers aren’t always able to spot when moderators are having problems, whether because of the content they’re seeing, the stress they’re under, or a combination of both.
One moderator who worked at CPL for 14 months in 2017 and 2018 told VICE News that he decided to leave the company when a manager sanctioned him while he was having a panic attack at his computer. He’d just found out that his elderly mother, who lived in a different country, had had a stroke and gone missing.
“On the day I had the most stress in the world, when I think I might lose my mother, my team leader, a 23-year-old without any previous experience, decided to put more pressure on me by saying that I might lose my job,” said the moderator, who did not want to be identified.
Treated “like a criminal”
When the stress finally forces moderators to leave the company, many are afraid to speak out against Facebook or CPL because of nondisclosure agreements (NDAs) they sign when they start working there. Zaicev said that when he left, CPL forced him to sign a second NDA.
“I refused to sign this and was pressured into doing so,” he said. “After many refusals, I was escorted out of the building like a criminal before my last shift and before saying goodbye to my colleagues.”
The company also warns employees not to speak to the press. VICE News saw a copy of an email sent to CPL employees in Dublin in 2018 alerting them to an upcoming undercover report about the company from Channel 4. The note suggested talking points for employees to address any questions they received. “You might like to respond with something like: ‘Safety of the people who use Facebook is one of our top priorities,’” one part of the document reads.
Facebook also urged moderators to remove any references to Facebook on their LinkedIn profiles in case journalists contact them about controversies, according to Zaicev.
Facebook told VICE News it advises against referencing the company online “for reasons related to safety and security,” citing the shooting at YouTube’s headquarters in San Bruno, California, in April 2018.
Despite Facebook’s push for secrecy, Zaicev is now among dozens of current and former moderators who have contacted Coleman Legal Partners in Dublin about bringing legal action against the company for failing to provide a safe work environment. Unlike in a recent class action case in the U.S., each moderator in Ireland has to file a separate case.
“The happiest people are the people who are away from Facebook. The more unhappy you are in life, the more you are going to spend on Facebook,” one former moderator who is preparing a legal action against Facebook told VICE News. “And we spent the whole fricking day on Facebook. We can probably guess that it is not good for you.”
Cover image: George Mdivanian / EyeEm via Getty Images