UPDATE: Nov. 10, 2020, 4:18 p.m.: This story has been updated with Facebook’s response.
CORK, Ireland — A law firm representing Facebook moderators who are suing the tech giant for “psychological trauma” has been blocked from placing Facebook ads aimed at convincing more content reviewers to come forward and take legal action.
Coleman Legal, an Irish law firm based in Dublin, has already filed lawsuits against Facebook on behalf of content moderators who have been diagnosed with PTSD as a result of repeated exposure to graphic and violent content.
Now, they want to speak to more of the thousands of moderators who are employed by third-party companies to filter out the most heinous content posted on Facebook. And they want to use Facebook ads to find them.
The firm made repeated attempts to place a “lead generation campaign” on Facebook’s ad platform, but each attempt was rejected with no explanation beyond a notice that the ad breached the company’s “Personal Attribute Policy.”
The ads, which included a link to a website where moderators could register their interest in having their case evaluated, were rejected despite containing text identical to that of a previously approved ad:
“Over the last few years, there has been an unprecedented increase in Content Moderators being diagnosed with PTSD. This significant rise has been imputed by daily exposure of disturbing content, bestiality, drug abuse, animal cruelty, violence, and more which is shockingly enough, a crucial part of their job requirement. Get all concerns addressed with an experienced solicitor by filling the Free Case Evaluation Form. Coleman Legal has a dedicated team of experienced solicitors who can offer help, advice, and tailored guidance. Compensation for this claim may be the first step on your road to recovery.”
The ad was rejected a total of six times over the course of a week at the end of September and the beginning of October. Despite multiple appeals, Facebook did not give the firm a detailed explanation of why it rejected the ad, but solicitor David Coleman believes the reason is obvious.
“Facebook is dealing with this matter in a very commercial way, and they are clearly seeking to limit the number of moderators who will seek redress,” Coleman told VICE News. “This isn’t just a commercial matter, this is a social and a personal matter and deeply personal to these people.”
However, after this story was published, Facebook told VICE News that it had rejected the ads because they “assert or imply the reader has a medical condition” — which goes against Facebook’s policies. It also said the previous ad Coleman Legal had run was approved in error and should not have been allowed.
More than two dozen current and former Facebook moderators have already signed on for a test case with Coleman Legal, according to a source with knowledge of the matter, who was granted anonymity as they were not authorized to speak on the record. One of them is a full-time Facebook employee, potentially exposing the company to a much greater financial risk than lawsuits brought only by moderators, who are low-paid contract workers.
While Facebook has settled a case with moderators in California for $52 million, the source speaking to VICE News said Facebook may not settle in Ireland because Europe’s much stricter workplace-safety laws could result in much higher payouts to claimants than lawsuits in the U.S.
Coleman has appealed to Facebook’s new Oversight Board and written to Facebook’s head of global affairs, Nick Clegg, to highlight the issue of the rejected ad. Included in the letter to Clegg were redacted medical reports from two of the former content moderators who are pursuing a case against Facebook.
Both reports — which were reviewed by VICE News — were written by consultant Dr. Ann Leader, and include damning testimony about the impact that content moderation has had on the lives of the moderators and their families.
“It became more and more difficult to pass the high standard of performance set by Facebook for joining these queues. She had to view graphic sexual content and watch violence, bestiality, pedophilia, and other horrible content. She told me she recalls watching a boy aged about nine or ten ‘putting his willy into a chicken’. She also watched children having sex with children and people having sex with animals. She watched the rape of young girls and other gruesome and disturbing images. She saw beheadings and people being set on fire,” Leader said in her report about a 37-year-old woman, concluding the woman had suffered “a very significant adjustment disorder with anxiety and post-traumatic stress disorder” as a result of her work.
Coleman told VICE News that the ad was attempting to reach people working as content moderators who may be similarly affected but don’t know about the legal avenues for redress available to them.
One current Facebook moderator told VICE News that the rejection of the ad was a “prime example of censorship,” while much more troubling content remains on the platform.
“This is clearly a case of that where the intention of the published ads was to promote the awareness of workplace safety, a genuine one, but some reviewers, likely Facebook’s direct employees, decided to take them down and constantly reject it as it goes against their interest,” said the moderator, who was granted anonymity to speak openly.