Facebook Said It’d Stop COVID Anti-Vaxxers, But It’s Letting Them Run Wild

COVID misinformation is flourishing on Facebook, especially in languages other than English.
The Patriots party supporters demonstrate for Freedom to protest against covid-19 restrictions and lockdown in Paris, France, on April 10, 2021. (Denis Prezat/Avenir Pictures/Abaca/Sipa USA)(Sipa via AP Images)
Unraveling viral disinformation and explaining where it came from, the harm it's causing, and what we should do about it.

Facebook has become the de facto global social network, with users in almost every country in the world. The unprecedented reach the company has achieved has made its shareholders very rich, and its CEO arguably the most powerful man in the world.

But despite most of its users living outside the U.S. in non-English-speaking countries, the company has repeatedly shown that its focus is almost entirely on the American market. The latest example of that is how it deals with COVID-19 misinformation.


Facebook’s efforts to eradicate COVID-19 and anti-vaxx misinformation in the U.S. have not been a massive success by any measure, but the company has taken some steps in the right direction and has clamped down on this content much harder and faster than it did in the past.

But Facebook has failed the people who don’t speak English, and that’s the vast majority of the world.

In Europe, Facebook failed to act on more than half of the COVID-19 misinformation posted in major non-English languages that independent fact-checkers had flagged as misleading or outright false, according to a new report from the digital rights advocacy group Avaaz.

The researchers found that when misinformation was posted in French, Portuguese, Spanish, or Italian, Facebook took action in just 44% of cases, compared to 74% for English-language content in the U.S.

But Facebook doesn’t even provide the same protections for English-language speakers elsewhere in the world as it does for its U.S. users. Avaaz found that Facebook failed to deal with English-language misinformation in the U.K. and Ireland 50% of the time.

This means that Europeans are at greater risk of seeing and interacting with COVID-19-related misinformation than those in the U.S.

And depending on where you live in Europe, the risks could be even greater. Italian speakers, for example, are the least protected: Avaaz found that Facebook failed to act on 69% of the Italian-language content it examined.


While Spanish speakers were the best protected of this group (Facebook acted on 67% of Spanish-language content), that’s still less protection than what’s provided to Americans.

The report also found that even when Facebook did label or remove misinformation in non-English speaking markets, it took longer to do so: up to 30 days on average, compared to 24 days for English-language content.

Possibly the most concerning finding is that after a year of big promises to tackle COVID-19 disinformation, the likelihood that Facebook will take action on fact-checked misinformation is actually worse now than it was a year ago: 55% in 2021 versus 56% in 2020.

Europe’s most popular misinformation narrative is about the purported side effects of vaccines. An article that falsely claims Bill Gates said that the COVID-19 vaccine could kill one million people received tens of thousands of interactions on Facebook without the company doing anything about it. The post remains live on the platform at the time of publication.

Facebook disputed Avaaz’s findings, saying it is “taking aggressive steps to fight harmful Covid-19 misinformation in dozens of languages.”

Facebook said the Avaaz report “is based on a small sample of data and does not reflect the work we've done to provide authoritative information to people.” 


Avaaz countered by pointing out that Facebook doesn’t allow for the collection of more data.

“The analysis in this report is based on a sample of misinformation detected by our investigation team that was also fact checked by independent fact checkers,” the report’s author said.

“Facebook is not transparent and does not provide data, which would allow for a more detailed analysis on the full scale of misinformation on its platform.”

The social network has struggled for years to deal with disinformation and hate speech in non-English speaking markets. A recent Guardian series highlighted how governments and authoritarian leaders have taken advantage of Facebook’s increasingly lax policy enforcement efforts.

In an attempt to force Facebook and others to do more to tackle disinformation, the European Union created a Code of Practice on Disinformation in 2018, and last year it created the COVID-19 Disinformation Monitoring Programme.

But both efforts rely on the platforms to self-report any problems, and to date these efforts have not produced the desired results. A revised version of the Code of Practice is set to be introduced in the coming weeks, and Avaaz is calling on EU officials to scrap self-regulation and introduce an independent regulator to monitor the scale of disinformation on social networks.

“It’s time the EU stopped counting on promises from the tech platforms. We're one year into this infodemic and the numbers in our report paint a picture of consistently poor performance,” Luca Nicotra, campaign director at Avaaz, said in an emailed statement.

“Self regulation has failed. The new Code of Practice on Disinformation is the EU's last chance to ensure platforms can no longer mark their own exams. Instead the EU can require measurable impact by enforcing solutions that work; detoxifying their algorithms and informing EU citizens when they've been victimized by misinformation.”