Facebook accused of ignoring Covid-19 misinformation in Europe

Facebook is said to act on less than half of the misinformation fact-checked in non-English European languages, half the rate at which it acts on English-language content.

Campaign group Avaaz analyzed misinformation about Covid-19 posted between December 7, 2020 and February 7, 2021 that had been verified by Facebook’s third-party fact-checking partners or other reputable organizations. It selected material that was rated “false” or “misleading” and could cause public harm.

It found that Facebook took no action on 56 percent of this misinformation in major non-English European languages, compared with only 26 percent of the English-language content flagged by U.S. fact-checkers.

“Facebook has a big blind spot in Europe for Covid/anti-vax misinformation,” says senior global campaigner Andy Legon. “Just as the EU is facing a deadly third wave.”

According to the report, Italian speakers are the least protected from misinformation, with no action taken on 69 percent of Italian-language content. Spanish speakers were the best protected, with only 33 percent of Spanish-language misinformation going unaddressed.

On average, Facebook took almost a week longer to label false content in languages other than English: 30 days, compared with 24 days for false content in English.

The biggest misinformation topic was vaccination side effects – including the claim that Bill Gates had warned of hundreds of thousands of deaths. The second was a false claim about official measures or warnings, while the third most popular claimed that masks were either dangerous or useless.

Although Facebook says it applies the same approach to misinformation regardless of language, Avaaz found that where posts were translated into multiple languages, the English version was far more likely to be removed.

Avaaz is urging the EU to do more to force Facebook to eradicate misinformation related to Covid-19 and vaccines in Europe.

“The current EU Code of Practice on Disinformation does not cover the shortcomings identified in this report,” it says.

“That’s why we urgently need a revised version that pushes social media giants to reveal the amount of misinformation on their platforms and set clear targets for their reduction, which is overseen by an independent regulator.”

And that appears to be underway, with Vera Jourova, European Commission Vice-President for Values and Transparency, saying: “Despite the improvements, FB and other platforms need to do more to ensure that their policies are vigorously implemented around the world. We are therefore working to revise the Code of Practice against #disinformation.”
