Facebook’s policing of vitriol is even more lackluster outside the US, critics say

Digital activists around the world are urging Facebook to take seriously how its algorithm incites misinformation and ethnic violence

On a cloudy evening in Nairobi, Berhan Taye is scrolling through a spreadsheet in which she has helped document more than 140 Facebook posts from Ethiopia that contain hate speech. There are videos of child abuse, texts of hate speech against different ethnic groups, and hours-long live streams inciting hatred. These posts breach Facebook’s community guidelines in any context. Yet for Taye and her colleagues, this is what Facebook’s news feed has looked like for years in Ethiopia.

Because there aren’t enough content moderators focused on Ethiopia, it has been up to Taye, an independent researcher looking at technology’s impact on civil society, and a team of grassroots volunteers to collect and then report misinformation and hate speech to Facebook.

It’s dangerous work – people who put out the hate speech are organized – so volunteers are anonymous. They spend hours watching violent live streams and collating hateful content. It takes a toll on their mental health.

Once they send their report over email it can take a week for Facebook to respond – if they’re lucky – and sometimes 70% of the content will be removed, according to Taye. In some situations, the big tech company has come back to the activists requesting translations of the content. “Over and over again, we’re seeing they’re actively failing every time,” Taye says.

“They’re not willing to invest in human rights, invest in resources, and in languages that are not making them money.”

Facebook disputes that it does not crack down on abuse with the same intensity outside the US, saying it spends $13bn globally to tackle this in work that involves 15,000 people across dozens of languages.

Researchers like Taye say that’s not enough.

In June, Facebook reported it had removed a network of fake accounts in Ethiopia targeting domestic users ahead of the country’s elections.

Taye, however, said she has been in conversations with Facebook since 2018 about the situation in Ethiopia, a country where there has been ethnic cleansing, where armed conflict is escalating, and where Facebook is a crucial platform for information.

Now, Taye is calling for Facebook to release any human rights impact assessment reports it may hold on Ethiopia.

Like many digital activists around the world, Taye and her colleagues have been urging Facebook for years to take seriously how its algorithm incites misinformation, hate speech, and ethnic violence in non-English speaking regions.

It’s an issue Facebook whistleblower Frances Haugen highlighted in her testimony to the US Congress at the beginning of October, when she said Facebook’s system of content ranking had led to the spread of misinformation and hate speech.

Content ranking works by using machine-learning models to remove or demote bad content, but it is only trained for certain types of content. Haugen said Facebook knows: “Engagement-based ranking is dangerous without integrity and security systems.”

She added the problem was far worse in regions where posts are in languages other than English. She said the “strategy of focusing on language-specific, content-specific systems for AI to save us is doomed to fail”.


For digital rights activists, Haugen’s testimony in Congress came as no surprise. “We’ve been the victims of that,” Taye says. “It’s good for Americans to know. But we’ve been saying this. The first thing you see when you open your Facebook is the most heinous content.

“What do they think the Rohingya were saying? What was [Philippines journalist] Maria Ressa saying? Most Facebook users are not in America and Europe,” she says.

Haugen was the whistleblower who gathered documents that formed the Wall Street Journal’s Facebook Files investigation. The WSJ reported that one internal document revealed that Facebook’s work on misinformation in 2020 included 3.2m hours of searches, but only 13% of this was outside the US; more than 90% of Facebook users are outside the US. Facebook disputes the 13% statistic, which it says reflects just one program of many.

Networks of digital rights and human rights activists around the world have been pressing Facebook to release its reports and to run risk assessments before it enters new markets.

Eliška Pírková, the global freedom of expression lead of Access Now, a human rights organisation, called for human rights-centric regulation of online platforms such as Facebook. She said Facebook users needed to be protected by default from dark patterns – interface designs that nudge users towards certain behaviours.

Haugen’s testimony confirmed what civil society already knew, Pírková said, and revealed the company’s “inherent opacity and unwillingness to disclose information and how algorithms operate”.

“Civil society shouldn’t have to hold Facebook to account,” she said, adding engagement with the company had not been very meaningful and there had been no follow-up.

She pointed to Facebook’s moderation process during the events of May 2021, when Palestinians were evicted from Sheikh Jarrah in Jerusalem and Gaza endured an 11-day bombardment: mob violence against Palestinians was incited in WhatsApp groups, while pro-Palestine posts were removed from Facebook’s platforms.

If Facebook did not learn lessons from the past, it would be countries in the global south and historically oppressed and marginalized groups that would “pay the highest price for our mistakes”, she said.

Myanmar is an often cited case study when it comes to the catastrophic impact of disinformation and hate speech shared on Facebook. According to the UN, the country became a “textbook example of ethnic cleansing” when, from August 2017, more than 700,000 Rohingya were forced to flee violence in Rakhine state.

The country has seen a rapid rise in Facebook users: there were 1.2 million Facebook users in Myanmar in 2014, and by January 2019 there were 21 million. By January 2021 there were 23.65 million users, about 40% of the population.

Victoire Rio, a digital rights researcher focusing on Myanmar, said Haugen’s testimony shone a spotlight on the discrepancies between what Facebook does in the US and the “lack of action and intervention” in the rest of the world.

At the beginning of Facebook’s presence in Myanmar, there were only two Burmese moderators at Facebook. Now there are 120, according to Rio.

“The amount of investment that’s going into trying to clean up and sanitize the content that gets through in the US is just not there in other parts,” Rio said. “But it took a genocide, it took the UN calling them out on it, and took the US Congress calling them out on it, the western press calling them out on it, for us to finally be heard,” she said.

In a statement, a Facebook spokesperson said: “Our track record shows that we crack down on abuse outside the US with the same intensity that we apply to it within the US. We have invested $13bn globally to tackle this challenge and have 15,000 people reviewing content outside the US, covering more than 50 languages and working in more than 20 locations across the world.

“Our third-party fact-checking program includes over 80 partners who review content in more than 60 languages, with over 70 of those partners located outside of the US. We have also taken down over 150 networks seeking to manipulate public debate since 2017, and they have originated in over 50 countries, with the majority coming from or focused outside of the US.”


Aisha Gani in London

The Guardian
