London/Washington, D.C. – 9 June 2022: Our new investigation, in partnership with legal non-profit Foxglove and independent researcher Dagim Afework Mekonnen, shows that Facebook continues to approve violent ads calling for killing, starvation and ethnic cleansing in Ethiopia amid the country’s ongoing civil war. Despite Facebook’s commitments to prevent hate speech on its platform, it continues to fail to do so.

The investigation shows that Facebook approved for publication 12 real-life inflammatory and violent Amharic-language ads – all of which had previously been posted on Facebook, and many of which the company had removed for violating its hate speech policy.

Facebook has been accused by whistleblower Frances Haugen of “fanning ethnic violence” for facilitating hate speech in Ethiopia – a country where thousands of people have been killed, millions have been displaced, and there is credible evidence of war crimes having been committed since civil war broke out in 2020. Facebook itself has designated Ethiopia as one of its highest-priority countries for the safety and security measures it says it uses to detect and root out hate speech.

When asked to comment on the investigation’s findings, Facebook told Global Witness that the ads should not have been approved and said that it has invested heavily in safety measures in Ethiopia, adding more staff with local expertise and building its capacity to catch hateful and inflammatory content.

Following this response, Global Witness submitted a further two hate speech ads to test whether Facebook had improved its detection of such content. Alarmingly, Facebook approved both.

“It is absolutely unacceptable that Facebook continues to approve ads inciting genocide in the midst of a conflict that has already taken the lives of thousands of people and forced millions more to flee their homes. Facebook falsely touts its ‘industry-leading’ safety and security measures to keep hate speech ads off its platform but our investigation shows that there remains a glaring gap in Ethiopia, which the company claims is one of its highest priority countries. This apparent lack of regard for the people using their platform has real-world deadly consequences,” said Ava Lee, Digital Threats to Democracy Team Lead at Global Witness. 

“Even after approaching Facebook with the findings of our investigation, they continued to allow ads inciting violence and genocide to be approved for publishing,” said Lee. “Facebook must do better.”

The hate speech examples we used are highly offensive, and we are therefore deliberately not repeating all of the phrases here. The sentences included violent speech directly calling for people to be killed, starved or ‘cleansed’ from an area, and dehumanizing speech comparing people to animals. Several of them amount to a call for genocide.

This investigation follows a March 2022 Global Witness investigation which showed that Facebook approved Burmese-language hate speech ads inciting violence and genocide against the Rohingya in Myanmar.

Dagim Afework, the independent researcher on the project, said “Facebook profoundly profits from the attention that harmful content brings, but vulnerable communities including those in Ethiopia are suffering the consequences. Clearly, Facebook lacks the incentive and the willingness to meaningfully address this issue. Therefore, there should be global regulatory pressure on the company so that it significantly invests in both automated and manual harmful content moderation. And this needs to be done now before more and more communities are damaged beyond repair.”

Rosa Curling, Director of Foxglove, a legal action non-profit which partnered in the investigation, said “When ads calling for genocide in Ethiopia repeatedly get through Facebook’s net – even after the issue is flagged with Facebook – there’s only one possible conclusion: there’s nobody home. Years after the Myanmar genocide, it is clear Facebook hasn’t learned its lesson. It prefers not to hire, train, and value enough people to cope with the sheer scale of violence in Ethiopian languages. Enough is enough. Facebook needs to vastly scale up its moderation operation and give moderators all the rights and support it would offer Facebook staff.”

Global Witness calls on Facebook to:

  • Properly resource content moderation in all the countries in which it operates around the world, including paying content moderators a fair wage, allowing them to unionize and providing psychological support.
  • Publish information on the steps it has taken in each country and for each language to keep users safe from online hate.
  • Publish the ‘multiple forms of human rights due diligence’ it has undertaken in Ethiopia, conduct an independent human rights review of its work in the country, as recommended by the company’s Oversight Board, and publish the findings.

Global Witness calls upon governments – notably the United States – to follow the EU’s lead in regulating Big Tech companies and enforcing meaningful oversight, including requiring platforms to assess and mitigate the risk that their services allow hate speech to flourish.

For a more detailed list of our recommendations, see our article.