A New Lawsuit Accuses Meta of Inflaming Civil War in Ethiopia

The suit claims the company lacks adequate moderation to prevent widespread hate speech that has led to violence and death.
A lawsuit filed against Meta in Kenya alleges that the company has allowed hate speech to run rampant on the platform, causing widespread violence in Ethiopia. Photograph: Yasuyoshi Chiba/Getty Images

On November 3, 2021, Meareg Amare, a professor of chemistry at Bahir Dar University in Ethiopia, was gunned down outside his home. Amare, who was ethnically Tigrayan, had been targeted in a series of Facebook posts the month before, alleging that he had stolen equipment from the university, sold it, and used the proceeds to buy property. In the comments, people called for his death. Amare’s son, researcher Abrham Amare, appealed to Facebook to have the posts removed but heard nothing back for weeks. Eight days after his father’s murder, Abrham received a response from Facebook: One of the posts targeting his father, shared by a page with more than 50,000 followers, had been removed.

"I hold Facebook personally responsible for my father's murder," he says.

Today, Abrham, as well as fellow researcher and Amnesty International legal adviser Fisseha Tekle, filed a lawsuit against Meta in Kenya, alleging that the company has allowed hate speech to run rampant on the platform, causing widespread violence. The suit calls for the company to deprioritize hateful content in the platform’s algorithm and to add to its content moderation staff.

“Facebook can no longer be allowed to prioritize profit at the expense of our communities. Like the radio in Rwanda, Facebook has fanned the flames of war in Ethiopia,” says Rosa Curling, director of Foxglove, a UK-based nonprofit that tackles human rights abuses by global technology giants. The organization is supporting the petition. “The company has clear tools available—adjust their algorithms to demote viral hate, hire more local staff and ensure they are well-paid, and that their work is safe and fair—to prevent that from continuing.”

Since 2020, Ethiopia has been embroiled in civil war. Prime Minister Abiy Ahmed responded to attacks on federal military bases by sending troops into Tigray, a region in the country’s north that borders Eritrea. An April report released by Amnesty International and Human Rights Watch found substantial evidence of crimes against humanity and a campaign of ethnic cleansing against ethnic Tigrayans by Ethiopian government forces.

Fisseha Tekle, Amnesty International's lead Ethiopia researcher, has further implicated Facebook in propagating abusive content, which, according to the petition, endangered the lives of his family. Since 2021, Amnesty and Tekle have drawn widespread rebuke from supporters of Ethiopia’s Tigray campaign—seemingly for not placing the blame for wartime atrocities squarely at the feet of Tigrayan separatists. In fact, Tekle’s research into the countless crimes against humanity amid the conflict fingered belligerents on all sides, finding the separatists and federal Ethiopian government mutually culpable for systematic murders and rapes of civilians. Tekle told reporters during an October press conference: “There’s no innocent party which has not committed human rights violations in this conflict.”

In a statement Foxglove shared with WIRED, Tekle spoke of witnessing “firsthand” Facebook’s alleged role in tarnishing research aimed at shining a light on government-sponsored massacres, describing social media platforms perpetuating hate and disinformation as corrosive to the work of human rights defenders.

Facebook, which is used by more than 6 million people in Ethiopia, has been a key avenue through which narratives targeting and dehumanizing Tigrayans have spread. In a July 2021 Facebook post that remains on the platform, Prime Minister Ahmed referred to Tigrayan rebels as “weeds” that must be pulled. However, the Facebook Papers revealed that the company lacked the capacity to properly moderate content in most of the country’s more than 45 languages.

Leaked documents shared by Facebook whistleblower Frances Haugen show that parent company Meta's leadership remained well-informed of the platform's potential for exacerbating political and ethnic violence throughout the Tigray war, earning Ethiopia special attention at times from the company's premier risk and response team. By at least 2021, the documents show, conflict in the country had raised enough alarms to warrant the formation of a war-room-like operation known as an IPOC, a process Facebook created in 2018 to respond rapidly to political "crisis moments."

Relative to its usual content moderation processes, IPOC is viewed internally as a scalpel, deployed not only to anticipate emerging threats but also to triage cases of "overwhelming abuse" spurred on by political flash points. This includes the use of so-called "break the glass" measures: dozens of "levers" IPOC teams can deploy during exceptionally inciteful events to quell spikes in hate speech on the platform. In the US, for instance, these flash points included the November 2020 election and the subsequent attack on the US Capitol.

In testimony before the US Senate last fall, Haugen likened the violence in Ethiopia to the genocide of more than 25,000 Rohingya Muslims in Myanmar, atrocities that the United Nations Human Rights Council, among others, has condemned Facebook for its role in instigating. "What we saw in Myanmar and are now seeing in Ethiopia are only the beginning chapters of a story so terrifying no one wants to read the end of it," Haugen, a former Facebook product manager, told lawmakers.

As late as December 2020, Meta lacked hate speech classifiers for Oromo and Amharic, two of the major languages spoken in Ethiopia. To compensate for its inadequate staff and the absence of classifiers, Meta’s team searched for other proxies that would allow it to identify dangerous content, a method known as network-based moderation. But the team struggled: for reasons that weren't immediately clear, Ethiopian users were far less likely to perform the actions Facebook had long used to help detect hate speech, such as reacting to a post with large numbers of "angry" faces. One internal proposal suggested ditching this model entirely and replacing it with one that gives greater weight to other "negative" actions, such as users un-liking pages or hiding posts. It's not clear from the documents whether the proposal was accepted.
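To make that approach concrete, here is a minimal sketch of how proxy-based scoring of this kind could work in principle. It is purely illustrative: the signal names, weights, and threshold are assumptions invented for the example, not Meta's actual system, which the leaked documents do not describe at this level of detail.

```python
# Toy illustration only, not Meta's system. Every signal name, weight, and
# threshold here is an assumption made up for this sketch.
from dataclasses import dataclass

@dataclass
class PostSignals:
    angry_reactions: int  # "angry" face reactions left on the post
    unlikes: int          # users un-liking the page after seeing the post
    hides: int            # users hiding the post from their feed
    views: int            # total impressions, used to normalize the counts

def proxy_risk_score(s: PostSignals,
                     w_angry: float = 1.0,
                     w_unlike: float = 2.5,
                     w_hide: float = 2.0) -> float:
    """Combine weighted rates of negative engagement into a single score."""
    if s.views == 0:
        return 0.0
    return (w_angry * s.angry_reactions
            + w_unlike * s.unlikes
            + w_hide * s.hides) / s.views

def flag_for_review(s: PostSignals, threshold: float = 0.05) -> bool:
    """Queue a post for human review when its proxy score crosses the threshold."""
    return proxy_risk_score(s) >= threshold

# Example: 40 angry reactions, 15 un-likes, and 25 hides across 1,000 views
# yields a score of 0.1275, so this post would be flagged for review.
post = PostSignals(angry_reactions=40, unlikes=15, hides=25, views=1000)
print(proxy_risk_score(post), flag_for_review(post))
```

In a scheme like this, shifting weight away from "angry" reactions and toward un-likes and hides, as the internal proposal reportedly suggested, is just a matter of changing the weights; the harder problem the documents describe is that no proxy behaves the same way across countries and languages.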

In its 2021 roadmap, Meta designated Ethiopia as a country at “dire” risk of violence, and in an assessment of the company’s response to violent and inciting content, it ranked its own capacity in Ethiopia as a 0 out of 3. In another document, a Meta staff member acknowledged that the company lacked “human review capacity” for Ethiopia in the run-up to the country’s elections.

The petitioners have asked the high court to issue a declaration naming Meta responsible for violating a slate of basic rights guaranteed under Kenya’s Constitution of 2010: the right to freedom of expression and association; the right not to be subjected to violence or have information about one’s family or private affairs unnecessarily publicized; and the right to equal protection under the law, among others. Moreover, the petitioners have asked the court to order the establishment of a victims' fund of over $2 billion, with the court itself disbursing the funds on a case-by-case basis. Lastly, they’ve asked the court to compel Facebook to declare that its algorithms will no longer promote inciteful, hateful, and dangerous content and will demote such content wherever it is found, in addition to rolling out new crisis mitigation protocols “qualitatively equivalent to those deployed in the US” for Kenya and all other countries whose content Meta moderates from Nairobi.

“Kenya is the content moderation hub for posts in Oromo, Tigrinya, and Amharic. These are the only three Ethiopian languages, out of the 85 spoken in the country, that Facebook’s current content moderators can even attempt to cover,” says Curling. “There are currently 25 Facebook content moderators working on Ethiopian-related content for a country of 117 million people. Decisions by these individuals, forced to work in awful and unfair conditions, about what posts are taken down and what remains online are taken in Kenya, and it is the Kenyan courts, therefore, that need to determine both men’s legal challenge.”

Meta spokesperson Sally Aldous told WIRED that hate speech and incitement to violence are against the company’s policies. “Our safety and integrity work in Ethiopia is guided by feedback from local civil society organizations and international institutions,” she says. “We employ staff with local knowledge and expertise, and continue to develop our capabilities to catch violating content in the most widely spoken languages in the country, including Amharic, Oromo, Somali, and Tigrinya.” 

Aldous did not address whether the company has more than 25 moderators focused on the country, or whether the company has an IPOC team focused on the conflict in Tigray beyond the country’s election cycles.

Meanwhile, in the wake of his father’s death, Abrham and his family were forced to flee their home. He is currently awaiting the outcome of his asylum claim in the United States.

“Every dream we had collapsed,” he says.