Kenyan High Court Greenlights Case Against Facebook Over Hate Speech and Violence

The Milimani High Court has given the green light for a legal case over murder, death threats and hate speech allegedly enabled by Facebook’s algorithm to proceed.

The case was filed by Ethiopian nationals Abrham Meareg and Fisseha Tekle, who accuse Facebook of exacerbating violence in Ethiopia, particularly during the Tigray conflict between 2020 and 2022.

The Katiba Institute, a Kenyan legal organization dedicated to upholding the constitution, is also a petitioner in the case.

This lawsuit was sparked by the tragic murder of Abrham’s father, Meareg Amare Abrha, a distinguished chemistry professor at Bahir Dar University. He was murdered in 2021 after posts revealing his home address and calling for his killing were published on Facebook.

Another victim is Fisseha, a former Amnesty International researcher who published independent reports on violence committed by all sides in the Tigray conflict.

Meta’s Jurisdictional Challenge Dismissed

Rather than addressing the core accusations, Facebook’s parent company, Meta, argued that Kenyan courts lack the jurisdiction to hear the case, claiming that Meta is not a Kenyan company and does not operate within the country. Meta insisted that any claims should be filed in U.S. courts.

However, the Milimani High Court ruled that the case rightfully falls under Kenyan jurisdiction. This decision aligns with previous rulings in two other cases against Meta, concerning the exploitation and unlawful dismissal of Facebook content moderators in Kenya, filed in 2022 and 2023.

In all instances, the Kenyan courts, including the Employment and Labour Relations Court and the Court of Appeal, dismissed Meta’s jurisdictional objections.

Legal Demands Against Meta

The petitioners seek substantial reforms and reparations from Facebook, arguing that its algorithm promotes violent content. Their demands include:

• A formal apology from Meta for the murder of Professor Meareg Amare Abrha.

• The establishment of a restitution fund for victims affected by hate speech and violence incited on Facebook. Initially, they are seeking 250 billion KSH ($2 billion) for harm caused by organic content and an additional 50 billion KSH ($400 million) for harm from sponsored posts.

• Modifications to Facebook’s algorithm to prevent the amplification of hate speech and incitement to violence.

• The demotion of violent content, including death threats and doxing, similar to the emergency measures Facebook implemented after the January 6, 2021, U.S. Capitol riots.

• The hiring of additional content moderators to prevent further harm, particularly in East and Southern Africa, with a focus on Ethiopia.

This case represents a significant legal challenge to social media platforms and their responsibility in curbing hate speech and online incitement to violence. If successful, it could set a precedent for holding tech giants accountable for their content moderation policies and their real-world consequences.