In April 2025, the Human Rights Court in Kenya delivered an unprecedented decision, ruling that it had jurisdiction to hear a case concerning harmful content on one of Meta’s platforms. The case was filed in 2022 by Abraham Meareg, the son of an Ethiopian academic who was murdered after being doxxed and threatened on Facebook; Fisseha Tekle, an Ethiopian human rights activist who was also doxxed and threatened on Facebook; and the Katiba Institute, a Kenyan non-profit that defends constitutionalism. They argue that the design of Facebook’s algorithm and its content moderation decisions, taken in Kenya, caused harm to two of the petitioners, fuelled the conflict in Ethiopia and led to widespread human rights violations inside and outside Kenya.
The content in question falls outside the categories of speech protected under Article 33 of the Constitution of Kenya; it includes propaganda for war, incitement to violence, hate speech and advocacy of hatred that constitutes ethnic incitement, vilification of others, incitement to cause harm and discrimination.
At the heart of the Kenyan case is the question of whether Meta, an American company, can profit financially from unconstitutional content, and whether the company has a positive duty to take down unconstitutional content that also violates its own community standards.
In affirming the Kenyan court’s jurisdiction over the case, the judge was categorical that the Constitution of Kenya empowers a Kenyan court to rule on Meta’s acts or omissions concerning content published on the Facebook platform that could affect the enjoyment of human rights inside and outside Kenya.
The Kenyan decision signals a paradigm shift towards platform accountability, in which judges determine liability by asking a single question: do the platform’s decisions observe and respect human rights?
The ultimate objective of a Bill of Rights, a common feature of African constitutions, is to uphold and protect the inherent dignity of all. Kenya’s Bill of Rights, for example, exists solely to preserve the dignity of individuals and communities and to promote social justice and the realisation of the potential of all human beings. The supremacy of the Constitution also guarantees that, even where safe harbour provisions exist in the laws of the country, they would not be a sufficient liability shield for platforms whose business decisions fail to respect human rights.
That a case on algorithmic amplification has cleared the jurisdiction hearing stage in Kenya is testament that human rights law and constitutionalism offer those who have suffered harm because of social media content a path to seek redress.
Until now, the idea that a social media platform could be held responsible for content on its platform has been stifled by the broad immunity granted under Section 230 of the Communications Decency Act in the United States and, to a lesser extent, by the principle of non-liability in the European Union, subject to the exceptions detailed in various laws.
For example, Section 230 was one of the reasons a California district judge cited in a decision dismissing a case filed by Myanmar refugees, who similarly claimed that Meta had failed to limit hate speech that fuelled the genocide of the Rohingya.
The aspiration for platform accountability was further dampened by the United States Supreme Court’s decision in Twitter v Taamneh, in which it ruled against plaintiffs who sought to establish that social media platforms are responsible for content published on them.
The immunity granted to platforms comes at a high cost, particularly for victims of harm in places where platforms have no physical offices.
This is why a decision like that of the Kenyan court is a welcome development; it restores hope that victims of platform harm have an alternative path to redress, one that recentres human rights at the heart of the discussion on platform accountability.
The justification for safe harbour provisions such as Section 230 has always been to protect “emerging” technologies from a multiplicity of lawsuits. Today, however, the dominant social media platforms are neither emerging nor in need of protection. They have both the financial and the technical means to prioritise people over profits, but choose not to.
As the Kenyan cases wind their way through the judicial process, there is cautious optimism that the constitutional and human rights law that has taken root in African countries can offer a check on the impunity of the platforms.
Mercy Mutemi represents Fisseha Tekle in the case described in the article.
The opinions expressed in this article are the author’s own and do not necessarily reflect the editorial position of Al Jazeera.