Why we must demand accountability from Meta over Palestine


The conversation around social media platforms and the companies that own them must shift toward accountability. For too long, tech companies have escaped scrutiny despite mounting evidence that their platforms are actively violating digital rights and contributing to harm. In the case of Palestine, this harm has been persistent, systemic, and increasingly dangerous, as it coincides with unprecedented real-world violence.

Over the past several years, 7amleh has repeatedly documented how Meta's platforms have failed to uphold even the most basic human rights standards. From enabling the spread of incitement and hate speech in Hebrew to disproportionately censoring Palestinian voices, the pattern we observed has been confirmed again and again. At its core, Meta's failure is structural: discriminatory policies and unfair enforcement practices lie at the centre of the problem.

Building on our already substantial body of evidence, 7amleh's recent findings point to something even more alarming. Our latest research shows that Meta is not only allowing harmful content to circulate but also financially rewarding it. Through its monetisation programmes, Meta has enabled pages that promote settler violence, extremist incitement, and illegal settlement activity in occupied Palestinian and Syrian territories to generate profit. This includes content that, by the company's own policies, should be ineligible for monetisation.

Monetisation, in this context, refers to page administrators' ability to earn revenue directly from content through advertisements, engagement-based payouts, and other financial tools offered by the platform. In theory, this system is governed by strict policies that prohibit profit from harmful or illegal content. In practice, however, those safeguards are failing at scale.

Pages promoting settlement expansion, an activity widely recognised as illegal under international law, have been found benefiting from these monetisation tools. Others engage in explicit incitement or extremist rhetoric, yet continue to generate revenue without consequence.

What we see here is a structural failure, one that once more raises serious questions about Meta's role in enabling and sustaining harm against Palestinians amidst ongoing genocide and ethnic cleansing.

At the same time, Palestinian and Arabic-language content faces systematic exclusion from these same monetisation opportunities. Independent media outlets, such as Arab48, have struggled to access or maintain monetisation, despite operating within journalistic standards. This disparity reflects a two-tiered dimension of digital discrimination, where Palestinian voices are both suppressed and economically disadvantaged on the very platforms that claim to offer equal opportunity.

Again and again, we have pointed out that Meta, parent company of the world's most widely used social media platforms, has failed to meet its responsibilities under the United Nations Guiding Principles on Business and Human Rights. These principles are clear: companies must avoid causing or contributing to human rights abuses and must act when harm occurs. Meta has done neither.

In fact, the company has repeatedly failed to implement meaningful safeguards, despite years of warnings from civil society, its own Oversight Board, and independent human rights assessments, including the 2022 report conducted by Business for Social Responsibility.

Commitments were made to improve content moderation, particularly in Hebrew, and to address systemic bias. Yet the evidence shows that harmful content continues to proliferate, often unchecked, and is now, in some cases, being monetised.

In the context of an ongoing genocide in Gaza, when online platforms amplify dehumanisation, enable incitement, and allow financial gain from harmful activity, these failures take on a far graver dimension: the platforms have become part of the infrastructure through which violence is normalised and sustained.

At a minimum, Meta must answer for this. The company must conduct an immediate and transparent audit of its monetisation systems, particularly in Israel and the occupied territories. It must identify and suspend accounts that violate its own policies, and ensure that no financial incentives are attached to harmful or illegal content. More broadly, it must address the structural discrimination embedded in its moderation and monetisation frameworks.

But accountability cannot stop at voluntary action. Regulators, policymakers, and international bodies must step in to ensure that platforms operating at this scale are held to enforceable standards. When corporate systems contribute to real-world harm, there must be consequences.

The question of whether Meta's platforms are causing harm has been settled. The question now is whether the company, and the governments that regulate it, are willing to act, because moments like this exacerbate the harm and demand urgent intervention and a move towards accountability.

Jalal Abukhater is a Palestinian writer and human rights defender based in Jerusalem. He is currently the Policy Manager at 7amleh - The Arab Center for the Advancement of Social Media.

Opinions expressed in this article remain those of the author and do not necessarily represent those of The New Arab, its editorial board or staff.
