Meta Platforms announced Tuesday that it would lift its blanket ban on the term “shaheed,” which translates to “martyr” in English, following a year-long review by its oversight board, which found the company’s approach had been “overbroad”.
The social media giant, which operates Facebook and Instagram, has drawn repeated headlines and criticism over its content policies in the Middle East. A 2021 study commissioned by Meta itself found that its approach had an “adverse human rights impact” on Palestinian and other Arabic-speaking users of its services. Criticism escalated when hostilities between Israel and Hamas ratcheted up last October.
The oversight board, which is funded by Meta but operates independently, began its review last year. It found that “shaheed” was responsible for more content removals on Meta’s platforms than any other single word or phrase. In March, the board concluded that Meta’s rules failed to account for the word’s varied meanings, leading to the unnecessary removal of many pieces of content that did not advocate violence.
On Tuesday, Meta acknowledged the review’s findings. According to the company, its own internal tests indicated that removing content only when “shaheed” was “paired with otherwise violating content captures the most potentially harmful content without disproportionately impacting the voice”.
The oversight board welcomed the change, saying Meta’s previous policy had resulted in the censorship of millions of people across its platforms. The board said the change would help create a more inclusive space for Arabic-speaking users and for anyone discussing sensitive topics.
The decision to lift the blanket ban on “shaheed” marks a significant shift in Meta’s content moderation policy. It suggests the company increasingly recognises the need for culturally nuanced, context-sensitive approaches to content regulation. The new policy not only protects users’ freedom of expression but also mitigates the negative human rights impacts identified in the 2021 study. As Meta continues to refine its policies, the move sets a precedent for more balanced and inclusive content moderation across its platforms.