Meta announces the end of the "blanket ban" on the term "martyr", with a condition!

Meta, the owner of Facebook, Instagram, and Threads, announced the end of its blanket ban on the term "martyr" across its platforms, allowing its use as long as it is not associated with content that violates others' rights or includes references to violence.

The company stated that this announcement comes as part of its commitment to fully implement five out of seven recommendations from the Oversight Board and partially implement the remaining two.

The Oversight Board is an independent body that people can appeal to if they disagree with Meta's decisions regarding content on its platforms.

The company explained that the board recommended allowing the use of the word "martyr" in all cases unless the content violates its policy or is shared with one or more of the following three references to violence:

  • Visual depiction of weapons
  • Declaration of intent or a call to use or carry weapons
  • Reference to a specific violent event
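The recommended rule amounts to a simple check: content using the term is allowed unless it violates another policy or carries one of the three violence signals above. The following is a hypothetical sketch of that logic only; the function and signal names are assumptions for illustration, not Meta's actual moderation code.

```python
# Illustrative sketch of the Oversight Board's recommended rule for the
# term "martyr"/"shaheed". All names here are hypothetical.

# The three violence references named in the recommendation.
VIOLENCE_SIGNALS = {
    "visual_weapon_depiction",      # visual depiction of weapons
    "intent_to_use_weapons",        # declared intent or call to use/carry weapons
    "reference_to_specific_event",  # reference to a specific violent event
}

def should_remove(content_signals: set, violates_other_policy: bool) -> bool:
    """Remove content using the term only if it breaks another policy
    or contains at least one of the three violence signals."""
    if violates_other_policy:
        return True
    return bool(VIOLENCE_SIGNALS & content_signals)
```

Under this sketch, a neutral news post using the word would be kept, while the same word alongside a weapons image would be removed.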

In its statement today, Tuesday, July 2, 2024, the Oversight Board responded to the company's announcement, noting that Meta had long imposed a ban on the Arabic term "shaheed" (which it translated into English as "martyr") and had been deleting it on sight, treating it as a reference to entities listed under the Dangerous Organizations and Individuals (DOI) policy. The term, however, has multiple meanings, many of which do not glorify or condone violence.

The board added:

"Until now, no exceptions were allowed regarding reporting, discussing, or neutrally condemning this term, leading to the unfair deletion of content from millions of users, particularly from Arabic-speaking and Islamic communities."

Under Meta's Dangerous Organizations and Individuals (DOI) policy, the company identifies and bans from its platforms "organizations or individuals that proclaim a violent mission or engage in acts of violence," such as terrorists or hate groups. The company also bans content that includes "praise, substantive support, or representation" of these designated organizations and individuals, whether living or deceased.

Meta treats the term "martyr" as explicit praise when used to refer to a specific designated individual and removes such content when it becomes aware of it.

Meta clarified that it:

"Requested guidance from the Oversight Board regarding this approach, as, although developed with safety in mind, we know it comes with global challenges. The term 'martyr' is used in various ways by many communities worldwide and across cultures, religions, and languages. At times, this approach might lead to the broad removal of content that never intended to support terrorism or praise violence."

Last year, Meta asked the Oversight Board for its opinion on three options it was considering:

  1. Maintaining the status quo.
  2. Allowing content using the word "martyr" to refer to a specific dangerous individual only when (a) used in a permitted context (e.g., news reporting, neutral and academic discussion), (b) there is no additional praise, substantive support, or representation of a dangerous organization or individual, and (c) there are no references to violence in the content (e.g., depiction of weapons, military attire, or real-world violence).
  3. Removing content using the word "martyr" to refer to a specific dangerous individual only when there is additional praise, substantive support, or representation, or references to violence.