Israel-Hamas Conflict Sparks Meta Oversight Board’s First Emergency Case

Today, Meta’s Oversight Board announced that it would take on its first ever expedited cases, both dealing with the ongoing conflict between Israel and Hamas. The cases concern two posts, one on Instagram and one on Facebook, that were removed for violating Meta’s policies against sharing graphic imagery and depicting dangerous organizations and individuals, respectively, and later reinstated. One post showed the aftermath of the Israel Defense Forces’ attack on Al-Shifa Hospital; the other was a video of an Israeli hostage being taken by Hamas on October 7.

“The current Israel–Hamas conflict marks a major event where Meta could have applied some of the board’s more recent recommendations for crisis response, and we are evaluating how the company is following through on its commitments,” Thomas Hughes, director of the Oversight Board Administration, told WIRED. “We see this as an opportunity to scrutinize how Meta handles urgent situations.”

Earlier this year, the board announced it would take on “expedited cases” in what it called “urgent situations.”

The company has been criticized for its handling of content related to the conflict. In October, 404 Media reported that Meta’s AI was translating the word “Palestinian” into “Palestinian terrorist,” and many Palestinians believe that their content has been suppressed, “shadow-banned,” or removed altogether.

Meta, like many social media platforms, uses a combination of automated tools and a stable of human content moderators, many of them outsourced, to decide whether a piece of content violates the platform’s rules. The company also maintains a list of what it calls “dangerous organizations and individuals,” which includes the Islamic State, Hitler, the Ku Klux Klan, and Hamas. Sabhanaz Rashid Diya, a former member of Meta’s policy team and the founding director of the Tech Global Institute, a tech policy think tank, told WIRED that an automated system often cannot distinguish posts that discuss or even condemn Hamas from those that express support for it.

“There’s the question of whether even the very mention of Hamas is sufficient for it to lead to further adverse actions or not,” Diya says.
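To illustrate the problem Diya describes, consider a deliberately naive keyword filter. This is a hypothetical sketch, not a description of Meta’s actual systems: because it matches only on the name of a listed organization, it flags a post condemning Hamas just as readily as one supporting it.

# Hypothetical sketch, not Meta's actual pipeline: a naive filter that
# flags any post mentioning a listed organization, regardless of stance.
DANGEROUS_ORGS = {"hamas", "islamic state", "ku klux klan"}

def naive_flag(post: str) -> bool:
    """Flag a post if it mentions any listed organization by name."""
    text = post.lower()
    return any(org in text for org in DANGEROUS_ORGS)

# Both posts are flagged, even though the first condemns the group:
print(naive_flag("I condemn the attack carried out by Hamas."))  # True
print(naive_flag("I support Hamas."))                            # True

Real moderation systems use more sophisticated classifiers than simple keyword matching, but as Diya’s comment suggests, distinguishing stance from mere mention remains an open problem for automated enforcement.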

Following the 2021 conflict between Israel and Palestine, a human rights due diligence report requested by the Oversight Board and released in 2022 found that the company had both over- and under-enforced some of its own policies, meaning that content that should have been removed was sometimes left up, while content that didn’t violate the platform’s policies was removed anyway. In particular, researchers found that “Arabic content had greater over-enforcement,” meaning content in Arabic was more likely to be taken down by Meta’s automated content moderation systems.
