Meta’s Automated Tools Mistakenly Removed Compliant Israel-Hamas War Content

Meta’s Oversight Board has issued its ruling in its first-ever expedited review, a process that took just 12 days rather than the usual weeks, focusing on content related to the Israel-Hamas conflict. The Board overturned Meta’s original decision to remove two pieces of content, one from each side of the conflict, and endorsed the company’s subsequent move to restore the posts on Facebook and Instagram. As a result, no further action is expected from Meta. However, the review highlighted how Meta’s reliance on automated moderation tools can prevent people from sharing important information, particularly content documenting the Israel-Hamas conflict.

The Oversight Board chose to take up two specific appeals submitted by users in the affected region following the October 7th attacks, each representing one side of the conflict. The first concerned a video posted on Facebook showing a woman begging her captors not to kill her during the initial terrorist attacks on Israel. The second concerned a video posted on Instagram showing the aftermath of a strike on or near Al-Shifa Hospital in Gaza during Israel’s ground offensive, which depicted dead and injured people, including children.

The review found that both videos were mistakenly removed after Meta tuned its automated tools to police content more aggressively in the wake of the October 7th attacks. The Board noted that both videos were later restored with warning screens stating that such content is permitted for the purposes of news reporting and raising awareness. However, it expressed concern that Meta was too slow to adapt its policy given the rapidly changing circumstances, and that removing such content carries high costs for freedom of expression and access to information.

The Board also raised concerns about Meta’s decision to demote the reinstated, warning-screened content by excluding it from recommendations, as well as the company’s handling of hostage-taking content from the October 7th attacks. In particular, it flagged the unequal treatment of users and the lack of transparent criteria for inclusion on Meta’s cross-check lists.

In response, Meta said it welcomes the ruling and acknowledged the importance of balancing expression and safety. The company also noted that while the Board disagreed with its decision to bar the content from recommendation surfaces, there will be no further updates to the case, since the Board made no formal recommendations as part of its decision.