The Oversight Board is calling on Meta to update its manipulated media rules, labeling the current policy “incoherent.” The Board’s recommendation follows a heavily debated decision on a deceptively edited video of President Joe Biden, in which it ultimately agreed with Meta’s choice to leave the video online.
The video is based on footage from October 2022, when the president accompanied his granddaughter as she voted for the first time; after voting, he placed an “I voted” sticker on her shirt. A Facebook user later shared a version of the clip looped to make it appear he was repeatedly touching her chest, with a caption calling him a “sick pedophile” and claiming that his voters were “mentally unwell.”
According to the Oversight Board, the video didn’t violate Meta’s narrow manipulated media rules, as it hadn’t been edited with AI tools and the alterations were “obvious and therefore unlikely to mislead” most users. However, the Board raised concerns about the current policy itself, deeming it incoherent, unjustified, and disproportionately focused on how content is created rather than on the specific harms it aims to prevent, such as disruption to electoral processes. It urged Meta to “reconsider this policy quickly, given the number of elections in 2024.”
Currently, the company’s rules apply only to videos edited with AI tools and exclude other forms of editing that could be equally misleading. The Oversight Board recommended that Meta develop new rules covering both audio and video content. The policy should apply not only to misleading speech but also to “content showing people doing things they did not do,” and it should apply “regardless of the method of creation.” The Board further suggested that Meta stop removing posts with altered content unless they violate other rules, and instead “attach a label indicating the content is significantly altered and could mislead.”
The recommendations underscore growing concern among researchers and civil society groups that AI tools could fuel a new wave of election disinformation. A Meta spokesperson said the company is “reviewing the Oversight Board’s guidance and will respond publicly” within 60 days. Although that response would arrive well before the 2024 presidential election, the timing, and even the likelihood, of any policy change remains unclear. The Oversight Board noted in its decision that Meta representatives indicated the company “plans to update the Manipulated Media policy to respond to the evolution of new and increasingly realistic AI.”