A heavily edited Facebook video of President Biden will remain on the platform after an independent body that oversees Meta's content moderation determined the post did not violate the company's policies. The panel, however, criticized the company's manipulated media policy as "incoherent and confusing."
The video, posted in May 2023, was edited to make it appear as if Mr. Biden was repeatedly and inappropriately touching his adult granddaughter's chest. In the original video, taken in 2022, the president places an "I Voted" sticker on his granddaughter after voting in the midterm elections. But the video reviewed by Meta's oversight board was looped and edited into a seven-second clip that critics say left a misleading impression.
The Meta Oversight Board, an independent group that oversees Meta's content policies and can make binding decisions about whether content is removed or left up, said the video did not violate Meta's policies because it had not been altered by artificial intelligence and did not show Mr. Biden "saying words he didn't say" or "doing something he didn't do."
A human content reviewer at Meta left the video online after it was reported to the company as hate speech. The decision was then appealed to the Oversight Board, which took up the case for review.
Although the Oversight Board ruled that the video could remain on the site, it argued in a series of non-binding recommendations that Meta's current policy regarding manipulated content should be "reconsidered." The board called the company's current policy on the issue "incoherent, lacking persuasive justification and inappropriately focused on how content has been created, rather than on" the specific harms it aims to prevent, such as the disruption of electoral processes.
The board also recommended that Meta begin labeling manipulated media that does not violate its policies, and that it treat manipulated audio files and edited videos showing people "doing things they didn't do" as violations of the manipulated media policy.
"Meta must tailor its manipulated media policy to the real harms it seeks to prevent. The company must be clear about the nature of those harms, for example incitement to violence or deception about information needed to vote, and enforce the policy against them," Michael McConnell, co-chair of the Oversight Board, said in a statement to CBS News.
"In most cases, Meta could prevent the harm of people being misled by edited content through means less restrictive than takedowns. That's why we urge the company to attach labels that would provide context about the authenticity of posts. This would allow for greater protection of freedom of expression," McConnell added.
“We are reviewing the Oversight Board’s guidance and will respond publicly to its recommendations within 60 days, consistent with the Bylaws,” a Meta spokesperson wrote in a statement to CBS News.
The board's decision was issued just days after Meta CEO Mark Zuckerberg and other tech company executives testified at a Senate Judiciary Committee hearing on the impact of social media on children.
And it comes as AI and other editing tools allow users to create realistic-looking video and audio clips. Before last month's New Hampshire primary, a robocall imitating President Biden's voice encouraged Democrats not to vote, raising concerns about misinformation and voter suppression in the run-up to the November general election.
McConnell also said that the Oversight Board is monitoring how Meta handles election integrity content ahead of this year's elections, after the board recommended the company develop a framework to evaluate false and misleading claims about how elections are run in the United States and around the world.
"Platforms should keep their foot on the gas beyond Election Day and during post-election periods where ballots are still being counted, votes are being certified, and power is being transitioned," McConnell told CBS News. "Challenging the integrity of an election is generally considered protected speech, but in some circumstances, widespread claims attempting to undermine elections, like what we have seen [in 2023], can lead to violence."
Note: The content and images used in this article are rewritten and sourced from www.cbsnews.com