Major moderation failures by X and Meta highlight election year risks

Major content moderation failures by both X and Meta underline the risks of deliberate disinformation and inadvertent misinformation affecting next year’s presidential election …

Major moderation failures by X

Wired reports on fake posts on X following the horrific terrorist attacks in Israel.

Rather than being shown verified and fact-checked information, X users were presented with video game footage passed off as footage of a Hamas attack, and images of firework celebrations in Algeria presented as Israeli strikes on Hamas. There were faked pictures of soccer superstar Ronaldo holding the Palestinian flag, while a three-year-old video from the Syrian civil war was repurposed to look like it was taken this weekend.

X owner Musk wasn’t helping, directing users to two sources of “news” which had previously been responsible for hoax posts about an explosion near the White House.

Additionally, a recent change to X policy potentially makes it harder for hoaxes to be called out. The Verge reports that verified accounts (read: accounts that have chosen to pay for a subscription) can now block replies from standard X users. This makes it far easier to post disinformation that goes unchallenged.

Facebook not doing much better

It’s not as if Facebook is doing much better, either. Engadget reports on an edited video of President Biden which the platform inexplicably chose to leave online.

The video of Biden is from last fall, when he joined his granddaughter, who was voting in person for the first time. After voting, Biden placed an “I voted” sticker on her shirt. A Facebook user later shared an edited version of the encounter, making it appear as if he repeatedly touched her chest. The video caption called him a “sick pedophile,” and said those who voted for him were “mentally unwell” […]

According to the Oversight Board, a Facebook user reported the video, but Meta ultimately left the clip up saying it didn’t break its rules. As the board notes, the company’s manipulated media policy prohibits misleading video created with artificial intelligence, but doesn’t apply to deceptive edits made with more conventional techniques. 

Highlights risks to election integrity

The ease with which fake posts can spread on social media platforms is highlighting the very significant risk of next year’s presidential election being influenced by both individuals and nation states running bot farms.

The CIA, FBI, and NSA all agree that Russia interfered in the 2016 presidential election and is almost certain to do so again next year. While progress has been made on countermeasures, most of the measures introduced by what was then Twitter were undone by Musk following his purchase.

Photo: Dole777/Unsplash

