The social media platform X, formerly known as Twitter, says it is trying to take action on a flood of posts sharing graphic media, violent speech and hateful conduct about the war between Israel and Hamas.
X says it’s treating the crisis with its highest level of response. But outside watchdog groups say misinformation about the war abounds on the platform that billionaire Elon Musk bought last year.
Those efforts include continuing a policy frequently championed by Musk of letting users help rate what might be misinformation; flagged posts receive a note of context but do not disappear from the platform.
The struggle to identify reliable sources for news about the war was exacerbated over the weekend by Musk, who on Sunday posted the names of two accounts he said were “good” for “following the war in real-time.” Analyst Emerson Brooking of the Atlantic Council called one of those accounts “absolutely poisonous.” Journalists and X users also pointed out that both accounts had previously shared a fake AI-generated image of an explosion at the Pentagon, and that one of them had posted numerous antisemitic comments in recent months. Musk later deleted his post.
Brooking said Tuesday that it is “significantly harder to determine ground truth in this conflict as compared to Russia’s invasion of Ukraine” last year and “Elon Musk bears personal responsibility for this.”
Brooking posted on X that Musk had enabled fake war reporting by abandoning the blue check verification system for trusted accounts and allowing anyone to buy a blue check.
“War is always a cauldron of tragedy and disinformation; Musk has made it worse,” he added. Further, Brooking said via email: “Musk has repeatedly and purposefully denigrated the idea of an objective media, and he made platform design decisions that undermine such reporting. We now see the result.”
Musk’s drastic changes over the past year included gutting the company’s staff, including many of the people responsible for moderating toxic content and harmful misinformation.
One former member of Twitter’s public policy team said the company is having a harder time taking action on posts that violate its policies because there aren’t enough people to do that work.
“The layoffs are undermining the capacity of Twitter’s trust and safety team, and associated teams like public policy, to provide needed support during a critical time of crisis,” said Theodora Skeadas, one of thousands of employees who lost their jobs in the months after Musk bought the company.
The company says it is also removing newly created Hamas-affiliated accounts and working with other tech companies to try to prevent “terrorist content” from being distributed online. The company said it is “also continuing to proactively monitor for antisemitic speech as part of all our efforts. Plus we’ve taken action to remove several hundred accounts attempting to manipulate trending topics.”
Linda Yaccarino, whom Elon Musk named in May as the top executive at X, withdrew from an upcoming three-day tech conference where she was scheduled to speak, citing the need to focus on how the platform was handling the war.
X says it changed one policy over the weekend to make it easier for people to choose whether to see sensitive media, without the company taking down those posts. “X believes that, while difficult, it’s in the public’s interest to understand what’s happening in real time,” its statement says.
___
Associated Press writer Ali Swenson contributed to this report.