Amnesty: X’s policies stirred anti-Muslim, anti-migrant narratives after Southport attack

Following the stabbing attack in Southport, UK, in which three young girls were killed, a new report by Amnesty International has found that the social media platform X played a “central role” in spreading the misinformation and hate that fueled racist violence, The Independent reported. The report, based on an analysis of X’s open-source code, claims that the platform’s content-ranking algorithm systematically prioritizes content that sparks outrage and heated exchanges, without adequate safeguards to prevent the spread of harmful narratives.

“Our analysis shows that X’s algorithmic design and policy choices contributed to heightened risks amid a wave of anti-Muslim and anti-migrant violence observed in several locations across the UK last year, and which continues to present a serious human rights risk today,” said Pat de Brun, head of Big Tech Accountability at Amnesty International.

According to the analysis, false claims that the attacker was a Muslim asylum seeker spread rapidly on the platform. These narratives, amplified by X’s algorithm and by posts from “Premium” (paid) accounts, contributed to riots across the UK in which mosques and accommodation for asylum seekers were targeted. The report also notes that since Elon Musk’s takeover, X has laid off content moderation staff and reinstated accounts previously banned for hate or harassment.

The report’s findings have prompted calls from a UK parliamentary committee for Elon Musk to testify about X’s role in the riots. The committee concluded that social media business models “endangered the public” by incentivizing the spread of misinformation, and that current online safety laws have “major holes.”