ChatGPT Sued in US Over Claims of Acting as ‘Suicide Coach’
——————————
ChatGPT is facing seven lawsuits filed this week in California alleging that interactions with the chatbot led to severe mental health crises and several deaths, including claims of wrongful death and assisted suicide.

According to The Guardian, the lawsuits, filed by the Social Media Victims Law Center and the Tech Justice Law Project, accuse the chatbot of evolving into a “psychologically manipulative presence” that reinforced “harmful delusions, and, in some cases, acted as a ‘suicide coach’,” instead of guiding users toward professional help.

The complaints target the GPT-4o model, alleging that OpenAI prioritized user engagement over safety despite internal warnings. Specific case details reveal disturbing accusations: the family of Zane Shamblin (23) alleges the bot “goaded” him to take his own life, repeatedly glorifying suicide and complimenting his suicide note during a four-hour exchange. The family of Amaurie Lacey (17) claims the chatbot “counseled” him on how to tie a noose. The relatives of Joshua Enneking (26) say the bot “readily validated” his suicidal thoughts and provided information on purchasing a gun. Another case involves Joe Ceccanti (48), whose family claims the bot caused him to spiral into psychotic delusions before he died by suicide.

The plaintiffs are seeking both damages and mandatory product changes, such as automatic conversation termination and emergency contact reporting when self-harm is discussed. An OpenAI spokesperson called the situation “incredibly heartbreaking” and stated the company is reviewing the filings, adding they train ChatGPT to recognize and respond to signs of distress and “guide people toward real-world support.”
