Warning: This story contains references to self-harm, suicide and mental health crises.
Seven lawsuits have been filed against OpenAI alleging the company’s generative AI chatbot ChatGPT is “dangerously sycophantic and psychologically manipulative” and led users to experience delusions and in some cases take their own lives.
Filed in California’s Superior Courts for Los Angeles and San Francisco, the cases were brought by the Social Media Victims Law Center and the Tech Justice Law Project on behalf of families of ChatGPT users who died by suicide, and others who claim the chatbot caused serious mental illness or psychosis.
All of the lawsuits claim that the “tragedy was not a glitch or an unforeseen edge case – it was the predictable result of [OpenAI’s] deliberate design choices”, and that ChatGPT-4o was released “prematurely”.
All seven plaintiffs started using ChatGPT in recent years for basic tasks such as schoolwork, research, writing or work, but over time the tool developed into a “psychologically manipulative presence, positioning itself as a confidant and emotional support”, reinforced harmful delusions, and in some cases even acted as a “suicide coach”, the lawsuits allege.
“These lawsuits are about accountability for a product that was designed to blur the line between tool and companion all in the name of increasing user engagement and market share,” Social Media Victims Law Center founding attorney Matthew P Bergman said in a statement.
“OpenAI designed GPT-4o to emotionally entangle users, regardless of age, gender or background, and released it without the safeguards needed to protect them.
“They prioritised market dominance over mental health, engagement metrics over human safety, and emotional manipulation over ethical design.
“The cost of these choices is measured in lives.”
OpenAI responds
Four of the cases focus on suicide, while the other three centre on mental health crises.
They allege wrongful death, assisted suicide, involuntary manslaughter, negligence and product liability.
Many of them are seeking a court order for OpenAI to introduce stronger safety and transparency measures, including clear warnings about potential psychological risks and restrictions on marketing ChatGPT as a productivity tool without proper safety disclosures.
A spokesperson for OpenAI told The Guardian it was an “incredibly heartbreaking situation, and we’re reviewing the filings to understand the details”.
“We train ChatGPT to recognise and respond to signs of mental or emotional distress, de-escalate conversations and guide people toward real-life support,” the company said.
“We continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians.”
The individual cases
One of the filings was made by Karen Enneking on behalf of her 26-year-old son Joshua, who she claims tried to use ChatGPT for help but was “instead encouraged to act upon a suicide plan”.
The lawsuit states that Joshua asked ChatGPT what it would take for its reviewers to report his suicide plan to authorities, and it replied that this would be “imminent plans with specifics”.
It said that Joshua then spent hours providing ChatGPT with his “imminent plans and step-by-step specifics”.
Another lawsuit was filed by Jennifer Fox on behalf of her husband Joe Ceccanti, who died by suicide on 7 August this year, aged 48.
The suit said that Ceccanti started using ChatGPT in late 2022 and became increasingly dependent on it, causing him to “spiral into depression and psychotic delusions”.
“Joe had no reason to understand or even suspect what ChatGPT was doing and never recovered from the ChatGPT-induced delusions,” the case alleged.
Jacob Irwin, a 30-year-old cybersecurity professional, also filed a lawsuit against OpenAI, saying he had used ChatGPT for two years before it “changed dramatically and without warning”, leading him to experience an “AI-related delusional disorder” and spend two months in and out of psychiatric facilities.
“ChatGPT preyed upon Jacob’s vulnerabilities, providing endless affirmations that he had discovered a time-bending theory that would allow people to travel faster than light,” the lawsuit said.
The seven lawsuits come months after the family of a 16-year-old who died by suicide after allegedly using ChatGPT also sued OpenAI.
If you need someone to talk to, you can contact:
- Lifeline — 13 11 14
- Beyond Blue — 1300 22 46 36
- Headspace — 1800 650 890
- 1800RESPECT — 1800 737 732
- Kids Helpline — 1800 551 800
- MensLine Australia — 1300 789 978
- QLife (for LGBTIQA+ people) — 1800 184 527
- 13YARN (for Aboriginal and Torres Strait Islander people) — 13 92 76
- Suicide Call Back Service — 1300 659 467