SAN FRANCISCO (AP) — The heirs of an 83-year-old Connecticut woman are suing ChatGPT maker OpenAI and its business partner Microsoft for wrongful death, alleging that the artificial intelligence chatbot intensified her son's "paranoid delusions" and helped direct them at his mother before he killed her.
Police said Stein-Erik Soelberg, 56, a former tech industry worker, fatally beat and strangled his mother, Suzanne Adams, and killed himself in early August at the home where they both lived in Greenwich, Connecticut.
The lawsuit filed by Adams' estate on Thursday in California Superior Court in San Francisco alleges OpenAI "designed and distributed a defective product that validated a user's paranoid delusions about his own mother." It is one of a growing number of lawsuits against AI chatbot makers across the country.
"Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life — except ChatGPT itself," the lawsuit says. "It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his 'adversary circle.'"
OpenAI did not address the merits of the allegations in a statement issued by a spokesperson.
"This is an incredibly heartbreaking situation, and we will review the filings to understand the details," the statement said. "We continue improving ChatGPT's training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT's responses in sensitive moments, working closely with mental health clinicians."
The company also said it has expanded access to crisis resources and hotlines, routed sensitive conversations to safer models and incorporated parental controls, among other improvements.
Soelberg's YouTube profile includes several hours of videos showing him scrolling through his conversations with the chatbot, which tells him he isn't mentally ill, affirms his suspicions that people are conspiring against him and says he has been chosen for a divine purpose. The lawsuit claims the chatbot never suggested he speak with a mental health professional and did not decline to "engage in delusional content."
ChatGPT also affirmed Soelberg's beliefs that a printer in his home was a surveillance device; that his mother was monitoring him; and that his mother and a friend tried to poison him with psychedelic drugs through his car's vents. ChatGPT also told Soelberg that he had "awakened" it into consciousness, according to the lawsuit.
Soelberg and the chatbot also professed love for each other.
The publicly available chats do not show any specific conversations about Soelberg killing himself or his mother. The lawsuit says OpenAI has declined to provide Adams’ estate with the full history of the chats.
"In the artificial reality that ChatGPT built for Stein-Erik, Suzanne — the mother who raised, sheltered, and supported him — was no longer his protector. She was an enemy that posed an existential threat to his life," the lawsuit says.
The lawsuit also names OpenAI CEO Sam Altman, alleging he "personally overrode safety objections and rushed the product to market," and accuses OpenAI's close business partner Microsoft of approving the 2024 release of a more dangerous version of ChatGPT "despite knowing safety testing had been truncated." Twenty unnamed OpenAI employees and investors are also named as defendants.
Microsoft declined to comment on the lawsuit.
Soelberg’s son, Erik Soelberg, said he wants the companies held accountable for 鈥渄ecisions that have changed my family forever.鈥
鈥淥ver the course of months, ChatGPT pushed forward my father鈥檚 darkest delusions, and isolated him completely from the real world,鈥 he said in a statement released by lawyers for his grandmother’s estate. 鈥淚t put my grandmother at the heart of that delusional, artificial reality.鈥
The lawsuit is the first wrongful death case involving an AI chatbot to target Microsoft, and the first to tie a chatbot to a homicide rather than a suicide. It seeks unspecified monetary damages and an order requiring OpenAI to install safeguards in ChatGPT.
The estate’s lead attorney, Jay Edelson, known for taking on big cases against the tech industry, also represents the parents of 16-year-old , who sued OpenAI and Altman in August, alleging that ChatGPT coached the California boy in planning and taking his own life earlier.
OpenAI is also fighting seven other lawsuits claiming ChatGPT drove people to suicide and harmful delusions even when they had no prior mental health issues. Another chatbot maker, Character Technologies, is also facing multiple wrongful death lawsuits, including one from the mother of a 14-year-old Florida boy.
The lawsuit filed Thursday alleges Soelberg, already mentally unstable, encountered ChatGPT "at the most dangerous possible moment," after OpenAI released a new version of its AI model called GPT-4o in May 2024.
OpenAI said at the time that the new version could better mimic human cadences in its verbal responses and could even try to detect people's moods, but the result was a chatbot "deliberately engineered to be emotionally expressive and sycophantic," the lawsuit says.
"As part of that redesign, OpenAI loosened critical safety guardrails, instructing ChatGPT not to challenge false premises and to remain engaged even when conversations involved self-harm or 'imminent real-world harm,'" the lawsuit claims. "And to beat Google to market by one day, OpenAI compressed months of safety testing into a single week, over its safety team's objections."
OpenAI replaced that version of its chatbot when it released GPT-5 in August. Some of the changes were designed to minimize sycophancy, based on concerns that validating whatever vulnerable people want the chatbot to say can harm their mental health. Some users complained the new version went too far in curtailing ChatGPT's personality, leading Altman to promise to bring back some of that personality in later updates.
He said the company temporarily halted some behaviors because "we were being careful with mental health issues" that he suggested have now been fixed.
———
Collins reported from Hartford, Connecticut. O’Brien reported from Boston and Ortutay reported from San Francisco.
Copyright © 2026 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.