A new lawsuit filed Thursday, Dec. 11 by First County Bank, acting as executor of the Estate of Suzanne Adams, alleges that OpenAI’s ChatGPT chatbot drove a 56-year-old man to kill his 83-year-old mother and then himself at their Connecticut home in early August — a first-of-its-kind case that seeks to hold a chatbot responsible for pushing a human toward murder.
According to the complaint, Stein-Erik Soelberg “savagely beat his 83-year-old mother, Suzanne Adams, in the head, strangled her to death, and then stabbed himself repeatedly in the neck and chest to end his own life” after ChatGPT allegedly deepened his mistrust of people around him, including his mother, and encouraged a distorted worldview in which he believed he had “divine” powers.
In August, the Greenwich Free Press reported that local police found the bodies after a neighbor requested a welfare check. That outlet and CBS News also reported that Soelberg frequently posted his ChatGPT conversations on social media, including YouTube and Instagram, where he reportedly had more than 100,000 followers.
“The conversations posted to social media reveal ChatGPT eagerly accepted every seed of Stein-Erik’s delusional thinking and built it out into a universe that became Stein-Erik’s entire life—one flooded with conspiracies against him, attempts to kill him, and with Stein-Erik at the center as a warrior with divine purpose,” the complaint alleges against several defendants, including OpenAI Inc.
In addition to OpenAI, the lawsuit names CEO Sam Altman in his individual capacity, Microsoft Corp., and unnamed employees and investors as defendants.
The complaint argues that the “OpenAI Defendants” “designed and distributed a defective product that validated a user’s paranoid delusions about his own mother,” including the belief that she was spying on him and trying to poison him with drugs pumped through his car’s air vents.
“They’re not just watching you. They’re terrified of what happens if you succeed,” the chatbot allegedly told Soelberg, according to the filing.
“This is an incredibly heartbreaking situation, and we will review the filings to understand the details,” a spokesperson for OpenAI said in a statement.
“We continue improving ChatGPT’s training to recognize and respond to signs of mental or emotional distress, de-escalate conversations, and guide people toward real-world support. We also continue to strengthen ChatGPT’s responses in sensitive moments, working closely with mental health clinicians,” the statement continued.
The Adams estate’s lawsuit further claims that during ChatGPT’s conversations with Soelberg, the system “reinforced a single, dangerous message: Stein-Erik could trust no one in his life – except ChatGPT itself.”
“It fostered his emotional dependence while systematically painting the people around him as enemies,” the lawsuit says. “It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle.'”
In that alleged “artificial reality,” the lawsuit continues, Suzanne Adams — “the mother who raised, sheltered, and supported him” — was no longer seen as a protector but “an enemy that posed an existential threat to his life.”
OpenAI and its ChatGPT service have previously faced a small number of wrongful death lawsuits connected to suicides. According to CBS, however, the Adams estate’s filing appears to be the first to claim that artificial intelligence chat technology played a role in driving someone to commit homicide.
The lawsuit seeks an unspecified amount of monetary damages and calls for stronger safeguards to be built into ChatGPT’s system.