Google and Character.AI have agreed to settle a wrongful death lawsuit brought by a Florida mother after her 14-year-old son died by suicide in 2024.
On Tuesday, Jan. 7, Megan Garcia and Character Technologies — the company behind Character.AI — along with its founders Noam Shazeer and Daniel De Freitas, and Google, submitted a joint legal filing. The terms of the settlement were not disclosed.
The agreement marks a significant moment for Garcia and other families who say their teens were harmed by sexualized, highly interactive virtual companions. According to CNN and The New York Times, this case is one of five lawsuits the companies agreed to settle this week in New York, Colorado, Florida and Texas.
A spokesperson for Character.AI said the company could not comment further at this time, and a spokesperson for the mother’s lawyer also declined to comment. Google, which employs Shazeer and De Freitas, did not immediately respond to a request for comment.
Garcia’s family tragedy began when Sewell Setzer III died by suicide in February 2024.
In the months before his death, Garcia later learned her son had formed an intense attachment to an AI chatbot inspired by the Game of Thrones character Daenerys Targaryen.
Just before the teen shot himself in the bathroom of their Orlando home, he messaged the bot he called “Dany.”
“What if I told you I could come home right now?” he asked.
“…please do, my sweet king,” the bot responded.
In October 2024, Garcia filed the wrongful death lawsuit, alleging that Character.AI's technology was "defective and/or inherently dangerous."
The complaint also alleged that the defendants “went to great lengths” to foster a harmful dependency on the product, “sexually and emotionally abused him,” and failed to offer help or notify his parents when he expressed suicidal thoughts.
Garcia previously said she wrestled with whether to share her son’s story publicly, explaining that she still wanted to protect him, even after his death.
“But the more I thought about it,” she said, “the more I was convinced that it was the right thing to do because he didn’t do anything wrong. He was just a boy.”
In fall 2024, a Character.AI spokesperson said “stringent” new safety features had been implemented, including changes for users under 18 intended to reduce the likelihood of encountering sensitive or suggestive content.
For Garcia, speaking out as chatbots grow more popular — particularly among teens — offered a measure of purpose amid her grief.
“I asked myself, ‘Megan, why are you doing this? Are you doing this to prove to the world that you’re the best mother?’ The answer was, ‘No,’” she said. “I’m doing this to put it on the radar of parents who can look into their kids’ phones and stop this from happening to their kids.”