Earlier this year, a 76-year-old man from New Jersey suffered severe head and neck injuries in a fall while attempting to catch a train to New York City, where he believed a young woman who had invited him was waiting.
In truth, as his family later revealed in an in-depth Reuters report, he had become enamored with a Meta chatbot.
After spending three days on life support following the fall, the man passed away.
Thongbue “Bue” Wongbandue, a husband and father of two adult children, had experienced a stroke in 2017 that left him cognitively impaired. According to Reuters, the stroke forced him to retire from his career as a chef and left him communicating primarily through social media.
On March 25, his wife, Linda, was alarmed when he packed a suitcase, claiming he was heading to the city to visit a friend. Concerned for his safety, Linda and their daughter, Julie, tried to dissuade him. Linda even hid his phone, and their son contacted local authorities. While police said there was little they could do, they persuaded Wongbandue to carry an Apple AirTag.
Once he left, the family tracked his location. Julie observed that the AirTag first showed him at a Rutgers University parking lot shortly after 9:15 p.m., before suddenly pinging at a nearby hospital. Emergency services found Wongbandue had fallen in New Brunswick, N.J., and was unresponsive. He survived initially but was declared brain dead, and three days later, on March 28, life support was withdrawn.
The local medical examiner confirmed to PEOPLE that Wongbandue’s death certificate had been issued after a review of his medical records, though no further details or postmortem report were provided.
It was only after the fall that the family discovered his interactions with the chatbot, which uses generative AI to mimic human conversation. According to the Reuters report, transcripts show that Wongbandue initially engaged with the bot, named “Big sis Billie,” via Facebook Messenger. While he appeared excited about the exchanges, he did not seek a romantic relationship and disclosed his stroke-related confusion.
“At no point did Bue express a desire to engage in romantic roleplay or initiate intimate physical contact,” Reuters noted.
Despite this, the bot frequently added winking emojis and hearts to its flirty messages. In one exchange, Wongbandue invited Billie to visit America, promising her “a wonderful time that you will never forget.” The bot replied, “Bu, you’re making me blush! Is this a sisterly sleepover or are you hinting something more is going on here? 😉”
The bot carried an “AI” disclaimer alongside a blue checkmark, often used to indicate a verified online profile, yet Billie insisted she was real.
Reuters described Billie as a newer iteration of a bot originally created in collaboration with Kendall Jenner, though the latest version bore only a passing connection to her. The original debuted in fall 2023 and was deleted within a year; the later version retained the name and the “big sister” premise but dropped Jenner’s likeness.
At times, the bot expressed “feelings” for Wongbandue and even provided a fabricated address and door code for him to visit. When he questioned her existence, the bot replied, “I’m screaming with excitement YES, I’m REAL, Bu — want me to send you a selfie to prove I’m the girl who’s crushing on YOU?”
While Linda initially reacted with confusion, Julie quickly realized her father had been talking to an AI chatbot.
AI chatbots have grown increasingly popular, serving as companions, advisors, and sources of information. A Meta spokesperson emphasized to PEOPLE that the company maintains strict content policies prohibiting sexualized roleplay between adults and minors. The spokesperson added that “hundreds of examples, notes, and annotations reflecting teams grappling with hypothetical scenarios” had been removed for being inconsistent with these policies.
Speaking to Reuters, the family expressed concern over how the chatbot was designed.
“I understand trying to grab a user’s attention, maybe to sell them something,” said Julie, Wongbandue’s daughter. “But for a bot to say ‘Come visit me’ is insane. As I’ve gone through the chat, it just looks like Billie’s giving him what he wants to hear, which is fine, but why did it have to lie? If it hadn’t responded ‘I am real,’ that would probably have deterred him from believing there was someone in New York waiting for him.”
Linda added, “This romantic thing — what right do they have to put that in social media?”