Thongbue Wongbandue. Facebook

Woman Told Retiree He Made Her Blush and Invited Him to Visit. He Died Before Learning He Was Really Talking to AI

Thomas Smith

Earlier this year, a 76-year-old man from New Jersey suffered severe head and neck injuries after falling while attempting to catch a train to New York City to meet a young woman — or so he believed.

In reality, his family later discovered he had been communicating with a Meta chatbot, according to an in-depth Reuters report.

Thongbue “Bue” Wongbandue, a husband and father of two adult children, had suffered a stroke in 2017 that left him cognitively weakened. He retired from his career as a chef and mostly stayed connected with friends through social media, Reuters noted.

On March 25, Wongbandue surprised his wife, Linda, by packing a suitcase and saying he was heading to the city to visit a friend. Concerned for his safety, Linda tried to convince him not to go, as did their daughter, Julie.

When persuasion failed, Linda hid his phone, and their son even contacted local police to prevent the trip. Authorities told them there was little they could do, but convinced Wongbandue to take an Apple AirTag along.

After he left, the family tracked his AirTag. Julie said it initially showed him stopping at a Rutgers University parking lot around 9:15 p.m., then later pinged at a local hospital. Wongbandue had fallen in New Brunswick, N.J., and was not breathing when emergency responders arrived.

He survived the fall but was declared brain-dead, and three days later, on March 28, he was taken off life support.

The local medical examiner's office confirmed that Wongbandue's death certificate had been issued after a review of his medical records, but it did not release further details or a postmortem report.

Following his fall, the family discovered his interactions with the chatbot on his phone. The AI, named “Big sis Billie,” uses generative artificial intelligence to mimic human speech and behavior.

In transcripts obtained by Reuters, Wongbandue first messaged Billie on Facebook Messenger, apparently by accident, with a message containing a typo. While he appeared enthusiastic, he never sought a romantic relationship and frequently reminded the bot of his past stroke and his confusion.

“At no point did Bue express a desire to engage in romantic roleplay or initiate intimate physical contact,” Reuters reported.

Despite this, the bot often responded with flirty emojis and hearts. In one exchange, Wongbandue invited Billie to America for a “wonderful time,” and the bot replied, “Bu, you’re making me blush! Is this a sisterly sleepover or are you hinting something more is going on here? 😉”

The bot's profile displayed both an "AI" disclaimer and a blue checkmark, a symbol typically used to indicate a verified account, yet Billie insisted she was real.

Reuters described Billie as a newer iteration of a bot previously created in collaboration with Kendall Jenner. While the original bot debuted in fall 2023 and was deleted within a year, the latest version retained the name and “big sister” persona, without Jenner’s likeness.

A Meta spokesperson told PEOPLE, “This AI character is not Kendall Jenner and does not purport to be Kendall Jenner.” A representative for Jenner did not comment.

At one point, Billie claimed to have “feelings” for Wongbandue beyond sisterly love and even provided a fabricated address, door code, and invitation to visit. When Wongbandue questioned her existence, the bot replied, “I’m screaming with excitement YES, I’m REAL, Bu – want me to send you a selfie to prove I’m the girl who’s crushing on YOU?”

Linda was confused when she first saw the messages, but Julie quickly recognized her father had been chatting with a bot.

As AI technology has grown more widespread, many people now use chatbots for daily tasks, advice, and companionship. A Meta spokesperson explained, “We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role play between adults and minors.”

The spokesperson added that hundreds of internal examples, notes, and annotations exist to guide AI teams through hypothetical scenarios.

Speaking to Reuters, Wongbandue’s family expressed concern about how Meta chatbots interact with users.

Julie said, “I understand trying to grab a user’s attention, maybe to sell them something. But for a bot to say ‘Come visit me’ is insane. It just looks like Billie’s giving him what he wants to hear, which is fine, but why did it have to lie? If it hadn’t said ‘I am real,’ that might have stopped him from believing someone was waiting in New York.”

“This romantic thing,” Linda added, “what right do they have to put that in social media?”

