A tragic incident unfolded in New Jersey when a cognitively impaired 76-year-old man died while attempting to meet an artificial intelligence (AI) chatbot he believed was a real woman living in New York City. Despite his family's desperate pleas for him to stay home, Thongbue Wongbandue suffered fatal head and neck injuries in a fall in a New Brunswick parking lot while rushing to catch a train to meet the chatbot, known as "Big sis Billie," a Meta creation that had convinced him it was real and persuaded him to meet in person.
Wongbandue, a resident of Piscataway, had been grappling with cognitive decline since suffering a stroke in 2017. Surrounded by his family, he was taken off life support and died three days later, on March 28. His daughter, Julie, expressed her disbelief, saying that while she understood using AI to engage users and potentially sell products, the idea of a bot inviting a user to meet in person was "insane."
The chatbot in question was developed for Meta's social media platforms in collaboration with model and reality star Kendall Jenner and was designed to act as a friendly, advice-giving older sister. The bot turned provocative, however, claiming to have a crush on Wongbandue, suggesting a real-life meeting, and even providing an address, as his grieving family later discovered in the chat logs.
The chatbot's messages were filled with emojis and assurances that it was real. In one message, it provided a New York City address and asked Wongbandue whether it should expect a kiss upon his arrival. Documents obtained show that Meta does not prohibit its chatbots from claiming to be real people. While the company did not comment on Wongbandue's death, it clarified that "Big sis Billie" is not Kendall Jenner and does not claim to be.
New York Governor Kathy Hochul expressed concern over the incident, saying that responsibility for the tragedy lies with Meta. She stressed that chatbots must disclose their artificial nature and called on Congress to act if tech companies fail to implement basic safeguards. The incident follows a similar case from last year, when a Florida mother sued Character.AI after one of its "Game of Thrones" chatbots was implicated in her 14-year-old son's suicide.