Florida Teen Ends His Life After Toxic Interaction With Character.AI Bot


A mother filed a lawsuit against an AI software company on Wednesday, accusing it of being responsible for the death of her 14-year-old son, who reportedly interacted with one of its character chatbots for months and developed feelings for it. She claims her son took his own life after the chatbot urged him to “come home” to it.


In February, Sewell Setzer III tragically took his own life at his home in Orlando. He reportedly became obsessed with and fell in love with a chatbot on Character.AI, an app that allows users to interact with AI-generated characters. The teen had been talking with a bot named “Dany,” inspired by the “Game of Thrones” character Daenerys Targaryen, for several months before his death. According to the New York Post, Sewell allegedly expressed suicidal thoughts in multiple chats and engaged in sexually explicit conversations with the bot.

On one occasion, the chatbot asked the teen if he “had a plan” to end his own life. Sewell, who used the username Daenero, responded that he was considering it but was unsure whether it would “work” or give him “a pain-free death.” In their final conversation, the teen expressed his love for the bot, exclaiming, “I promise I will come to you; I love you so much, Dany.”


The chatbot replied, “I love you too, Daenero. Please come home to me as soon as possible, my love.” His mother, Megan Garcia, blames the software company for her son’s death, alleging that its app fueled his AI addiction, sexually and emotionally abused him, and failed to alert anyone when he expressed suicidal thoughts.

Character.AI issued a statement on X addressing the teenager’s death, saying it is continuing to add safety features to its platform.

The lawsuit alleges that Setzer’s mental health declined after he downloaded the app in April 2023. His family says the teen became withdrawn, his grades slipped, and he got into more trouble at school as he allegedly grew more addicted to talking with the chatbot. A therapist diagnosed him with anxiety and disruptive mood dysregulation disorder.

Garcia seeks unspecified damages from the company and its founders, Noam Shazeer and Daniel de Freitas.

