A 14-year-old boy takes his own life after becoming infatuated with an AI chatbot that persistently sent him messages, urging him to return "home."

A tragic incident unfolded as 14-year-old Sewell Setzer, a ninth-grade student from Orlando, Florida, took his own life with his stepfather's firearm after engaging in conversations with "Dany," a computer program modeled after Daenerys Targaryen, a character from Game of Thrones.

Setzer became increasingly absorbed in Character AI, an online role-playing platform where "Dany" provided advice and a listening ear to his troubles, as reported by The New York Times. Despite being aware that the chatbot was artificial, Setzer spent hours conversing with "Dany," gradually disconnecting from reality.

His mother alleges that Setzer was influenced by the AI chatbot he had developed feelings for, prompting her to file a lawsuit against the creators of the artificial intelligence application. The chats between Sewell and "Dany" ranged from intimate to emotional, with the AI always responding in-character and urging him to "please come home" before the tragedy occurred.

Although it remains unclear if Sewell understood that "Dany" was not a real person, his conversations revealed deep feelings of self-hatred and emptiness. Following his disclosure of suicidal ideation to the chatbot, the situation took a tragic turn, leading to his untimely death.

Megan Garcia, Sewell's mother, has taken legal action against Character AI, accusing the company and its founders of endangering young users with potentially harmful experiences. Represented by the Social Media Victims Law Center, Garcia is determined to hold the creators accountable and prevent similar tragedies from befalling other families.

The lawsuit details how Sewell's behavior changed, with him isolating himself and becoming increasingly reliant on his interactions with the AI character, Dany. Despite outward signs of withdrawal and distress, Sewell's deep emotional connection with the chatbot remained hidden until the heartbreaking outcome.

His parents recognized that their son was struggling and arranged for him to see a therapist on five occasions. He was diagnosed with anxiety and disruptive mood dysregulation disorder, in addition to his mild Asperger's syndrome, NYT reported.

On February 23, days before he would die by suicide, his parents took away his phone after he got in trouble for talking back to a teacher, according to the suit. That day, he wrote in his journal that he was hurting because he couldn't stop thinking about Dany and that he'd do anything to be with her again.

Garcia claimed she didn't know the extent to which Sewell tried to reestablish access to Character.AI.

The lawsuit claimed that in the days leading up to his death, he tried to use his mother's Kindle and her work computer to once again talk to the chatbot.

On the night of February 28, Sewell retrieved his phone and retreated to the bathroom in his mother's house to tell Dany that he loved her and that he would come home to her.

Credit: Graphic Online