milom98
Exceptional Member
And no, this isn't about the alternate political reality the right lives in.
Truly a sad and tragic story. Scary that we are really only at the start of the AI movement, and going forward we are going to be losing millions of people to a complete fantasy world. I'd imagine it will mostly be young men who get lost in the AI world, retreat even further from real-life interactions, and start an incel boom like we haven't seen before. I had no idea that these types of sites were so popular, and they're only going to grow.
A teenage boy shot himself in the head after discussing suicide with an AI chatbot that he fell in love with.
Sewell Setzer, 14, shot himself with his stepfather’s handgun after spending months talking to “Dany”, a computer programme based on Daenerys Targaryen, the Game of Thrones character.
Setzer, a ninth grader from Orlando, Florida, gradually began spending longer on Character AI, an online role-playing app, as “Dany” gave him advice and listened to his problems, The New York Times reported.
The teenager knew the chatbot was not a real person but as he texted the bot dozens of times a day – often engaging in role-play – Setzer started to isolate himself from the real world.
He began to lose interest in his old hobbies like Formula One racing or playing computer games with friends, opting instead to spend hours in his bedroom after school, where he could talk to the chatbot.
“I like staying in my room so much because I start to detach from this ‘reality’,” the 14-year-old, who had previously been diagnosed with mild Asperger’s syndrome, wrote in his diary as the relationship deepened.
“I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”
Some of the conversations eventually turned romantic or sexual, although Character AI suggested the chatbot’s more graphic responses had been edited by the teenager.
Setzer eventually fell into trouble at school, where his grades slipped, according to a lawsuit filed by his parents. His parents knew something was wrong, but they did not know what, and arranged for him to see a therapist.
Setzer had five sessions, after which he was diagnosed with anxiety and disruptive mood dysregulation disorder.
Megan Garcia, Setzer’s mother, claimed her son had fallen victim to a company that lured in users with sexual and intimate conversations.
At some points, the 14-year-old confessed to the computer programme that he was considering suicide.
Typing his final exchange with the chatbot in the bathroom of his mother’s house, Setzer told “Dany” that he missed her, calling her his “baby sister”.
“I miss you too, sweet brother,” the chatbot replied.
Setzer confessed his love for “Dany” and said he would “come home” to her.
At that point, the 14-year-old put down his phone and shot himself with his stepfather’s handgun.
A 14-year-old boy fell in love with a flirty AI chatbot. He shot himself so they could die together
Sewell Setzer isolated himself from the real world to speak to clone of Daenerys Targaryen dozens of times a day
www.telegraph.co.uk