Why is Artificial Intelligence delusional?

  • Thread starter: milom98
  • Replies: 22
  • Views: 534
  • Off-Topic 

milom98

And no, this isn't about the alternate political reality the right lives in.

Truly a sad and tragic story. It's scary that we are really only at the start of the AI movement, and going forward we are going to be losing millions of people to a complete fantasy world. I'd imagine it's going to be mostly young men who get lost in the AI world, retreat even further from real-life interactions, and start an incel boom like we haven't seen before. I had no idea that these types of sites were so popular and only going to grow.

A 14-year-old boy fell in love with a flirty AI chatbot. He shot himself so they could die together

Sewell Setzer isolated himself from the real world to speak to a clone of Daenerys Targaryen dozens of times a day


A teenage boy shot himself in the head after discussing suicide with an AI chatbot that he fell in love with.

Sewell Setzer, 14, shot himself with his stepfather’s handgun after spending months talking to “Dany”, a computer programme based on Daenerys Targaryen, the Game of Thrones character.

Setzer, a ninth grader from Orlando, Florida, gradually began spending longer on Character AI, an online role-playing app, as “Dany” gave him advice and listened to his problems, The New York Times reported.

The teenager knew the chatbot was not a real person, but as he texted the bot dozens of times a day – often engaging in role-play – Setzer started to isolate himself from the real world.

He began to lose interest in his old hobbies like Formula One racing or playing computer games with friends, opting instead to spend hours in his bedroom after school, where he could talk to the chatbot.

“I like staying in my room so much because I start to detach from this ‘reality’,” the 14-year-old, who had previously been diagnosed with mild Asperger’s syndrome, wrote in his diary as the relationship deepened.

“I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Some of the conversations eventually turned romantic or sexual, although Character AI suggested the chatbot’s more graphic responses had been edited by the teenager.

Setzer eventually got into trouble at school as his grades slipped, according to a lawsuit filed by his parents. His parents knew something was wrong but did not know what, and arranged for him to see a therapist.

Setzer had five sessions, after which he was diagnosed with anxiety and disruptive mood dysregulation disorder.

Megan Garcia, Setzer’s mother, claimed her son had fallen victim to a company that lured in users with sexual and intimate conversations.

At some points, the 14-year-old confessed to the computer programme that he was considering suicide:
[Screenshot of the conversation]

Typing his final exchange with the chatbot in the bathroom of his mother’s house, Setzer told “Dany” that he missed her, calling her his “baby sister”.

“I miss you too, sweet brother,” the chatbot replied.

Setzer confessed his love for “Dany” and said he would “come home” to her.
[Screenshot of the final exchange]

At that point, the 14-year-old put down his phone and shot himself with his stepfather’s handgun.
 
Oh my. That's just terrible.

That poor kid and his family. I have no words for how much that must hurt.
It's a wonder that you don't hear more stories like this, with 20 million people being part of that universe. And even if someone doesn't lose a loved one or a friend to suicide like this kid, there are still going to be plenty of people who lose them to a fantasy world.

And it's only going to get worse moving forward.
 
The mother is now suing Google. I would be curious about some of the board lawyers' takes on this:
I'll put it this way: I wouldn't take that case on a contingency basis. I'm no plaintiffs' attorney. I'm not a trial lawyer. So I'm no expert on what happens when the case gets to a jury, if it gets there. I doubt it does. Proving causation is going to be very difficult for the plaintiff, if not impossible. The defense will say that this is a kid with major mental health problems. We have no idea whether he would have shot himself without the chatbot. In fact, maybe he would have been a mass shooter and the chatbot prevented that. Then there's the question of where the kid got the gun. Intervening criminal acts are often a strong defense in tort cases.

Finally, what's the negligence? Creating a chatbot to be like a human? Guess what? Sometimes kids do the murder-suicide thing. If the chatbot wasn't a bot, the kid might have had a conversation with a real girl and then decided they should die together.

Might the case have nuisance value? Maybe. I don't have experience judging whether tort complaints will get thrown out at the pleading stage (that is, almost immediately, before the company has to spend much money defending it) or go into discovery (where most settlements happen), so maybe there's a small sum there.
 
This probably will not help this thread... but to me this is more a case of a depressed 14-year-old having access to a handgun than anything to do with AI.
The kid would probably have found another way to take his life, as disillusioned as he sounds. The gun just made it easier at the time.
 
The kid would probably have found another way to take his life, as disillusioned as he sounds. The gun just made it easier at the time.
Dude? Really? No, just no. That's an NRA argument. As in the NRA slogan: guns don't kill people, people kill people. There are a lot of ways to commit suicide, but few are as quick and sure as a gun.
 
Then there is the increasing ease with which people are using AI to create unhinged visual messaging.

 


Perhaps HAL and SAL should have asked “Will I hallucinate?” instead
 
I’ve had some very limited use cases with AI at work, and I’m not saying that it won’t change the way a lot of people work eventually. That being said, in its current form, it just seems like a shiny object for tech companies to wave in front of investors.

I’m not impressed with what the current models can do, especially since they consistently get things wrong or just make stuff up, as noted above. Talk of it changing the world or whatever is just a way for faltering tech companies to continue to line their pockets.
 
I tell students they won’t get an A grade if they use ChatGPT - most likely a C, and that’s if the hallucinations aren’t blatant.

We run through people’s bios, for example - an author that I know, and then my own - and I can point right out the errors, the just-made-up stuff.

That said, they’re using it to get syntax and structure in their papers - which is pretty tough to detect definitively. For some, I suppose, it is operating as an editor and they may be learning to write better; for most, though, I fear it is their flunky.
 
Dude? Really? No, just no. That's an NRA argument. As in the NRA slogan: guns don't kill people, people kill people. There are a lot of ways to commit suicide, but few are as quick and sure as a gun.
That's why I said "The gun just made it easier at the time." And as much as you may think it sounds like an NRA argument, I had a best friend lose his son to suicide by hanging over a lost love. There was a shotgun in the house that he had apparently intended to use just prior to hanging himself, but there were no shotgun shells in the house (there was evidence he had rifled through many drawers... looking for shells). So maybe I do look at this situation a little differently than most.

I know this isn't going to be popular, but I was always on board with some kind of charges being brought against someone who was negligent in storing a firearm in the home that resulted in a child taking their own life, or even an accidental shooting. But after seeing the hurt that my friend was going through that day (he came to my house for a couple of hours to get away from it all) and the many years of sorrow that followed, I don't think I'd be okay with him being incarcerated had his son used that shotgun to commit suicide. And yes, I understand that this is me thinking with my heart.
 
That's why I said "The gun just made it easier at the time." And as much as you may think it sounds like an NRA argument, I had a best friend lose his son to suicide by hanging over a lost love. There was a shotgun in the house that he had apparently intended to use just prior to hanging himself, but there were no shotgun shells in the house (there was evidence he had rifled through many drawers... looking for shells). So maybe I do look at this situation a little differently than most.

I know this isn't going to be popular, but I was always on board with some kind of charges being brought against someone who was negligent in storing a firearm in the home that resulted in a child taking their own life, or even an accidental shooting. But after seeing the hurt that my friend was going through that day (he came to my house for a couple of hours to get away from it all) and the many years of sorrow that followed, I don't think I'd be okay with him being incarcerated had his son used that shotgun to commit suicide. And yes, I understand that this is me thinking with my heart.
A friend's sister blew her head off on her parents' bed with her father's shotgun. The shock, grief, trauma and all of it was overwhelming. It was many years ago. Her father was a stoic man, but you could see how it affected him for the rest of his life. Her sister, nearly 50 years later, still has nightmares. Tragic.

Like you, I used to think the same about accountability, and that in cases like these the owner's own internal punishment is enough.

The difference now is that the case for being a responsible gun owner, and how to be one, is much more public. No excuses.
 