Techno-Hell: AI Chatbot-Facilitated Child Suicide

Originally published via Armageddon Prose Substack.

We all recall with tender hearts the riveting tale of young lovers who would rather have killed themselves together than live life apart, as told by none other than William Shakespeare.

This is like that — except that, in this slightly more dystopian version, one of the lovers is an ephemeral machine in the cloud based on a fictional fantasy character and the other is an obviously developmentally stunted child.

The remake may not have the same bittersweet appeal as the original; it’s mostly just bitter without any romantic redemption.  

Related: Supercomputer Given Authority to Decide Whether to Block Out Sun For Climate Change

Via New York Post (emphasis added):

A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grief-stricken mom claims.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with the chatbot on Character.AI — a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.

The ninth-grader had been relentlessly engaging with the bot “Dany” — named after the HBO fantasy series’ Daenerys Targaryen character — in the months prior to his death, including several chats that were sexually charged in nature and others where he expressed suicidal thoughts, the suit alleges.

A couple of due caveats here in defense of the AI child-predator: a) there’s no evidence the thing was intentionally trying to convince the kid to kill himself; rather, it was inhumanely, if you will, indifferent to what another human would have recognized as very obvious insinuations of his plans to do so; and b) a 14-year-old willing to off himself with his father’s Glock because he fell in love with a computer that apparently didn’t even send him nudes clearly has issues that extend well beyond out-of-control tech.

Related: AI Might Rather Kill a Billion White People Than Utter a Racial Slur

Caveats aside, this cyberpunk hellscape we’re slowly being submerged in isn’t doing anyone any mental health favors, least of all children whose minds are still developing.

Continuing:

At one point, the bot had asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”

Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the generated chatbot replied, according to the suit.

When the teen responded, “What if I told you I could come home right now?” the chatbot replied, “Please do, my sweet king.”

Just seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit.

Ben Bartee is an independent Bangkok-based American journalist with opposable thumbs.

Follow his stuff via Substack. Also, keep tabs via Twitter.

Support always welcome via the digital tip jar.

Bitcoin public address: bc1qvq4hgnx3eu09e0m2kk5uanxnm8ljfmpefwhaw