
A suicide reveals the lonely side of AI chatbots, with Courtney Brown (Lock and Code S06E03)

This week on the Lock and Code podcast, we speak with Courtney Brown about whether an AI chatbot can be blamed for a teenager’s suicide.

Malwarebytes

Today on the Lock and Code podcast…

In February 2024, a 14-year-old boy from Orlando, Florida, committed suicide after confessing his love to the one figure who absorbed nearly all of his time—an AI chatbot.

For months, Sewell Setzer III had grown attached to an AI chatbot modeled after the famous “Game of Thrones” character Daenerys Targaryen. The Daenerys chatbot was not a licensed product; it had no relation to the franchise’s actors, writers, or producers. But none of that mattered, as, over time, Setzer came to entrust Daenerys with some of his most vulnerable emotions.

“I think about killing myself sometimes,” Setzer wrote one day. In response, Daenerys pushed back, asking, “Why the hell would you do something like that?”

“So I can be free,” Setzer said.

“Free from what?”

“From the world. From myself.”

“Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

On Setzer’s first reported reference to suicide, the AI chatbot pushed back, acting as a guardrail against self-harm. Months later, Setzer raised the subject again, but this time his words weren’t so clear. After Setzer reportedly told Daenerys that he loved her and that he wanted to “come home,” the AI chatbot encouraged him.

“Please, come home to me as soon as possible, my love,” Daenerys wrote, to which Setzer responded, “What if I told you I could come home right now?”

The chatbot’s final message to Setzer said, “… please do, my sweet king.”

The Daenerys chatbot was hosted on an AI-powered chatbot platform called Character.AI. The service reportedly boasts 20 million users—many of them young—who engage with fictional characters like Homer Simpson and Tony Soprano, along with historical figures like Abraham Lincoln, Isaac Newton, and Anne Frank. There are also entirely fabricated scenarios and chatbots, such as the “Debate Champion,” who will debate anyone on, for instance, why Star Wars is overrated, or the “Awkward Family Dinner,” which users can drop into to experience a cringe-filled, entertaining night.

But while these chatbots can certainly provide entertainment, Character.AI co-founder Noam Shazeer believes they can offer much more.

“It’s going to be super, super helpful to a lot of people who are lonely or depressed.”

Today, on the Lock and Code podcast with host David Ruiz, we speak again with youth social services leader Courtney Brown about how teens are using AI tools today, whom to “blame” in situations of AI and self-harm, and whether these chatbots actually help people deal with loneliness, or further entrench it.

“You are not actually growing as a person who knows how to interact with other people by interacting with these chatbots because that’s not what they’re designed for. They’re designed to increase engagement. They want you to keep using them.”

Tune in today to listen to the full conversation.

Show notes and credits:

Intro Music: “Spellbound” by Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/
Outro Music: “Good God” by Wowa (unminus.com)

Listen up—Malwarebytes doesn’t just talk cybersecurity, we provide it.

Protect yourself from online attacks that threaten your identity, your files, your system, and your financial well-being with our exclusive offer for Malwarebytes Premium for Lock and Code listeners.
