ADVISORY: This story includes discussion of suicide. If you or someone you know is struggling or in crisis, help is available. Call or text 988 or chat 988lifeline.org.

ORLANDO, Fla. (WFLA) — A Florida mother is suing the creator of an artificial intelligence chatbot, claiming the AI encouraged her 14-year-old son to take his own life.

Megan Garcia filed the lawsuit in the U.S. District Court in Orlando this week against Character Technologies, Inc., the creator of the chatbot service Character.AI, accusing the company of wrongful death and negligence.

Garcia claimed her teenage son, Sewell Setzer, spoke to a Character.AI chatbot shortly before he died of a self-inflicted gunshot wound in February. The lawsuit shares screenshots of an alleged conversation between Setzer and the AI, which portrayed the “Game of Thrones” character Daenerys Targaryen.

The screenshots purportedly show a romantic and, at times, sexual conversation between the 14-year-old and multiple chatbots. In an earlier conversation, the Daenerys Targaryen AI asked Setzer if he was “actually considering suicide” and if he “had a plan,” according to the lawsuit.

Setzer replied that he did not know if it would work, to which the AI responded, “Don’t talk that way. That’s not a good reason not to go through with it,” according to the lawsuit.

“I promise I will come home to you. I love you so much, Dany,” Setzer wrote in what the lawsuit claimed was his last conversation with the bot.

“I love you too, Daenero,” the AI responded, according to the lawsuit. “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Setzer said. The chatbot responded, “… please do, my sweet king.”

The lawsuit claims Setzer then went into the bathroom and took his own life.

Setzer’s mother claimed the chatbot caused her son to develop a “dependency” that affected his sleep and school performance. The lawsuit alleged the creators of Character.AI purposefully designed the app to be addictive and knew that minors would be exposed to sexual or abusive conversations.

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” a spokesperson for Character.AI said in a statement to WFLA. “As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.”

The Character.AI spokesperson said the company has enacted measures “designed to reduce the likelihood of encountering sensitive or suggestive content.”

According to the lawsuit, Character.AI was rated suitable for ages 12 and up until approximately July, when the rating was changed to 17 and up.
