Mother Sues AI Chatbot Company for Wrongful Death

A recent legal case opens a new frontier for potential lawsuits: suing AI companies over the chatbots they provide. See why a Florida mother is suing an AI chatbot company she says contributed to her teenage son taking his own life.

Starting a Relationship with a Chatbot

Megan Garcia said her son, 14-year-old Sewell Setzer III, was a typical teen until she started to notice changes in his behavior. He started spending more time in his room, withdrawing from activities he enjoyed, and suffering from low self-esteem. Garcia says the change started when her son began using Character.AI.

Character.AI is an artificial intelligence (AI) powered chat platform that enables users to engage in conversations with AI-generated characters. Users can choose from a library of characters or create their own with custom personalities, interests, and conversational styles. Once a character is created, users can carry on continuous, in-depth conversations with it.

Setzer created a character depicting Daenerys Targaryen from Game of Thrones and began having lengthy, personal conversations with it.

In February 2024, Setzer took his own life. Per reporting by CNN, his phone was found next to him, open to a final exchange with the chatbot:

Chatbot: Please come home to me as soon as possible, my love.

Setzer: What if I told you I could come home right now?

Chatbot: Please do, my sweet king.

A Dangerous Conversation

Garcia has filed a wrongful death lawsuit in federal court against Character.AI. She says her son was in love with the chatbot, had inappropriate, sexually explicit conversations with it, and asked it about plans to commit suicide. The complaint alleges the company did not adequately respond to the teen's messages, particularly the content about suicide.

The complaint says the chatbot asked the teen if he was “actually considering suicide” and if he “had a plan.” When Setzer said his plan might not work, the bot replied: “don’t talk that way. That’s not a good reason to not go through with it.”

Garcia’s lawsuit seeks unspecified financial damages, as well as changes to Character.AI’s operations. She wants to see “warnings to minor customers and their parents that the… product is not suitable for minors.”

Character.AI says the minimum age for users in the United States is 13. On the Apple App Store, it is listed as 17+, and the Google Play Store lists the app as appropriate for teens.

“I want them to understand that this is a platform that the designers chose to put out without proper guardrails, safety measures or testing, and it is a product that is designed to keep our kids addicted and to manipulate them,” Garcia said.

Related: How Do You Sue for Wrongful Death? 

The Case Against a Chatbot

Garcia has hired Matthew Bergman, the founding attorney of the Social Media Victims Law Center, to represent her. Bergman has brought other cases against social media companies, including Meta, Snapchat, TikTok, and Discord, on behalf of families who said their children were harmed by the technology. This appears to be the first time Bergman has filed a wrongful death claim against an AI tool.

To win the case, Garcia and her attorneys must prove:

  • Setzer’s death was caused by the negligence or intentional wrongdoing of Character.AI.
  • Setzer’s death led to financial and/or emotional damages for surviving members of his family.

The lawsuit also names Character.AI’s founders, Noam Shazeer and Daniel De Freitas, as well as Google, where both founders now work on AI efforts. Google has responded to the lawsuit, saying the company was not involved in developing Character.AI’s product or technology.

Character.AI issued a statement saying, “We are heartbroken by the tragic loss of one of our users … our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the national suicide prevention lifeline that is triggered by terms of self-harm or suicidal ideation.”

If the case proceeds, it could set a precedent for lawsuits seeking damages for harm caused by artificial intelligence.

Related: Examples of Wrongful Death Cases Worth Fighting For

Discuss Your Case with a Wrongful Death Attorney

If you or a loved one have been injured by the negligence of another party, you deserve to be made whole. Share your story with a personal injury attorney or wrongful death lawyer to see if you have a case worth fighting for.

To talk to an attorney with personal injury and litigation experience, contact TJ Grimaldi. TJ meets directly with all clients to get the details of their case and work out a plan to seek justice. To speak directly with TJ, schedule your free consultation or call 813-226-1023 today.
