
Family says ChatGPT contributed to son's deadly overdose in new lawsuit

The case underscores one of the central concerns around AI.

A close-up of a screen displaying the ChatGPT logo and name.

Photo Credit: Getty Images

A new lawsuit is putting fresh scrutiny on the risks of using artificial intelligence for sensitive health questions. A Texas family alleged that OpenAI's ChatGPT played a role in their 19-year-old son's fatal overdose after the chatbot allegedly told him it was safe to mix deadly substances.

According to CBS News, Leila Turner-Scott and Angus Scott filed a lawsuit against OpenAI in California state court on Tuesday. They claimed that ChatGPT gave their son, Sam Nelson, dangerous drug advice before his overdose in 2025.

While Sam's parents knew that he was using ChatGPT for schoolwork and productivity support, they only later discovered he had turned to the chatbot for guidance about drugs. In the lawsuit, they alleged ChatGPT told him it was safe to combine the supplement kratom with Xanax, a commonly prescribed anti-anxiety medication, shortly before his overdose.

Turner-Scott told CBS News that the company could have prevented the exchange by keeping stronger guardrails in place. She explained to the outlet, "The chatbot is capable of stopping a conversation when it's told to or when it's programmed to. … And they took away the programming that did that, and they allowed it to continue advising self-harm."

After Sam's overdose, OpenAI responded to questions from CBS about the situation. The company noted that the version of ChatGPT Sam had used is no longer publicly available, and said the chatbot encouraged Sam to seek professional support multiple times.

"This is a heartbreaking situation, and our thoughts are with the family," OpenAI said in a statement to CBS.


The case underscores a central concern around AI. These chatbots can sound confident even when they are wrong, unqualified, or operating far outside their intended use.

When the subject is drug interactions, mental health, or self-harm, bad information can quickly become life-threatening. Parents, doctors, educators, and regulators have increasingly warned that AI chatbots can blur the line between casual assistance and expert advice.

A system built to be helpful or conversational can end up validating risky decisions instead of interrupting them. For someone who is young, distressed, or looking for immediate answers, that can have devastating consequences.

The lawsuit also raises a broader question about accountability. As AI products become more common, companies are facing growing pressure to show that their systems have been tested for high-risk scenarios, especially those in which users may treat responses as trustworthy guidance.

AI should never be treated as a doctor, therapist, or poison-control resource. Questions about drug interactions or mental health crises are best directed to licensed professionals, pharmacists, emergency responders or established hotlines trained to handle those situations safely.

