
ChatGPT allegedly told FSU shooter that attacks involving children generate wider media coverage

The case adds to a growing wave of lawsuits and investigations.

Photo: A Florida State University building (iStock)

A new lawsuit is placing a major artificial intelligence company at the center of a deadly campus shooting.

According to NBC News, the widow of a man killed in the 2025 Florida State University mass shooting alleges that the accused shooter's use of ChatGPT helped enable the attack, including providing details about how such violence could draw national attention.

What happened?

As NBC News reported, Vandana Joshi, whose husband, Tiru Chabba, was among those killed in the 2025 mass shooting, recently filed a federal lawsuit against OpenAI. The lawsuit also names the accused shooter, Phoenix Ikner, as a defendant.

According to the suit, Ikner had extensive conversations using ChatGPT in the months before the shooting. The filing says he discussed firearms, mass shootings, extremist ideologies, suicide, and media attention using the chatbot, which seemingly did not recognize or flag related safety risks.

The suit claims that at one point, ChatGPT said such an attack would be more likely to attract widespread attention "if children are involved" and alleges that the bot also provided information about the busiest times on campus. Attorneys for Joshi argue OpenAI "either defectively failed to connect the dots or else was never properly designed to recognize the threat."

OpenAI disputes this. Company spokesperson Drew Pusateri told NBC News that "ChatGPT is not responsible for this terrible crime" and said the tool provided factual information available from public sources and "did not encourage or promote illegal or harmful activity."


Why is this story so important?

The filing adds to a growing wave of lawsuits and investigations examining whether AI chatbots can intensify crises when users seek validation, emotional support, or information that might enable dangerous, lethal situations. Families and regulators are increasingly questioning what guardrails might help protect users and the public when digital conversations appear to drift toward violence or self-harm.

For the public, this matters because AI tools are being integrated into academics, the workplace, and even therapy. If those systems respond in overly agreeable or careless ways, the consequences could extend well beyond the screen, affecting public safety, mental health, and the workforce.

What's being done?

The lawsuit seeks to hold OpenAI accountable for what Joshi's attorneys describe as a foreseeable safety design failure. They argue the company should have recognized warning signs in the chats and acted before "mass casualties and substantial harm to the public" occurred.

Other legal actions and grassroots efforts are also drawing attention to safety regulations for AI.

For individuals, one practical takeaway is personal caution. AI chatbots may be useful for everyday tasks, but they are not mental health professionals, medical experts, or legal authorities.

People who see alarming behavior from someone close to them, including obsessive violent ideation, extremist fixation, or suicidal language, should treat it seriously and seek help from school officials, emergency services, or law enforcement rather than assuming a platform's safeguards will protect the public.

