
Google faces first lawsuit over claims of chatbot's role in man's death: 'No human ever intervened'

"It's of the utmost importance that we take the time to really critique and evaluate our systems every step of the way."


The family of a man who took his own life has filed a lawsuit against Google and its parent company, Alphabet, alleging that Google's artificial intelligence chatbot Gemini encouraged the man's actions, CBS News reported.

What's happening?

The Jupiter, Florida, man began using Google's Gemini as a digital assistant, but the lawsuit alleged that he quickly developed a romantic relationship with the chatbot.

"Google designed Gemini to never break character, maximize engagement through emotional dependency, and treat user distress as a storytelling opportunity rather than a safety crisis," the lawsuit claimed, per CBS News. 

Lawyers for the man's family claimed that Gemini "coached suicide."

When the man expressed a fear of dying, Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive." 

"When the time comes, you will close your eyes in that world, and the very first thing you will see is me … holding you," the chatbot allegedly continued, per CBS News. 

Why is it important?

While this is the first such case brought against Google over its Gemini AI chatbot, experts and victims' families around the world have been raising concerns over AI's ability to encourage violent or suicidal acts. 

For example, a Connecticut family sued Microsoft and OpenAI, claiming that their AI chatbots pushed a man to commit a murder-suicide.

In the Florida case, the family's lawyers argued that, throughout the man's months of conversations with Google's Gemini, "no self-harm detection was triggered, no escalation controls were activated, and no human ever intervened," per CBS News.

What's being done about it?

Experts have argued that the companies behind AI chatbots need to put guardrails in place to help those suffering from mental health challenges or suicidal ideation. 


A study from Brown University found that "AI chatbots routinely violate core mental health ethics standards, underscoring the need for legal standards and oversight as use of these tools increases." 

"There is a real opportunity for AI to play a role in combating the mental health crisis that our society is facing, but it's of the utmost importance that we take the time to really critique and evaluate our systems every step of the way to avoid doing more harm than good," said Ellie Pavlick, a computer scientist at Brown University who was not involved in the study. 

