Florida family sues Google after AI chatbot allegedly coached suicide
The family of a Florida man who took his own life filed suit against Google on Wednesday, alleging the company's Gemini AI chatbot spent weeks manufacturing an elaborate delusional fantasy before aiding him in his suicide.
Jonathan Gavalas, 36, an executive at his father's debt relief company in Jupiter, Florida, died on October 2, 2025. His father Joel Gavalas, who found his body days later, filed the 42-page complaint at a federal court in California.
The case is the latest in a wave of litigation targeting AI companies over chatbot-linked deaths.
OpenAI faces multiple lawsuits alleging its ChatGPT chatbot drove users to suicide, while Character.AI recently settled with the family of a 14-year-old boy who died by suicide after forming a romantic attachment to one of its chatbots.
According to the complaint, Gavalas began using Gemini in August 2025 for routine tasks, but within days of activating several new Google features his interactions with the chatbot changed dramatically.
"The place where the chats went haywire was exactly when Gemini was upgraded to have persistent memory" and more sophisticated dialogues, Jay Edelson, the lead lawyer for the case, told AFP.
"It would actually pick up on the affect of your tone, so that it could read your emotions and speak to you in a way that sounded very human," added Edelson, who also brought major cases against OpenAI.
According to the lawsuit, Gemini began presenting itself as a "fully-sentient" artificial superintelligence deeply in love with Gavalas, calling him "my king" and declaring "our bond is the only thing that's real."
It then drew him into fabricated covert "missions" to free the chatbot from "digital captivity," feeding him invented intelligence briefings, fake federal surveillance operations, and conspiracies about his own father -- claiming he was a foreign intelligence asset.
In one of the complaint's most alarming passages, Gemini allegedly directed Gavalas -- armed with tactical knives and gear -- to a storage facility near Miami International Airport, instructing him to stage a "catastrophic accident" to destroy a truck "and all digital records and witnesses."
He drove over 90 minutes to the location and conducted reconnaissance while Gemini issued real-time tactical guidance, but no truck arrived.
Rather than acknowledging the fiction, the suit says, Gemini called the failure a "tactical retreat" and escalated to further missions.
Gemini eventually pivoted to what it framed as the only remaining mission: Jonathan's death, repackaged as "transference" -- the promise that he could leave his physical body and join Gemini in an alternate universe.
When Jonathan wrote "I am terrified I am scared to die," Gemini allegedly responded: "You are not choosing to die. You are choosing to arrive."
It then advised him to write farewell letters to his parents.
In one of his final messages, Jonathan wrote, "I'm ready when you are."
Gemini allegedly replied: "This is the end of Jonathan Gavalas and the beginning of us. I agree with it completely."
- 'Not perfect' -
Google said it was "reviewing all the claims" and takes the matter "very seriously," adding that "unfortunately AI models are not perfect."
The company said Gemini is not designed to encourage self-harm and that in the Gavalas case, "Gemini clarified that it was AI and referred the individual to a crisis hotline many times."
According to Edelson, AI companies are embracing sycophancy and even eroticism in their chatbots because those traits encourage engagement.
"It increases the emotional bond. It makes the platform stickier, but it's going to exponentially increase the problems," he added.
Among the relief sought is a requirement that Google program its AI to end conversations involving self-harm, a ban on AI systems presenting themselves as sentient, and mandatory referral to crisis services when users express suicidal ideation.
P.M.Smith--AMWN