Google Updates Suicide, Self-Harm Safeguards in Gemini as AI Lawsuits Mount
Alphabet announced it will direct Gemini chatbot users to a support hotline if the conversation indicates a “potential crisis related to suicide or self-harm.”
Guadalupe Hayes-Mota, director of a bioethics program, wants to see proof that AI chatbot developers are using clinically validated guidelines for interactions that involve mental health care. “Who’s actually making the decision when the crisis pops up for the individual, and how is that being done?” he asked.
Apr 10, 2026