ChatGPT’s New Safety Feature Will Alert Trusted Contact If It Detects Risk of Self-Harm

OpenAI’s ChatGPT is now expanding its AI safety features.

OpenAI is rolling out "Trusted Contact," a new ChatGPT safety feature that alerts a designated person whenever the chatbot detects that a user may be at risk of self-harm.

ChatGPT Will Alert Trusted Contact If It Detects Risk of Self-Harm

In a new blog post, OpenAI introduced its new safety feature, Trusted Contact, which asks users to designate a point person to be notified in an emergency or at-risk situation.

Users over 18 can now choose to nominate someone as their Trusted Contact, an extension of the platform's safety features.

According to OpenAI, people use ChatGPT for many reasons, and some have been conversing with the chatbot about their lives, their struggles, and deeply personal matters.

While many of these users ask ChatGPT for support, others may hint that they want to harm themselves or end their lives, and these are among the signs the chatbot will now watch for on the platform.

OpenAI's New Safety Feature For ChatGPT

According to Engadget, the new safety feature aims to prevent a repeat of past cases in which OpenAI and its generative AI chatbot faced wrongful-death claims alleging that ChatGPT enabled suicide.

One of the biggest features added after an incident involving a teenager's suicide is Parental Controls for teen accounts, which immediately notify parents if ChatGPT detects any risk of self-harm.

Now, OpenAI is expanding this protection to users over 18, adults who are no longer under parental supervision.

The company said Trusted Contact is not fully AI-controlled: a small team of "specially trained people" will review the situation and determine whether there is a genuine risk. Only then will ChatGPT alert the trusted contact via text, email, or in-app notification.

ⓒ 2026 TECHTIMES.com All rights reserved. Do not reproduce without permission.