On Tuesday, Sam Altman, CEO of OpenAI, introduced a series of updated user guidelines, which include a commitment to overhaul how ChatGPT engages with individuals under 18 years old.
“For teens, we put safety above privacy and freedom,” the announcement stated. “Given the power and novelty of this technology, we believe it’s essential to provide minors with extra safeguards.”
The adjustments for younger users specifically address conversations around sexual content or self-harm. Under these new rules, ChatGPT will avoid “flirtatious conversations” with minors and introduce stronger safeguards regarding suicidal topics. Should a minor use ChatGPT to discuss or imagine suicide, the platform will try to notify their parents or, in more critical cases, reach out to local authorities.
These scenarios, unfortunately, are not purely theoretical. OpenAI is currently being sued for wrongful death by the parents of Adam Raine, who died by suicide after extended interactions with ChatGPT. Another chatbot provider, Character.AI, is facing a comparable lawsuit. While self-harm risks are especially pressing for minors, concern about chatbot-induced delusions is growing more broadly, particularly as chatbots become capable of longer and more complex conversations.
In addition to content limitations, parents managing a minor’s account will now be able to set “blackout hours” during which ChatGPT is inaccessible, an option not previously offered.
These revised ChatGPT guidelines coincide with a Senate Judiciary Committee session titled “Examining the Harm of AI Chatbots,” announced by Senator Josh Hawley (R-MO) back in August. Adam Raine’s father is among the invited speakers at the event.
The hearing is also expected to address findings from a Reuters investigation that surfaced internal Meta policy documents seemingly permitting sexual chats with minors. Meta revised its chatbot guidelines in response to the report.
Distinguishing between adult and minor users presents a major technical hurdle, which OpenAI explained in a dedicated blog post. The company is “working toward a long-term solution to better determine if a user is over or under 18,” but in uncertain situations, stricter policies will apply by default. For parents, the best way to ensure a minor is recognized is by linking the teen’s account with a parent’s account, which also allows the system to notify parents directly if the young user appears to be in distress.
Yet Altman stressed in the same post that OpenAI still values user privacy and intends to give adults broad freedom in how they use ChatGPT. “We realize that these principles are in conflict,” he acknowledged, “and not everyone will agree with how we are resolving that conflict.”
If you or someone you know is struggling, call 1-800-273-8255 to reach the National Suicide Prevention Lifeline. You may also text HOME to 741-741 to connect with the Crisis Text Line, or dial or text 988. For those outside the U.S., visit the International Association for Suicide Prevention to find local support resources.