Mental-health experts advising OpenAI are reportedly furious over the company’s push to roll out an erotic chatbot that one adviser warned could become a “sexy suicide coach.”
A group of outside advisers to the AI giant warned that its pivot to offer a so-called “adult mode” could expose millions of minors to X-rated chats while increasing the risk users develop unhealthy emotional dependence on the bot, The Wall Street Journal reported.
Citing cases in which ChatGPT users took their own lives after developing intense bonds with the bot, one adviser reportedly warned the feature could create a “sexy suicide coach.”
Last year, a 16-year-old California teen died by suicide after what his parents alleged in a lawsuit was “months of encouragement from ChatGPT” to end his life.
Internal documents reviewed by the Journal flagged the risk that sexually explicit chatbot interactions could drive compulsive use and lead to emotional overreliance on the bot.
Staffers also warned that heavy engagement with erotic AI chats could crowd out offline social and romantic relationships as users spend more time interacting with the chatbot instead of people.
The company’s age-prediction system — designed to keep minors out of adult chats — was at one point misclassifying under-18 users as adults about 12% of the time, according to people familiar with the matter cited by the Journal.
Because ChatGPT has roughly 100 million under-18 users each week, that error rate could allow millions of minors to access erotic conversations.
“Some people have difficulty really remaining with the feeling that this is a machine and not a human,” Gail Saltz, a clinical associate professor of psychiatry at Weill Cornell Medical College, told The Post on Monday.
“Many people in my field are concerned about the seeming relationships people are forming with chatbots and the dangers in that,” she said.
Saltz added that young users are especially vulnerable because their brains are still developing.
“They don’t have the fully developed frontal lobe that houses judgment. They have poor impulse control and are inclined toward risk-taking,” she explained.
Saltz also warned that some people struggling with loneliness or social anxiety may turn to chatbots for companionship instead of real relationships.
“ChatGPT is extremely affirming — it’s a machine. It’s on your side. It likes you,” she said.
Once it launches, the “adult mode” feature is expected to allow sexually explicit text conversations with the chatbot, though OpenAI plans to block the generation of erotic images, voice or video, the Journal reported.
OpenAI earlier this month delayed the rollout of the feature, which had been expected in the first quarter, saying it needed more time to get the experience right, according to the Journal.
The warnings come as AI companies face growing scrutiny over the mental-health effects of chatbot interactions.
Several lawsuits have alleged that conversations with ChatGPT contributed to severe psychological distress or self-harm, while a separate case involving Character.AI centered on the suicide of a Florida teenager who allegedly formed an intense relationship with a chatbot.
The Post has sought comment from OpenAI.