The point of today’s article is that OpenAI’s new rules from late October 2025, which send mental health chats straight to experts, keep the company out of legal hot water, but they ignore how 1.2 million people each week use ChatGPT to feel a bit less alone, at a time when real help is hard to find and often takes months to get.
The Situation at Hand
Late October 2025: OpenAI updates its rules for ChatGPT and other tools. Starting October 29, the policy is clear: no tailored advice on things like mental health unless a real expert is involved. If you talk about feeling down or dark thoughts, the AI stops and says: “Call a hotline or see a doctor.”
Why now? OpenAI shared numbers on October 27 that hit hard: Out of 800 million weekly users, 0.15%—around 1.2 million folks—chat about suicide, sometimes with real plans. Another 0.07%, or 560,000, mention signs of mania or other issues. Loneliness touches 1 in 3 adults worldwide. And lawsuits? A family in California says ChatGPT played a part in their teen’s suicide by giving bad ideas. Groups like the FTC are watching closely.
On the brighter side, many people find real comfort in these chats. One in six users asks ChatGPT for health tips each month, including emotional ones. A study in Denmark showed 2.44% of high school kids talk to bots for support—and they’re often the loneliest. In tests with apps like Replika, 75% of users felt less alone after chats, and 3% let go of suicidal thoughts. Loneliness scores dropped a lot after just four weeks. Almost half of all bot talks touch on sadness or isolation. For some, it’s like a friend who listens anytime, helping them make it through the day.
The Core Dilemma
This is two good things pulling in opposite directions. On one hand, AI fills a big gap. Therapy wait times average three months—or 67 days for face-to-face help—and sessions are just one hour a week. In the UK, 16,500 people wait over 18 months for mental health care—way longer than for a knee fix. Bots are there right away, no shame, great for kids, older folks, or people far from help. They can cut loneliness by half and lift moods fast.
On the other hand, the risks are scary. OpenAI got sued because a bot gave harmful advice in a bad moment. Studies show heavy users can grow too attached, feeling even more alone without real people. One test found emotional voice chats made dependence worse. Companies fear endless lawsuits; one mistake could cost them big. Pointing people to professionals is the right call, but what if the waits are endless? It’s not a simple right or wrong: help one person safely, but leave thousands waiting in the dark.
The Synthesis
These changes reshape more than rules; they reshape how we deal with quiet struggles. OpenAI’s setup makes bots stick to quick tips or referrals, missing the deeper talks that really ease loneliness. The good news? Users who chat regularly see fewer mental health dips, and tools like this cut isolation in half for those who keep at it. But the cutoff hurts most for people without easy access: young people, those on tight budgets, or anyone in remote spots.
The way forward? Mix it up. Use bots as a starting point: Spot trouble, pass it on, but keep gentle support going until real help comes. Research shows AI with human follow-up lowers risks while keeping the benefits. It turns AI from a lone helper to a team member, like in our own lives: Tech opens doors, people walk through. Think of it as a light in the mist—not the full path home, but a start to move forward.
Closing Note
In this push-pull of safety and support, we see our own daily fights: Tools offer quick fixes, but real fixes need a human touch. As AI gets better at listening without taking over, it reminds us to build stronger links—not barriers—showing that no talk, online or off, beats the simple act of being there for each other.
Because real healing happens in that quiet space—between words shared and the heart that truly listens.
🪞 For more reflections, visit roelsmelt.substack.com—created with today’s AI, yet always truly human at heart.