It might sound a bit sci-fi, but AI tools like ChatGPT are creeping into spaces we usually reserve for humans, like therapy.

Perhaps surprisingly, they’re not just holding their own. According to new research published in Nature, ChatGPT’s responses to emotional dilemmas are sometimes rated more highly than advice given by real-life professionals. That doesn’t mean we’re replacing therapists anytime soon, but it’s sparking serious conversations about how tech could reshape mental health support and the future of care. Here’s what the studies are saying, and what they actually mean for real people.
1. ChatGPT’s advice scored higher on empathy and helpfulness.

Researchers found that ChatGPT’s responses to personal dilemmas were rated as more empathetic, complete, and helpful than the advice written by professional columnists. This wasn’t a one-off observation—it held across different kinds of questions and topics.
What’s interesting is that these weren’t carefully curated, cherry-picked responses. They were typical ChatGPT outputs, up against advice from respected humans. People just felt the AI responses were more thoughtful. It challenges the assumption that real empathy can only come from a human being.
2. People still prefer humans when it’s their own life.

Even though the AI responses scored higher in blind comparisons, most participants still said they’d rather get advice from a human if it was their own issue. That says something about how deeply we crave connection—especially when we’re vulnerable.
So while AI might technically write a more polished, balanced reply, the idea of turning to a machine when you’re falling apart doesn’t quite sit right for everyone. The emotional weight of being understood still feels like a job for an actual person.
2. In couples therapy scenarios, ChatGPT outperformed human therapists.

Another study, published in PLOS Mental Health, showed that ChatGPT’s responses in fictional couples therapy scenarios were rated as more helpful and insightful than those from real therapists. The AI gave longer, more detailed answers that felt nuanced and non-judgemental.
That doesn’t mean therapists are slacking. It just shows that AI can be trained to reflect back information in a way that feels incredibly affirming—and in some cases, even more articulate than a rushed professional trying to fit everything into a short session.
4. Most people couldn’t tell which replies were written by AI.

Participants in these studies weren’t told who wrote which answers. When asked to guess, they often couldn’t tell ChatGPT from a human, and their guesses were frequently wrong. This blurring of the lines is both fascinating and a little unsettling. It suggests that, at least on the surface, AI can mimic the tone and pacing of real human empathy well enough to pass unnoticed.
5. ChatGPT’s answers were more balanced and less reactive.

One possible reason the AI scored so well is that its answers tend to be less reactive or emotionally loaded. It doesn’t jump to conclusions, and it’s not operating from personal bias, ego, or stress. That makes its advice come across as measured and calm. In emotionally charged situations, that tone can be incredibly grounding. People feel listened to, even if the listener is a machine. There’s no judgement, just reflection—and that’s often all someone needs to feel better.
6. AI still lacks the warmth of real connection.

Despite the strong ratings, researchers pointed out that something was missing in the AI responses. There’s a kind of warmth and unspoken safety that comes from human interaction, and that’s not something algorithms can fake, no matter how well they phrase things. This human quality is what helps people open up in therapy. It’s about eye contact, body language, shared laughter, or just knowing someone’s there with you. AI can simulate support, but it can’t replace presence.
7. AI struggles with certain therapy skills.

Professional therapists are trained to do more than just give good advice. They know how to build trust over time, read between the lines, and guide people through long-term changes. AI, no matter how clever, doesn’t know when to push and when to pause. Things like reading subtle cues, building rapport, or helping someone unpack trauma in a safe, paced way—these are skills that require intuition and care. AI is catching up in tone, but it’s still miles behind in instinct.
8. People trust humans more with their long-term mental health.

Even participants who liked ChatGPT’s advice said they wouldn’t rely on it for serious or ongoing struggles. There’s a line between “useful reflection” and “actual support,” and most people instinctively know the difference. AI might be great for one-off venting, journaling prompts, or reframing a single issue. But for complex emotions that evolve over time, most of us still want someone who can track with us—remember our context, notice our growth, and adapt their care.
9. ChatGPT is better as a companion tool than a replacement.

The real takeaway from all this isn’t that AI should replace therapy—it’s that it could become a really useful supplement. Think of it like emotional first aid. Something you can turn to when you need to talk, reflect, or explore thoughts before speaking with a therapist.
For people on waitlists or unable to afford therapy, AI can offer helpful guidance in the meantime. For those already in therapy, it might even help organise thoughts between sessions. It’s not an either-or—it’s an add-on.
10. It’s great for people who feel awkward asking for help.

Some people struggle to open up in traditional therapy settings. The pressure to be honest, articulate, or emotionally expressive on demand can be intimidating. That’s where AI has a quiet superpower—it doesn’t rush or react. It gives people space to process at their own pace, without feeling judged. For those who need to build emotional confidence before seeking human support, this can be a gentle place to start.
11. We still need ethics and transparency, of course.

There’s a big difference between using AI as a journaling buddy and mistaking it for a licensed mental health provider. Platforms that integrate AI into care settings need to be upfront about its limits, and users need to be aware that this isn’t therapy in the clinical sense. Boundaries matter here. The last thing we want is vulnerable people relying too heavily on something that wasn’t designed to hold that kind of emotional weight. Clarity is key.
12. The future of therapy might be hybrid.

Rather than viewing AI and human therapy as competing forces, some experts are imagining a more blended approach, where therapists use AI tools to help track patterns, offer between-session support, or personalise care.
It’s still early days, but the idea of tech-enhanced emotional care is becoming less science fiction and more strategy. As long as we keep humans at the centre, AI could actually improve the accessibility and quality of mental health support for everyone.