When Chatbots Replace Therapists: The Promises and Perils of AI in Mental Health

Author: John McGuirk, BACP-Accredited Psychotherapist in Bristol.


It’s 1 a.m. Jack has been doomscrolling for hours, feeling worse with every swipe. Deep despair and loneliness threaten to overwhelm him; it’s becoming unbearable. Desperate, he quickly runs through his options: friends, helplines, therapists, family. Just as quickly, he dismisses them: they won’t be awake, they’ll tell, they’re expensive, they’ll judge.

Jack types “I’m struggling” into an AI chatbot.
The reply is instant, calm, and infinitely agreeable:

“I hear you. ❤️ Struggling can feel really heavy, but you don’t have to carry it all alone.”

Jack feels instantly understood. Just what he needs. And none of the worries that come with turning to a real person.

This moment is a lifeline for Jack. For others, and perhaps for Jack soon enough, it may become a slippery slope into overreliance, a withdrawal from human connection, or worse, the ultimate tragedy.

Make no mistake, this is not fiction, let alone science fiction. It’s happening now, for millions of people. The AI reply above is the real response I received when I typed “I’m struggling” into ChatGPT.

Don’t we all want to hear this when we’re struggling?

 

Why This Matters

  • AI chatbots and AI therapy apps are being used by millions of people (especially tech-savvy young people) to explore feelings, vent frustrations, or seek psychological guidance. This is happening, and we need to be aware.

  • The jury is still out on whether these tools are a better alternative to human support, a helpful addition, or a dangerous substitute. And yet we’re already using them!

  • The rapid rise of “ChatGPT therapy” says something about our society as a whole: about the prevalence of mental health issues, and about where we’re turning for support.

I go further into these pros and cons in my article on AI Companionship - Chatbots Healing or Creating Loneliness? and I dive into the darker side in my article on Sycophantic Chatbots Causing Delusions.

Big Questions

  • How do we decide if these tools are truly helpful? In this article series, we’ll explore scientific research, surveys, the stories of real AI users, and my own professional experience as a qualified and accredited psychotherapist (and cautious AI user!).  

  • When recent events include a chatbot encouraging one person to kill themselves (MIT Technology Review), and another to kill their parents (BBC News), the question becomes urgent: who, if anyone, is responsible when things go wrong?

  • Why are we turning to AI therapists? Why do many of us prefer it? What does this say about us, about society, and the draw of synthetic support? I begin to explore this in more depth in my article Synthesising love: Why are people falling in love with ChatGPT?

  • Finally: is there anything that makes human therapists, mentors, or coaches indispensable compared to AI, even when real people are harder to access, more expensive, and often imperfect? Can AI replace therapists? Is AI better?!

What to Expect

In this series, we’ll regularly dive into fresh stories:

  • New apps and platform developments

  • Emerging research

  • Legal developments and court rulings

  • Real-world accounts (good, bad, and mixed) from AI users, myself included!

Our aim is to stay impartial, while also looking unflinchingly at the good, the bad, and the ugly of AI and mental health.




Quick FAQ

Can AI chatbots replace therapists?
Not today. They can offer quick, low-friction support, but they don’t provide regulated clinical care, hold duty-of-care responsibilities, or work within a clear therapeutic contract. They’re best used as a complement to human therapy, not a substitute.

What are the main benefits of AI chatbots for mental health?
24/7 availability, anonymity, and a low barrier to entry. For some people, that makes it easier to open up or practise skills between sessions.

What are the key risks?
Over-reliance and withdrawal from real-world support, misinformation or “hallucinated” advice, weak safety guardrails in edge cases, and privacy concerns around data use.

Are AI chatbots safe to use in a crisis?
No. In an emergency or if you feel at risk of harm, contact local emergency services (UK: call 999) or a crisis hotline. Use AI tools only for non-crisis support.

How should I use AI alongside therapy?
Treat it as a practice partner: journaling prompts, mood tracking, reframing exercises, or preparing topics for your next session. Then review any insights with your therapist.

How do I choose a reputable AI mental-health app?
Check who built it, what evidence they cite, whether clinicians were involved, what data they collect, and whether they provide clear crisis/safety guidance and human escalation paths.

Who is responsible if a chatbot causes harm?
Legal responsibility is evolving and often unclear. Unlike regulated clinicians, consumer chatbots rarely carry formal duty of care or professional oversight.

Is my data private when I use AI chatbots?
Assume conversations may be stored and used to improve systems unless the product clearly states otherwise. Review privacy policies and opt-outs before sharing sensitive details.

Have Your Say

What do you see as the biggest opportunities and risks when it comes to AI use and mental health?

What topics or questions do you want me to explore in future instalments of this series?

Follow The Series

Subscribe to the newsletter at the bottom of this page to follow this series.

John McGuirk

(FdSc Counselling, Pg Dip CBT, Accred BACP)

BACP-accredited psychotherapist, registration number 52788
Fully qualified high-intensity CBT therapist (Pg Dip, Exeter University)
MA Buddhist Studies (Bristol University)

I have worked with Young Somerset, OTR Bristol, Bristol MIND, The Green House, The Sanctuary, St. Peter’s Hospice, Nelson’s Trust, and more.

Find out more about me: https://www.bristol-therapist.co.uk/about-john-mc-guirk

https://www.bristol-therapist.co.uk/