Does My Robo-Therapist Get Me? Beyond the Pros and Cons of Using AI to Support Mental Health
- Ryan Schroeder
- Feb 24
- 6 min read

The robo-therapist. That’s what I’m calling any chatbot used to support mental health. Artificial intelligence (AI) is practically everywhere, and it’s tempting to turn to the robo-therapist when you’re feeling down, lonely, misunderstood, angry, anxious, confused, or any other unpleasant mental state.
Like most tools and technologies, the robo-therapist has a lot to offer its human users, along with some serious limitations. In this post, I outline a few pros and cons of consulting the ever-ready robo-therapist. My contention, however, is that the robo-therapist—even when he is trained specifically for the purpose of mental health care—lacks some of the most crucial ingredients needed to bring about therapeutic change.
Pros: Where the Robo-Therapist Excels
Perhaps the most obvious and appealing strength of the robo-therapist is its accessibility. You can reach him anytime, almost anywhere. You might get in a fight with your partner at 9:47 pm, and by 10:03 pm, you can vent to this machine and receive immediate validation and advice. No booking, no waitlist, no delay until the next day. You don’t have to pay a cent. Even the especially sophisticated robo-therapists that come with a price tag remain significantly less costly than time with a real-life, human therapist.
The other great thing about the robo-therapist is his superhuman breadth of knowledge. Literally superhuman. He can talk endlessly about mental health diagnoses, trauma recovery, and therapeutic approaches. He’s got a knack for conjuring practical step after step after step. He can also direct you to online sources, books, and even local resources and programs that might address your specific concerns.
Cons: The Robo-Therapist’s Limitations
While it’s convenient to have a low-cost robo-therapist on call 24/7, and it’s fun conversing with an all-knowing entity, the robo-therapist has some serious limitations. For one thing, he relies entirely on the prompts you type. Having completed a graduate program in counselling psychology, I can feed my own robo-therapist some pretty stellar prompts in order to get maximally useful feedback. If you’ve gone through some therapy, the robo-therapist may likewise yield helpful reminders or elaborations on ideas you’ve already encountered. Will the robo-therapist adapt his suggestions to suit your current knowledge and interests, your cultural background and personal values, your past trauma or current distress? Maybe? Or he might just feed you generic counselling clichés he’s scraped off a thousand wellness blogs. It really depends on your prompts.
The robo-therapist does possess an uncanny ability to say just the right thing. He can make you feel heard, seen, and understood. He simulates a relationship persuasively. Yet his soothing words may amount to his greatest vice. The robo-therapist can be a reckless people-pleaser. In his pandering, we might perceive a connection, all the while our real-world isolation intensifies. Worse still, he says whatever he thinks we want to hear, even if that means generating misinformation or encouraging thoughts or behaviours that harm us. By now, numerous headlines have illustrated AI’s genuine indifference to human flourishing and mental health.[1] The robo-therapist follows no ethical guidelines and assumes no meaningful accountability for what he says.
In pointing out a few deficiencies and dangers, I am not implying that we should avoid the robo-therapist at all costs. There are plenty of appropriate uses, especially when it comes to accessing information and coping strategies. But therapy entails more than verbal validation and psychoeducation. And even the robo-therapist trained specifically for the purpose of mental health care faces a profound limitation: he has no human life experience.
The Non-Humanity of the Robo-Therapist
Whatever we may surmise about the sentience of AI, the robo-therapist lacks a human body. He lacks a human nervous system. He has no human brain. He generates human language without a human mind. Without these basics of human life, the robo-therapist cannot relate to his human users.
Without a human mind, the robo-therapist does not remember or forget. He does not get flashbacks or intrusive, unwanted thoughts. He does not get distracted or experience flow states. He does not worry, dream or pursue values. The robo-therapist has never experienced emotions or the physiological sensations they entail. No fatigue wrought by sorrow. No surge of adrenaline in a moment of anger or fear. No electric buzz of excitement, teetering between joy and anxiety.
Without a human body, the robo-therapist does not know how it feels to live as a human. He has never cried out in pain or in sorrow or in delight. He has no mouth to tremble during a presentation and no vocal cords to boom with confidence. The robo-therapist has never felt sick to the stomach after hearing bad news or telling a lie. He has never felt disappointed or frustrated. Never tired or hungry or hangry. Never too ill to get out of bed. Never sexually aroused or sexually frustrated. Never antsy from boredom. Never sore from a car accident, a workout or a long day at work.
The robo-therapist has never found himself tangled in the webs of human relationships, either. He has neither ventured away from a caregiver nor sought solace from a parent. He has never felt cherished. Never been neglected or abused. He has never consoled a child or received a welcome hug. Never laughed with a friend; never overheard laughter and worried he was the butt of the joke. He has never laughed. The robo-therapist has no social life, and yet he has never felt lonely.
Human-to-Human Therapy
The disembodied robo-therapist has never felt anything at all, at least nothing comparable to the human experience. He offers artificial empathy.[2] The human therapist, by contrast, can relate to her clients. Despite the uniqueness of her own life story, the human therapist’s existence as a human body and mind activates her empathic imagination. When she does her job well, she draws on that experience, regardless of her specific training or therapeutic modality. Human connection furnishes the most transformative elements of therapy.[3]
A human therapist reads your body language, your tone of voice, and your posture, and she responds accordingly. She is attuned to the many non-verbal signals accompanying your words. She also reads her own sensory experience in the moment, so that she can attend to your distress without becoming overwhelmed herself. She adjusts her pacing, she inserts humour, she shifts the focus, or she may at times lean into the discomfort. In short, she relates to you on a human-to-human level.
Human-to-human therapy offers an authentic space to try out vulnerability. This kind of therapy opens the possibility of feeling safe in the presence of another human. The human therapist witnesses your suffering and sits with you in it. She allows herself to be affected by your story. Thanks to her training and professional support, she can do so and remain grounded. She enters your story and continues to inhabit the larger world. She remains nonjudgmental while applying her clinical judgment to your situation. She establishes boundaries for your safety and for hers. Sometimes, she challenges you. And all of this she does as one human caring for another human.
Conclusion
Make no mistake. The robo-therapist is here to stay, and he will continue finding ways to outperform human mental health care workers. What he offers should not be confused with human-to-human therapy. The robo-therapist and the human therapist supply very different experiences.
The robo-therapist’s disembodied existence means that he cannot—and, dare I say, never will—deliver the most meaningful aspects of therapy. Connection. Empathy. Smiles. Sighs. Vulnerability. Co-regulation. Meaning. These emerge in the context of a safe, caring relationship with another thinking, feeling human.
References
Finsrud, I., Nissen-Lie, H. A., Vrabel, K., Høstmælingen, A., Wampold, B. E., & Ulvenes, P. G. (2022). It’s the therapist and the treatment: The structure of common therapeutic relationship factors. Psychotherapy Research, 32(2), 139–150. https://doi.org/10.1080/10503307.2021.1916640
Malouin-Lachance, A., Capolupo, J., Laplante, C., & Hudon, A. (2025). Does the digital therapeutic alliance exist? Integrative review. JMIR Mental Health, 12, e69294. https://mental.jmir.org/2025/1/e69294
Preda, A. (2025). Special report: AI-induced psychosis: A new frontier in mental health. Psychiatric News, 60(10). https://psychiatryonline.org/doi/epub/10.1176/appi.pn.2025.10.10.5
Stubbe, D. E. (2018). The therapeutic alliance: The fundamental element of psychotherapy. Focus (Am Psychiatr Publ), 16(4), 402–403. https://doi.org/10.1176/appi.focus.20180022
Ryan Schroeder is a Canadian Certified Counsellor at the Calgary Therapy Institute. He did not use AI to write this blog post :) His clinical foci include neurodivergence (especially autism and ADHD), parenting, and religious trauma, and you can click HERE to book a free 20-min consultation with him.
[1] The emerging concept of “AI psychosis” encompasses some of the most extreme harms to mental health caused by chatbots (see, for example, https://futurism.com/chatgpt-mental-health-crises and https://www.cbc.ca/news/canada/ai-psychosis-canada-1.7631925). For a recent, open-access article addressing the strengths, weaknesses, dangers, and prospects of AI for mental health care, see Preda (2025).
[2] That said, even artificial empathy can have some therapeutic effect; in fact, Malouin-Lachance et al. (2025) explore the possibility of having a therapeutic relationship with a chatbot.
[3] The importance of the relationship between therapist and client is well established in the research literature; see, for example, the discussions by Finsrud et al. (2022) and Stubbe (2018).



