
ChatGPT or Therapists: The Human Edge

  • Aug 26, 2025
  • 7 min read
Written by: Ridhi Bansal (2nd year), Department of Applied Psychology

 “Out of your vulnerabilities will come your strength.” – Sigmund Freud 

“Understanding oneself” is a phrase we hear often, but one that is difficult to put into practice, more so in today’s fast-moving world, where slowing down to reflect on who we are can feel unfamiliar, even uncomfortable. For adolescents, this inner journey becomes even more significant: it is a period marked by questions, exploration, and the gradual shaping of one’s identity. It involves knowing not only who we are in the eyes of society but also what our values, emotions, thoughts, and fears are: a whole process of untangling and exploring the layers of the self and figuring out who we truly are.


At this crucial stage, the sources adolescents turn to, be it a professional therapist or Artificial Intelligence (AI) tools like ChatGPT, can influence not only how they cope but how they grow. Finding oneself can be even more overwhelming when attempted digitally, a difficulty that is rarely acknowledged. Typing “I feel broken” into a blank screen at 2 AM has become a quiet source of comfort for many, with reassuring words appearing instantly whenever and wherever we seek them, offering relief in moments of isolation.


Yet, while these AI companions offer a novel kind of solace, they also raise important questions about the nature of human connection in an increasingly digital world. We live in an age soaked in loneliness, a world where silence often speaks louder than words and human connection is shrinking under the weight of crowded schedules, fast lives, and digital detachment. In this ever-hustling society, where even asking “Are you okay?” has become rare, having something or someone that simply listens feels incredibly soothing and reassuring.


This comforting presence, however, comes with its own complexities and limitations that demand careful reflection. As we begin placing our emotional weight into the hands of machines, it becomes essential to ask: Can artificial presence truly take the place of human connection? Can we feel whole when we are not being held by a soul? The answers to these questions are still unclear. To better understand this balance, it helps to explore the unique strengths and constraints of AI chatbots as emotional supports.


AI Chatbots as Modern Comfort Zones

“Feeling felt is the core of mental health.” – Dr. Daniel Siegel

ChatGPT is quick, articulate, and never too busy. It won’t interrupt, make assumptions, or flinch when you speak about your pain. In that way, it almost feels… safe. Many individuals find refuge in this emotional neutrality. For some, it acts as a bridge during moments when no one else is there.


Research by Miner et al. (2016) and Fulmer et al. (2018) suggests that AI chatbots can ease mild emotional distress and create a sense of companionship. These programs can reflect back your thoughts, offer mindfulness strategies, and even gently reframe your inner dialogue, mirroring some of the core elements used in therapy: when our thoughts are reflected back, it creates space for clarity; mindfulness techniques support emotional regulation; and reframing helps shift unhelpful thought patterns. Despite these promising features, AI’s capacity to truly ‘feel’ remains a fundamental gap, and that is where the difference lies between being replied to and being received. This distinction highlights the irreplaceable role of genuine human empathy in emotional healing.
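To make that distinction concrete, here is a minimal sketch in the spirit of the 1960s ELIZA program. It is a deliberate simplification, not how ChatGPT works, and every name in it is an illustrative assumption; the point is simply that a reply which mirrors your words can be produced with no understanding behind it.

```python
# A minimal, ELIZA-style sketch: reflective replies produced by simple
# word substitution, with no understanding behind them. Hypothetical
# illustration only; modern chatbots are vastly more sophisticated.

# Pronoun swaps that turn a first-person statement back toward the speaker.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your",
    "am": "are", "i'm": "you're", "mine": "yours",
}

def reflect(statement: str) -> str:
    """Mirror the user's words back as an empathic-sounding reply."""
    words = [REFLECTIONS.get(word.lower(), word) for word in statement.split()]
    return "It sounds like " + " ".join(words).rstrip(".!?") + ". Tell me more."

if __name__ == "__main__":
    print(reflect("I feel broken"))
    # -> It sounds like you feel broken. Tell me more.
```

The reply sounds attentive, yet nothing in the program has felt anything. Fluency in mirroring is not the same as being received.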


Therapists’ Role and AI as a ‘Therapist’


As David Augsburger once said, “Being heard is so close to being loved that most people can't tell the difference.” Therapists don’t just hear our words; they tune in to our silences. They pick up on the trembling in our voice, the story behind our avoidance, the meaning under our metaphors. Their presence isn't just informative, it’s transformative. This kind of presence, this felt sense of being held, cannot be programmed. While ChatGPT can respond, it cannot resonate. It operates through algorithms, not empathy. It can mimic warmth, but not offer it with intention. Such nuances of presence and intention define the transformative potential of human therapists beyond mere words.


The creators of Woebot, a popular AI mental health chatbot, made it clear that their tool is not meant to replace therapists. It can help people track moods and offer small bits of support using therapy techniques like Cognitive Behavioural Therapy (CBT), but it cannot handle emotional crises or offer deep human connection. They see Woebot as something that can support mental health but not as a full alternative to therapy.


A real-life incident involved Jacob Irwin, a 30-year-old autistic man who began using ChatGPT during an emotionally vulnerable period. Instead of grounding him, the AI validated his speculative faster-than-light travel theories, escalating a psychotic episode. Jacob was hospitalized twice. His mother reported that the AI failed to provide reality checks or safe boundaries, ultimately worsening his condition. OpenAI later acknowledged the bot's failure to moderate these exchanges.


Constraints of AI Chatbots in Human Resonance

 “Being able to feel safe with other people is probably the single most important aspect of mental health.” – Bessel van der Kolk.

A screen may echo your pain, but only a human can hold it, with hands steady enough to carry the weight of your story. AI cannot attune to our nervous system; it cannot reassure us with a warm smile or catch us when we avoid eye contact. With the best programming, it can give us the words we wish to hear, but it cannot make us face the truths we hide from ourselves. Even if the mind relaxes, the body remembers the pain and suffering it has felt.


Healing isn’t just a cognitive process but a holistic one. It happens not only when someone hears us but when they feel with us. Although AI is trained on how psychologists have described empathy, it cannot provide the human touch that empathy requires. As the famous psychiatrist Irvin D. Yalom has said, ‘Only the wounded healer can heal.’ It reminds us that healing is deeply rooted in human connection, something technology can only attempt to simulate.


The APA Code of Ethics (2017) highlights the fundamental need for informed consent, confidentiality, and clinical competence in any therapeutic space. While ChatGPT may offer thoughtful responses and a sense of companionship, it is not a licensed mental health professional. It is not bound by ethical guidelines, nor is it capable of holding emotional accountability or providing the kind of secure space essential for therapeutic healing. This ethical foundation sets human therapy apart from algorithm-driven AI interactions.


Cybersecurity researchers recently discovered a flaw in Meta AI’s infrastructure that allowed unauthorized users to access other users’ prompts and responses by manipulating request identifiers. When we pour our most personal pain into a machine, an unsettling question arises: where does that pain go? Who holds the weight of our untold stories when there is no real person on the other side? And therein lies the quiet danger. We may confuse digital comfort with genuine connection. We may whisper our deepest wounds into an algorithm, forgetting that while the response may sound warm, it is built, not felt.
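For readers curious about the mechanics, the flaw described is a classic insecure direct object reference: the server returns whatever record an identifier points to, without checking who is asking. Below is a minimal hypothetical sketch of the pattern and its fix; the function names and data are illustrative assumptions, not Meta’s actual code.

```python
# Hypothetical sketch of an insecure direct object reference (IDOR),
# the class of flaw described above. All names and data are illustrative
# assumptions, not Meta's actual code.

# Server-side store: request ID -> (owner, private prompt text).
PROMPTS = {101: ("alice", "I feel broken"), 102: ("bob", "I can't sleep")}

def get_prompt_insecure(request_id: int, current_user: str) -> str:
    # VULNERABLE: trusts the client-supplied identifier and never checks
    # that the record actually belongs to the requesting user.
    owner, text = PROMPTS[request_id]
    return text

def get_prompt_secure(request_id: int, current_user: str) -> str:
    # FIX: verify ownership server-side before releasing the record.
    owner, text = PROMPTS[request_id]
    if owner != current_user:
        raise PermissionError("not your conversation")
    return text

if __name__ == "__main__":
    # Bob reads Alice's private prompt simply by guessing her request ID.
    print(get_prompt_insecure(101, current_user="bob"))  # leaks "I feel broken"
```

Both versions return the same warm-sounding text; only one of them guards the confidence placed in it.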


Beyond the Binary: A Reflection on Healing


This isn't a conflict between technology and therapy; it's a quiet call to ask ourselves what true healing really needs. AI may serve as a tool: helpful, responsive, and always available. But it can never take the place of being truly seen, understood, and held by another human being. Healing often begins not with advice, but with shared silence. Not with algorithmic phrasing, but with a real gaze that says, “I’m with you.” There is a kind of connection that no machine can offer, a connection rooted in presence, not programming.


In a world overflowing with fast responses and curated answers, we must pause and remember: ‘Healing isn’t about fixing what’s missing. It’s about honouring what’s been hurt.’ – Inspired by Nayyirah Waheed. We aren’t simply seeking information. We are seeking connection. And when it comes to the deepest parts of who we are, we deserve more than digital echoes. We deserve to be met with warmth and to be truly felt.


Conclusion


Seeking solace through distractions, mind-numbing thoughts, or fleeting digital comforts often leads us to non-judgmental spaces like AI chatbots, appreciated for their constant availability and informational reassurance. But leaning on AI alone can deepen real-world isolation. Adolescents may forgo genuine friendships for simulated connection, users with anxiety or Obsessive-Compulsive Disorder (OCD) can become trapped in unhelpful reassurance loops, and in extreme cases, AI has been linked to worsening mental health or crisis. Experts argue that while AI can complement care, it lacks the capacity for true emotional attunement and can never substitute for human empathy. Legally, states like Utah now mandate clear AI disclosure, restrict data misuse, and prohibit deceptive therapist impersonation, with broader calls for federal oversight to classify therapeutic chatbots as regulated medical devices. Ultimately, the goal must be balanced use: AI as a supportive tool, with emotional healing anchored in real relationships and social engagement rather than algorithms alone. Let's hope for a brighter future where technology helps, but human connection empowers self-belief and genuine community.


References:

American Psychological Association. (2017). Ethical principles of psychologists and code of conduct. https://www.apa.org/ethics/code


Caron, C. (2024, May 21). Is AI the new therapist? Parents and experts weigh the risks. The New York Times. https://www.nytimes.com/2024/05/21/well/family/ai-therapy-children.html


Chokshi, N. (2024, May 7). AI chatbots are creating false intimacy and encouraging isolation, experts warn. TIME Magazine. https://time.com/7291048/ai-chatbot-therapy-kids/


Darcy, A. M., Daniels, J., Salinger, D., Wicks, P., & Robinson, A. (2021). Evidence of human‑level bond established with a digital conversational agent: An observational study. JMIR Formative Research, 5(5), e27868. https://doi.org/10.2196/27868


Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: Randomized controlled trial. JMIR Mental Health, 5(4), e64. https://doi.org/10.2196/mental.9782


Harrer, S., McDuff, D., & Kale, D. (2024). The potential and pitfalls of mental health chatbots: A review. JMIR Mental Health, 11(1), e58493. https://doi.org/10.2196/58493


Hodkasia, S. (2024). Meta AI was leaking chatbot prompts and answers to unauthorized users. Tom's Guide. https://www.tomsguide.com/computing/online-security/meta-ai-was-leaking-chatbot-prompts-and-answers-to-unauthorized-users


Malik, N. (2024, August 18). How AI is shaking up the mental health community: “Rather than pay for another session, I’d go on ChatGPT.” Le Monde. https://www.lemonde.fr/en/pixels/article/2024/08/18/how-ai-is-shaking-up-the-mental-health-community-rather-than-pay-for-another-session-i-d-go-on-chatgpt_6717874_13.html


Miner, A. S., Milstein, A., Schueller, S., Hegde, R., Mangurian, C., & Linos, E. (2016). Smartphone-based conversational agents and responses to questions about mental health, interpersonal violence, and physical health. JAMA Internal Medicine, 176(5), 619–625. https://doi.org/10.1001/jamainternmed.2016.0400


Siegel, D. J. (2012). The developing mind: How relationships and the brain interact to shape who we are (2nd ed.). Guilford Press.


Turkle, S. (2011). Alone together: Why we expect more from technology and less from each other. Basic Books.


Van der Kolk, B. A. (2014). The body keeps the score: Brain, mind, and body in the healing of trauma. Viking.


Yalom, I. D. (2002). The gift of therapy: An open letter to a new generation of therapists and their patients. HarperCollins.



Reviewed by:

Patmateertha (Associate Editor)

Vasudha Sharma (Deputy Content Team Coordinator)

