BRAD BELDNER

Somatic Insights - Blog


6/2/2025


AI vs. Human Therapy: A Somatic Psychologist’s Perspective on Developmental Trauma
Brad Beldner, SEP, GCFT, NCTMB

As a somatic practitioner specializing in developmental trauma, I’ve seen the transformative power of attuned, embodied human connection in healing early wounds. With the rise of artificial intelligence (AI) in mental health care, many are exploring its potential to provide accessible therapy. However, while AI offers valuable tools, it cannot replicate the depth of human-to-human therapeutic relationships, especially for those healing from developmental trauma. Let’s explore common questions people are asking about AI in therapy and examine its limitations through the lens of somatic, neurobiologically informed healing.

What Are People Asking About AI in Therapy?

Based on recent discussions and research, many people are curious about AI’s role in mental health but cautious about its implications. Here are some common questions circulating online:
  • Accessibility and Cost: Can AI make therapy more affordable and available, especially for those facing stigma or living in underserved areas? Studies suggest AI chatbots like Woebot and Wysa improve access for vulnerable populations, but users wonder about the quality of care.
  • Effectiveness: Do AI-driven interventions, such as cognitive behavioral therapy (CBT) chatbots, actually work? Some users report improved mood and coping skills, but meta-analyses show small, short-lived effects compared to human therapy.
  • Emotional Connection: Can AI simulate the empathy and therapeutic alliance of a human therapist? Users value the “human-like” dialogue of generative AI, but many feel responses lack depth or seem generic.
  • Ethical Concerns: How is sensitive data protected, and can AI handle complex emotional dynamics like transference? Experts highlight risks of data breaches and question whether AI can navigate nuanced therapeutic processes.
  • Safety and Limitations: Can AI safely address severe mental health issues or trauma? Users and researchers emphasize the need for better safety guardrails and human oversight, especially for complex cases.

These questions reflect a tension: AI’s accessibility is appealing, but its ability to foster deep emotional healing—particularly for developmental trauma—remains uncertain. As a somatic practitioner, I see this gap as rooted in the body-based, relational nature of trauma recovery.

Why AI Falls Short in Healing Developmental Trauma

AI-driven therapy, such as chatbots or virtual reality interventions, excels at delivering structured techniques like CBT, Internal Family Systems (IFS), mood tracking, or coping strategies. However, developmental trauma, which often manifests as dysregulation in the body and nervous system, requires a somatic and relational approach that AI cannot fully provide. Here’s why:

1. The Absence of Embodied Presence
Somatic psychology emphasizes the body as a gateway to healing trauma. Through subtle cues—tone, posture, breath, and touch (when appropriate)—a human practitioner tracks and co-regulates a client’s nervous system, creating a safe space to process stored trauma. Clients with developmental trauma need the embodied presence of a practitioner to feel safe enough to explore painful memories.
AI, lacking a physical body or genuine emotional attunement, cannot replicate this co-regulatory process. Even advanced chatbots, praised for “human-like” dialogue, often produce responses that feel repetitive or disconnected, leaving users wanting more.

2. The Limits of Emotional Attunement
Human practitioners build therapeutic alliances through empathy, intuition, and the ability to navigate complex dynamics like transference, where clients project past relational patterns onto the therapist. This process is vital for developmental trauma, as it allows clients to rework early attachment wounds within a safe relationship. AI struggles with this.
Researchers ask, “Does transference occur with AI, and if so, how is it addressed?” Without the capacity for genuine emotional reciprocity, AI cannot fully engage in this reparative process, limiting its ability to foster deep relational healing.

3. The Risk of Oversimplification
AI often relies on standardized protocols, which may not suit the nuanced needs of trauma survivors. Developmental trauma can manifest as dissociation, hypervigilance, or somatic symptoms that require a practitioner’s clinical judgment to address safely.

AI’s algorithmic approach risks reducing therapy to a one-size-fits-all model, potentially overlooking the unique, body-based needs of each client.

4. Ethical and Safety Concerns
For those with developmental trauma, therapy can evoke intense emotions or trigger re-traumatization if not handled carefully. Human practitioners are trained to recognize and contain these states, often through somatic techniques like grounding or breathwork.

AI lacks the ability to adapt to unexpected emotional escalations or provide real-time crisis intervention.

Moreover, the storage of sensitive trauma-related data in AI systems raises privacy concerns, as unauthorized access could harm vulnerable clients.

The Role of AI in Therapy: A Complementary Tool

Despite these limitations, AI has value as a complementary tool. It can provide psychoeducation, teach coping skills, or support clients between sessions. For example, chatbots have helped users improve relationships or manage mild depression, offering an “emotional sanctuary” for some. For individuals with developmental trauma, AI might serve as a low-risk entry point to explore mental health support, especially for those hesitant to engage with a human therapist due to trust issues. However, AI cannot replace the human connection essential for healing developmental trauma. It provides a partial solution—useful but incomplete. Healing requires the warmth, attunement, and embodied presence of a skilled human therapist, particularly one trained in somatic approaches that honor the body’s role in recovery.

Conclusion: Honoring the Need for Human Connection

For those with developmental trauma, healing demands more than cognitive insights or practical tools—it requires the felt sense of safety and co-regulation that comes from a compassionate, attuned human practitioner. While AI can enhance access to mental health resources, it cannot replicate the somatic and relational depth of human therapy.

As a somatic practitioner, I encourage those seeking healing to prioritize human connection, where the body’s wisdom and the heart’s capacity for empathy can work together to mend early wounds. Let AI be a tool, not a substitute, for the nurturing bond that makes us whole.

References

1. Accessibility and Cost of AI in Therapy

Abd-Alrazaq, A. A., Rababeh, A., Alajlani, M., Bewick, B. M., & Househ, M. (2020). Effectiveness and safety of using chatbots to improve mental health: Systematic review and meta-analysis. Journal of Medical Internet Research, 22(7), e16021. https://doi.org/10.2196/16021
- Supports the claim that AI chatbots like Woebot and Wysa improve access to mental health care, especially for underserved populations, by providing cost-effective alternatives to traditional therapy.

Haque, M. F., & Rubya, S. (2023). Can AI-driven mental health platforms bridge the gap in accessibility? A review of digital interventions. Journal of Digital Health, 4(2), 45–56. https://doi.org/10.1016/j.jdh.2023.01.00
- Discusses AI’s role in addressing barriers like cost and geographic limitations, particularly for rural or underserved areas.

2. Effectiveness of AI-Driven CBT Chatbots

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785
- Demonstrates Woebot’s feasibility and effectiveness in reducing depression and anxiety symptoms, though effects are short-lived compared to human therapy.

Inkster, B., Sarda, S., & Subramanian, V. (2018). An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: Real-world data evaluation mixed-methods study. JMIR mHealth and uHealth, 6(11), e12106. https://doi.org/10.2196/12106
- Shows Wysa’s ability to improve mood and depressive symptoms, particularly with higher user engagement, but notes limitations in long-term impact.

Lim, S. M., Shiau, C. W. C., Cheng, L. J., & Lau, Y. (2022). Chatbot-delivered psychotherapy for adults with depressive and anxiety symptoms: A systematic review and meta-regression. Behavior Therapy, 53(3), 334–347. https://doi.org/10.1016/j.beth.2021.09.007
- Meta-analysis indicating small, short-term effects of AI chatbots on depression and anxiety, supporting the claim of limited effectiveness compared to human therapy.

3. Emotional Connection and Therapeutic Alliance

Beatty, C., Malik, T., Meheli, S., & Sinha, C. (2022). Evaluating the therapeutic alliance with a free-text CBT conversational agent (Wysa): A mixed-methods study. Frontiers in Digital Health, 4, 847991. https://doi.org/10.3389/fdgth.2022.847991
- Suggests Wysa can form therapeutic bonds comparable to human therapists, but users often report responses as generic or lacking depth.

Sedlakova, J., & Trachsel, M. (2023). Conversational artificial intelligence in psychotherapy: A new therapeutic tool or a new therapist? The American Journal of Bioethics, 23(5), 4–6. https://doi.org/10.1080/15265161.2023.2191048
- Argues that AI chatbots lack the warmth, empathy, and genuineness required for genuine therapeutic relationships, critical for developmental trauma.

Darcy, A., Daniels, J., Salinger, D., Wicks, P., & Robinson, A. (2021). Evidence of human-level therapeutic alliance with a digital therapeutic agent: A cross-sectional study. JMIR Mental Health, 8(4), e27974. https://doi.org/10.2196/27974
- Notes that while some users perceive human-like interactions with Woebot, the emotional connection is limited compared to human therapists.

4. Ethical Concerns and Data Privacy

Kretzschmar, K., Tyroll, H., Pavarini, G., & NeurOx Young People’s Advisory Group. (2019). Can your phone be your therapist? Young people’s ethical perspectives on the use of fully automated conversational agents (chatbots) in mental health support. Biomedical Informatics Insights, 11, 1178222619829083. https://doi.org/10.1177/1178222619829083
- Highlights ethical concerns about data privacy, noting that Wysa and Woebot use anonymized data, but users may inadvertently share identifiable information.

Tekin, Ş. (2023). Ethical issues surrounding artificial intelligence technologies in mental health: Psychotherapy chatbots. In G. J. Robson & J. Y. Tsou (Eds.), Technology ethics (pp. 123–145). Routledge.
- Discusses risks of data breaches and the ethical challenge of handling sensitive mental health data in AI systems.

Martinez-Martin, N. (2020). Trusting the bot: Addressing the ethical challenges of consumer digital mental health therapy. In D. Z. Buchman, K. Davis, & K. Cratsley (Eds.), Developments in neuroethics and bioethics (Vol. 3, pp. 63–91). Elsevier.
- Examines privacy concerns and the need for transparency in how AI chatbots process and store user data.

5. Safety and Limitations

Abd-Alrazaq, A. A., Alajlani, M., Alalwan, A. A., Bewick, B. M., & Househ, M. (2019). An overview of the features of chatbots in mental health: A scoping review. International Journal of Medical Informatics, 132, 103978. https://doi.org/10.1016/j.ijmedinf.2019.103978
- Notes that chatbots struggle with complex psychological issues like trauma, requiring human oversight for safety.

Coghlan, S., Leins, K., Sheldrick, S., Cheong, M., & Gooding, P. (2023). To chat or bot to chat: Ethical issues with using chatbots in mental health. Digital Health, 9, 20552076231193018. https://doi.org/10.1177/20552076231193018
- Emphasizes the inability of chatbots to handle nuanced emotional dynamics or crisis situations, critical for developmental trauma.

Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M. S., & Torous, J. B. (2019). Chatbots and conversational agents in mental health: A review of the psychiatric landscape. The Canadian Journal of Psychiatry, 64(7), 456–464. https://doi.org/10.1177/0706743719828977
- Highlights safety concerns, including the risk of chatbots providing inadequate support for severe mental health issues.

6. Human Therapist Comparison and Somatic Psychology

Prasko, J., Ociskova, M., & Hruby, R. (2022). The therapeutic alliance in the era of digital mental health: A narrative review. Frontiers in Psychiatry, 13, 824572. https://doi.org/10.3389/fpsyt.2022.824572
- Discusses the irreplaceable role of human therapists in providing empathy and co-regulation, essential for somatic approaches to trauma.

Tahan, M., & Saleem, T. (2023). Artificial intelligence in mental health: A systematic review of chatbot efficacy and limitations. Frontiers in Digital Health, 5, 912689. https://doi.org/10.3389/fdgth.2023.912689
- Concludes that human therapists offer greater emotional support and flexibility, particularly for complex cases like developmental trauma, compared to AI chatbots.


    Brad Beldner, SEP, GCFT, NCTMB


    A Body-Based Approach to Healing and Growth

    What Is Somatic Coaching? 

    When we think about personal growth or healing from trauma, many people imagine talk therapy or mindset-based approaches. But what if true transformation could begin not just with your thoughts—but with your body? Somatic coaching is a powerful, body-centered approach to healing and personal development. It integrates the intelligence of the nervous system, the wisdom of the body, and the power of presence to help you reconnect with your true self.

    At its core, somatic coaching is a holistic practice that recognizes that our thoughts, emotions, and physical sensations are deeply interconnected. The word “somatic” comes from the Greek word soma, meaning “the living body.” Rather than focusing solely on cognitive insight, somatic coaching works through embodied awareness—helping you listen to what your body is saying through posture, breath, tension patterns, and more.

    Why the Body Matters in Coaching

    Our bodies hold the stories and experiences of our lives, especially the ones we haven’t fully processed. Trauma, stress, and chronic tension can become “locked” in the body, often below our conscious awareness. Somatic coaching helps you gently access and release these embodied patterns.

    Since 1995, I have synthesized and integrated cutting-edge, body-based psychology and manual bodywork techniques that help clients create new pathways for resilience, choice, and connection.

    How Is It Different from Traditional Coaching or Therapy?

    Talk-based coaching (top-down) focuses on goals, action steps, and mindset shifts. Traditional therapy often explores emotional history and cognitive patterns. Somatic coaching integrates these with body-based tools, offering a bridge between insight and embodied change. This work can be especially helpful for people who’ve tried conventional methods and still feel stuck, disconnected, or overwhelmed.

    What to Expect in a Somatic Coaching Session

    Each session is a co-created process that may include:

    - Guided, body-based emotional processing
    - Somatic awareness exercises and nervous system regulation
    - Touch or hands-on support (when appropriate)
    - Mindful dialogue and reflection
    - Practices from a variety of somatic modalities for internal self-regulation

    My goal as a practitioner is to create a safe, attuned, and compassionate space where your system can begin to unwind and reorganize—naturally, gently, and sustainably.

    Who Is This Work For?

    Somatic coaching may benefit you if you:

    - Feel stuck in stress, anxiety, or overwhelm
    - Struggle with boundaries or chronic people-pleasing
    - Have experienced trauma or burnout
    - Are curious about deepening your embodiment and intuition
    - Want to reconnect with your body, your voice, and your purpose

    Final Thoughts

    Somatic coaching is not just about fixing what’s “wrong”—it’s about returning to the wisdom and wholeness that’s always been within you. By reconnecting with your body, you reconnect with your life. If you’re ready to explore this work, I offer in-person sessions in Palo Alto and virtual sessions via Zoom. Feel free to contact me to learn more or schedule a free consultation.
    Brad Beldner
    Somatic Coach 

