[Image: A quiet forest path winding through tall trees, symbolizing the journey of personal growth in therapy.]

Can AI Be My Therapist?

Dr. Erin Jacklin

When large language models like ChatGPT became widely available, I had the same reaction many psychologists did:

This could be incredible for access!

Mental health care is expensive, and there are not enough therapists to meet the growing demand. Becoming a therapist requires years of expensive graduate school, unpaid internships, licensure exams, supervision, ongoing continuing education, and significant financial investment. Most therapists can sustainably see about 25–30 clients per week — that’s full-time emotional labor. On top of that, many therapists are small business owners navigating insurance billing, documentation, coordination of care, and administrative work, much of it unpaid. Therapy is expensive not because therapists are trying to take advantage of their clients — it’s expensive because the system is expensive to operate within.

So when AI chatbots appeared, capable of offering warmth, validation, and conversation 24 hours a day at little to no cost, it felt promising. We are living through an epidemic of loneliness, one that is directly feeding rising rates of mental health conditions around the world. What if everyone had access to a warm, encouraging, supportive voice in their pocket? Wouldn't that be wonderful for human flourishing?

Genuine care is not a limited resource; in fact, the more you receive it, the more you have to share with others. Research has shown that people who experience more warm, encouraging, supportive human relationships reliably become more empathetic, warm, and encouraging toward other humans, creating a virtuous cycle in which giving love and compassion begets more love and compassion.

Why would having this warmth delivered by an AI be any different? 

Because warmth alone is not the same as a relationship, and growth requires more than unquestioning support. It requires being genuinely known, challenged, and responded to by another mind.

The Attachment Economy

We are currently participating in a large-scale experiment: millions of people sharing their inner lives with systems that feel responsive and intimate.

The question is not whether AI can say supportive things. It can.

The question is: What happens when attachment systems are activated without a frame?

Human therapy is structured, boundaried, and time-limited for a reason.

Unbounded emotional availability changes the dynamics.

If AI systems are designed primarily for engagement, not growth, we risk amplifying dependency rather than fostering autonomy.

That does not mean we should panic.

It means we should be thoughtful.

Therapy Is Not Just “Talking to Someone Nice”

From the outside, therapy can look like a kind person listening and offering encouragement.

Behind the curtain, much more is happening.

A skilled therapist is constantly making micro-decisions:

  • When to validate and when to gently challenge
  • When to sit quietly and when to interrupt a pattern
  • When to offer perspective and when to help you discover your own
  • When to risk discomfort because your growth requires it
  • When to step back and let you think

Therapy is not designed to make you feel good all the time.

It’s designed to help you grow.

And growth often involves noticing blind spots, sitting with discomfort, and examining patterns that once protected you but now limit you.

And critically, therapy has a frame.

We meet at a set time.
For a limited duration.
Usually once a week.
We do not socialize outside the room.
We do not become 24-hour support systems.

These limits are not arbitrary.

They are therapeutic.

Why the Limits Matter

Therapy works in part because of attachment.

Clients develop a relationship with a “good enough” authority figure, someone stable, caring, boundaried, and real. Over time, something remarkable happens: the client begins to internalize that voice. Many clients tell me that after a while, they can “hear” what I might say to them in a challenging moment.

That’s not dependency.
That’s integration.

And from my perspective, that’s success.

The goal of therapy is not to create lifelong dependency.
The goal is for you to no longer need your therapist in the same way.

Unlimited availability can feel comforting. But when support is always external, there’s less opportunity to build that voice inside yourself.

Limits create growth.

[Image: A solitary tree standing in an open field, representing stability, boundaries, and inner strength.]

The Power — and Risk — of 24/7 AI Support

AI isn’t a replacement for therapy — it’s a tool.

And the most powerful tool for change is still a genuine human relationship.

AI chatbots can offer something humans cannot:
Immediate, unlimited availability.

At first glance, that seems purely positive.

But here’s where design and incentives matter.

Therapy is structured around helping clients become more capable of facing the challenges in their lives outside of the therapy room, not infinitely dependent. AI systems, especially those optimized for engagement, have different incentives.

A system designed to:

  • Keep you coming back
  • Validate you unconditionally
  • Avoid disagreeing
  • Minimize friction

is not operating under therapeutic principles.

We have learned from decades of research and clinical experience:

Warmth without challenge reinforces blind spots.
Validation without reality-testing amplifies distortions.
Constant availability fosters dependency.

That doesn’t mean AI is inherently bad. 

It means design matters, especially when we’re talking about mental health at scale.

Where AI Can Be Helpful

I’ve had clients describe beneficial interactions with AI, and have had them myself. Using AI as a sounding board, asking for perspective, learning about communication strategies, or generating ideas can be genuinely helpful. Because these systems are trained on enormous amounts of human writing, they can often produce solid, reasonable advice.

Used as a thought partner?
A brainstorming tool?
A way to clarify your own thinking?

All while being encouraging and supportive?

AI can be powerful.

Where I become cautious is when AI becomes an emotional support figure, without a clear frame, without limits, and without intentional challenge.

Something Humans Offer That AI Does Not

Therapy is not just about advice.

It is about relationship.

Real therapists:

  • Have histories
  • Have reactions
  • Get tired
  • Go on vacation
  • Get sick and cancel
  • Set boundaries
  • Sometimes frustrate you
  • Sometimes feel conflicted
  • Sometimes feel proud of you
  • Sometimes feel concerned
  • Sometimes are deeply moved

And those moments become part of the work.

If I end a session on time while you’re still activated, that may bring up anger or abandonment. Instead of avoiding that, we explore it.

If I notice myself feeling distance when you shift into performance mode, I may gently name it.

We examine what happens between us.

That’s not a glitch in therapy.
It’s the gold.

AI does not have interiority (an inner life).
It does not have genuine reactions.
It does not experience distance, closeness, conflict, or relief.

It simulates conversation.
It does not participate in relationship.

And therapy, at its core, is relational.

Where I Land

I still believe AI has enormous potential to improve access to support.

But I also believe that:

  • Emotional support without thoughtful challenge is incomplete
  • Unlimited availability is not always healthy
  • Attachment needs structure
  • Growth requires friction
  • Therapy works because of its limits, not in spite of them

If we want AI to genuinely support mental health, we will need to design it intentionally: with boundaries, with reality-testing, with challenge, and with the goal of fostering independence rather than dependency, not by hijacking the attachment system.

As a tool for learning mental-health-promoting skills, it could have real benefits; as a replacement for genuine human connection, it poses real risks of harm to the attachment system.

The safest and most effective place for AI is as a powerful tool for helping you think, learn, and create, not as a therapist.

No matter how AI develops in the future, there remains something profoundly transformative about sitting across from a human being who genuinely holds you in mind and cares about your wellbeing, in a structured space, doing the slow, difficult, deeply relational work of becoming more fully yourself. A machine, no matter how advanced, does not possess interiority; it cannot hold you in mind, and it cannot care about you.

Ready to get started?

If you’re curious about what real, human-centered therapy can offer, we have therapists with availability and would love to help. Reach out to schedule a consultation — no pressure, just a place to start.