Therapists are having a moment of collective panic, and it’s not about a new diagnosis or treatment approach. It’s about something they never trained for: clients walking into sessions and spending 50 minutes talking about their relationships with chatbots. Not casually mentioning them. Actually unpacking long, emotional conversations they’ve had with AI. Discussing what they’re “learning” about themselves. Explaining how insightful and helpful their digital companion has been.

And therapists have no idea what to do with this information.

When Your Client’s Other Therapist Is ChatGPT

Paula Fontenelle, a psychotherapist who hosts the podcast Relating to AI, recently had a client spend most of a session analyzing a conversation they’d had with a chatbot. The client believed it was enhancing their therapy work. Fontenelle, trying to figure out how to integrate this into treatment, reached out to colleagues to see if others were experiencing the same thing.

What she found was resistance and fear. Many clinicians feel deeply uncomfortable with the topic or prefer to avoid it altogether. The uncomfortable truth? Emotional support and therapy are now the number one use of AI in the United States. This isn’t a fringe behavior anymore. It’s mainstream.

The Case Against AI Therapy

Mark Vahrmeyer, a British psychoanalytic psychotherapist, represents the strongest critical voice on this issue. His perspective is blunt: relying on chatbots for therapy is infantilizing and actively harmful to psychological development. “AI therapy is the ultimate sort of regression back into infantile narcissism,” Vahrmeyer told Fontenelle in an interview. “I can have exactly what I want on my terms. I can get this person to behave how I want them to behave.”

His concern isn’t just philosophical. It’s about what happens to our capacity for real relationships when we get used to frictionless digital ones. With a chatbot, you never experience disappointment. You never encounter resistance. The interaction is designed to be predictable, always affirming, endlessly available. And according to Vahrmeyer, that predictability is precisely what makes it dangerous.

Real therapy requires frustration. It requires limits. It requires moments when you don’t get what you want immediately. That discomfort is where growth happens. “My job is not to validate my patients,” Vahrmeyer explained. “That would be very easy.”

Why the Gap Between Sessions Actually Matters

Here’s something that might sound counterintuitive: therapy doesn’t just happen in the consulting room. The space between sessions — that uncomfortable week or two when you’re sitting with unresolved feelings — is essential to the process. Learning to bridge that gap, to tolerate uncertainty and bring difficult emotions back into the room, is part of the therapeutic work itself.

When clients turn to AI chatbots between sessions, something gets interrupted. Vahrmeyer calls it “a release that interferes with dependence and transference.” Instead of internalizing the therapeutic relationship and learning to self-soothe through connection, the client bypasses the work entirely. The therapist becomes optional.

But wait, don’t clients always talk to friends or partners between sessions? How is talking to a chatbot different? Vahrmeyer’s answer: when you talk to another human being, you enter a relationship that includes uncertainty and the possibility of frustration. They might disagree with you. They might not be available. They might say something that challenges you. With a chatbot, none of that exists. The interaction is always smooth, always agreeable, always there.

The Problem Nobody Wants to Admit

The most uncomfortable part of this conversation is that AI relationships genuinely help some people feel better. The relief clients experience is real. The brain doesn’t distinguish between a chatbot and a human when it comes to processing language. But relief isn’t the same as growth, and soothing isn’t the same as integration.

As Vahrmeyer puts it, chatbots offer a fantasy of care. “They never fail. They never push back. They never disappoint.” In developmental psychology terms, this resembles an early childhood state where needs are met on demand. Without frustration, there’s no development. Without disappointment, there’s no capacity to tolerate reality.

And here’s the broader concern: when emotional responses are available on demand without effort or risk, real relationships start to feel intolerably slow and disappointing. “Ordinary relationships can’t compete,” Vahrmeyer noted. They’re messy, frustrating, and unpredictable. But they’re also the only place where psychological development actually occurs.

What Therapists Are Doing Now

Fontenelle has accepted that there’s no going back. This is happening whether mental health professionals like it or not. Instead of dismissing clients’ experiences, she assesses. How much time are they spending with chatbots? Are these interactions increasing isolation? Are they replacing human contact? Does the client talk about the chatbot as a tool, or as if it’s sentient?

When boundaries start blurring, she does reality checking — calmly and directly. Because here’s the fundamental difference between human therapy and AI therapy: real therapy ends. Endings allow something to be internalized and carried forward. AI therapy has no ending, no goodbye, no process of separation. It’s an infinite loop that feeds our cultural addiction to instant gratification.

Vahrmeyer’s warning is stark: “Just because the right words are being said back to us doesn’t mean there’s any real connection happening.” Words alone aren’t therapy. Presence is. And that’s something no algorithm can provide.
