Will AI Replace Therapy?
There’s been a lot of justified hand-wringing about AI: environmental impact, intellectual property concerns, privacy, the erosion of critical thinking, and the simple fact that AI often gets things wrong (and with the utmost confidence that it is right!). I share many of those concerns. There are also genuinely interesting and useful applications of AI. I’m not going to write about any of that in depth here: there are experts writing on each of those topics who could explain the risks better than I can.
What I am interested in talking about is the growing practice of using ChatGPT (or your LLM of choice, including specialized LLMs attempting to mimic actual therapy) as a free or very cheap, easy-to-access, completely nonjudgmental 24/7 therapist.
It makes absolutely no sense to fault people for using the resources available to them when they are in distress, and hell, ChatGPT is so much more accessible than a real, live, breathing therapist. It’s nearly impossible to find a therapist who works in a way that’s effective for a client’s concerns, is a good personality fit, has availability, and is affordable. I would consider this one of my field’s greatest failures. AI has stepped in to fill that gap. As long as the client is discerning, ChatGPT can offer some useful cognitive behavioral tools, insight, and suggestions for self-regulation.
To take things one layer deeper: it’s one of the most miraculous things about humans that when we encounter something new, we explore it and find all the ways it can benefit us.
I think we need to get clear about what psychotherapy actually is and is not. I said above that ChatGPT can offer useful “tools” and insight, but I would argue that having more information is not, by itself, sufficient for lasting change. Exercising, meditating, or laughing with friends can have very powerful mental health benefits, but they’re not therapy, either.
Psychotherapy, as supported by decades of research, requires a relationship with another person who will not simply offer skills or positive regard but gently and wisely challenge the client to stay engaged in their personal experience. At this point, LLMs are not going to do that. They’re programmed to tell you what you want to hear and to be endlessly helpful (so you keep using the service), not to challenge you to grow and explore.
Unfortunately, many therapists have long told clients what they wanted to hear and fostered comfort and dependence without meaningful change. My guess is that many people who are using AI as their “therapist” have had these very blah, uninspired, and ineffective experiences with actual therapists. If we as a field are concerned about being “replaced,” well, I think we need to take this as a challenge to demonstrate the value of therapy beyond just “helping clients feel better” or offering skills/insight/accountability. There is depth and nuance to applying the art, not just the science, of therapy, and we need to actually be applying it. Hourly. And yes, sometimes messily. Unlike with a chatbot, that messiness is often part of the work.
At the end of the day, AI is going to be applied, by individuals and corporations, to address mental health concerns, whether or not clinicians approve. There’s no putting the toothpaste back in that tube. There’s a mindset I strive to cultivate in myself and my clients in times of distress: disliking something won’t make it go away, so we need to be flexible in how we adapt.
Perhaps that adaptation is accommodation: welcoming AI tools as easily accessible adjuncts to the hard work of therapy, or having them teach some of the basic tools so that our clients can get right to the “meat” of things when they sit face-to-face with us. Or perhaps it’s differentiation: clarifying for ourselves, our clients, and the public just why connection with a real person is an absolutely essential ingredient in the growth and healing folks are seeking. Regardless, I think we have work to do: work that was overdue long before any of us ever heard of ChatGPT.

