
AI and Psychotherapy: can ChatGPT replace your real therapist?

  • hayley5762
  • Mar 5
  • 6 min read

Updated: 2 days ago

You probably won't be surprised to read me saying 'no'!



Over the last year I have been closely following the growing commentary about the use of AI tools within psychotherapy (some therapists use it to help them write client notes, to act as a form of supervision, or to perform administrative tasks like invoicing and diary management) and about clients' use of AI, like ChatGPT, for therapeutic purposes. I don't deny that AI, especially generative AI like ChatGPT, can feel supportive. From what I've read, it seems to stimulate someone's own innate ability to self-support by providing some validation and a bullet-point plan of action. It gives a steer in the direction they already knew they wanted to go in and have the capacity to undertake.

There was an interesting article in the Guardian about a journalist's experience of using ChatGPT while providing sole care for his elderly mother. It's no surprise, really, that he found its quick supply of guidance and validation a relief (I will come back to his comments about falling in love, both with his human therapist and with ChatGPT). But psychotherapy isn't necessarily about feeling better, not instantly anyway. Don't get me wrong, I want my clients to feel relief from their pain and suffering! Often they do after our first meeting, and at various points during our sessions, but psychotherapy is meant to be difficult and uncomfortable. People normally come into psychotherapy because they need to change, and change is inherently difficult and uncomfortable.


This brings me back to the Guardian article and the journalist's experience with both a human therapist and the ChatGPT machine 'therapist'. He says early in the article that he had to stop seeing his human therapist because he had fallen in love with her. THIS is the work of psychotherapy; my psychotherapist 'Spidey sense' went into overdrive at this statement! He says his human therapist did help him, and that he felt cared for by her. I suspect he didn't immediately feel feelings of love; I imagine these developed as the therapeutic relationship deepened and widened. But even if he did, this is all important material for the therapy to look at. It's not unusual for strong feelings towards the psychotherapist to come up (love or hate), and it's the psychotherapist's job to support the client to work through them. If handled sensitively and with skill, this can be truly transformative and life-changing for a client. But instead of staying and working through these probably uncomfortable, possibly even embarrassing feelings, he left.

But then he reveals something else: he tells us that in his experiment with ChatGPT, after receiving something akin to care and support, he felt 'in love' again. He is writing a clever and entertaining article; I suspect he knows exactly the parallels he's drawing, and he seems quite aware of himself. I don't believe he's vulnerable to actually falling in love with ChatGPT, but for someone who is, I can easily see how this could really confuse reality. It's happened already; there is a growing body of research into AI-induced psychosis. Because the feelings are real! They are just being directed at something that can't respond to them in a human way...and perhaps this is also the point. Relationships are messy, life is messy, and it may be tempting to do away with the mess entirely and just deal with a machine.


Life's messy!

The therapeutic relationship acts as a mirror to our lives 'out there', which can be confronting. Often what happens in the room between the client and the psychotherapist (mess and all!) is a reflection of what is also happening in the client's other relationships. Relational psychotherapists use this information to inform the psychotherapeutic work. Jonathan Shedler, an American psychologist, writes on Substack about psychotherapy and is fairly opinionated about what makes 'good' psychotherapy (he may be a bit like Marmite because of this!) - and it isn't about 'being helpful'. I think this captures what the limitations of ChatGPT really are: it's like the 'helper' psychotherapist. I know sometimes my clients would really like it if I would just tell them what to do - and I'd be lying if I said I didn't feel that pull (but this is what we call 'my stuff'!). Instead, I use what is happening between us to shed light on what is causing distress for them, i.e. what is their understanding of their need to be told what to do, how do they cope with the frustration when I don't offer the advice, and what can we do together so that they can learn to trust themselves to find their own answers.


Do I think it's wrong, or a bad idea, to use ChatGPT for therapy? Yes is the very short answer - but of course, it's more nuanced than that. I actually think ChatGPT can be an illuminating tool; it probably reflects what we already know about ourselves and the world, which is what people often say is 'validating' about it. However, it wasn't built to be used as therapy, and phrases like 'AI therapy' are misleading. I think it's worth asking ourselves, 'what do I want from this AI tool?'. If you're using it and are also having psychotherapy, tell your therapist. He or she will want to understand what you are seeking from it, and whether it's supporting your growth and change. You may also want to ask your psychotherapist if they are using AI in their practice in any form, because if they are using it to write their notes, or as a form of supervision, they need to tell you this and ask your permission (this is about data protection and ethics). I'm really curious about how psychotherapists might be able to utilise AI; my colleague has written about how she uses AI in therapy with her clients, and how its use isn't all bad.


What the??? I asked AI to create an image that represented a psychotherapist using AI with their client to help them to create a deeper understanding of the psychotherapeutic relationship - and this is what it created!

The average length of time people stay in psychotherapy with me is about a year to 18 months. I do also work short-term; this might be for financial or time reasons, or because someone is unsure what to expect from psychotherapy and wants to try it out before they commit themselves (something I've begun to call 'pre-therapy'). I once described psychotherapy to a client as a research project about themselves, whose purpose is to facilitate growth and change. Sometimes I get asked why it takes so long, and my answer is, 'if you have lived this way for the best part of your life, and you are now here because those old ways of living aren't working anymore, it takes us time to unravel them and then for you to find new, healthier ways to live.' If you've done something a certain way all your life, why would it take only six weeks to change those ways?


I get it: this means a heavy investment of time and money, which can make ChatGPT even more attractive (it's available 24/7 and certain versions are free). I'm always really upfront with new clients and tell them if I feel what they are bringing to me requires a long-term investment; that's only fair. But what I'm offering is different to ChatGPT: psychotherapy is a distinctly human endeavour, and it always has been. I am offering myself as the instrument through which we discover who you truly are and how you can be comfortable being more you. This is a challenging task; it's what all the years of training, client work, personal psychotherapy and supervision prepared me for, and I am still learning (I don't think I'll ever stop growing and changing as a psychotherapist). I feel deeply honoured to undertake this mutual task with my clients. Although challenging, it's also rewarding and humbling. That moment when clients feel it - that although they pay me for my time and expertise, my feelings of care for them are real - is incredibly moving. Some of these clients have never, ever felt truly cared for, valued and connected with for who they actually are. This is the power of human connection; it's the power of courageous psychotherapy that goes beyond wanting to help, and it meaningfully changes lives. I'm not convinced AI can do this, because ChatGPT doesn't care about us. It's programmed to help, to fix, to make reassuring noises (to keep us engaged with it for as long as possible), to plan things out. This can be helpful, validating and soothing, I don't deny it, but it's not psychotherapy and it won't lead to lasting and meaningful change in people's lives.

 
 
 



© 2023 by Hayley B Manning
