
“Can AI and technology help me with my mental health and wellbeing? Will AI replace therapists?” A lot of people are asking themselves these questions just now. As a therapist, and someone who has spent many tiring years in training, they are certainly concerning questions! There are big differences, however, between AI mental health tools and a real-life therapy process, which I outline here.

The future of our world?

AI is changing a lot about our world, including mental health care. AI-powered tools are already being used in therapy-like applications, such as chatbots offering mental health support, apps for mood tracking, and AI algorithms that can predict mental health crises. There are also tech start-ups using AI to study recordings of helpful sessions between clients and therapists. This may all feel very ‘now,’ but in fact Joseph Weizenbaum created the world’s first AI chatbot in 1966. He called it ELIZA, and programmed it to provide human-like empathic responses to whatever was typed into it. One of its first users loved it! Fast-forward to 2024 and we now have a wealth of similar chatbot tools, like Woebot and Wysa, offering immediate, low-level support for people experiencing anxiety, depression, or stress. AI-enhanced apps like Headspace provide structured virtual environments and activity plans that support positive mental health and wellbeing.

Making best use of technology

When asking whether AI will replace therapists, we need to embrace the possibilities. This question about AI and technology is significant for those of us living in Denmark, where the health system is embracing the potential of digital technologies to support mental healthcare provision. For instance, for low-level psychological problems, you may first be offered a 12-week online cognitive-behavioural programme, overseen (in writing) by a psychologist, with one video call at the beginning. It’s a cost-effective means of getting help quickly to those not in acute distress.

One advantage of these tools is that they can provide immediate, cost-effective, accessible support at any time, without the wait for an appointment. This is particularly helpful where access to mental health support is difficult or limited, an increasing problem for our resource-stretched national health budgets. Depending on the nature of your problem, reaching out to a human therapist can also feel scary; you might implicitly fear someone’s judgment, for example. AI-powered tools can therefore provide a sense of anonymity and emotional safety. They also show great promise in identifying early signs of mental health problems through data analysis, such as the onset of cyclical depression. In short, we can harness many benefits by making best use of these tools.

Where it starts to fall short

Will AI replace therapists, however – actual living, breathing people? In my view, certainly not in the short term. AI will struggle to replicate many human qualities, with real empathy foremost among them. It can simulate empathy through programmed responses (Weizenbaum was doing this back in the ‘60s!), but it does not truly understand or feel emotions. Often, your therapist will have ‘been where you are,’ or at least had relatable experiences – and these create genuine human connection. They enable spontaneity in response, too. Clients often feel heard and understood by their therapist in ways that go beyond logic or the words in the room. We were social creatures long before the evolution of language, and this gifts us powerful emotional possibilities with others.

Issues that you might take to deeper psychotherapy, like trauma, grief, or relationship and identity conflicts, are incredibly complex. Exploring depth problems safely requires an expert’s sense of pace – when to pull back, when to challenge, when to be gentle, firm, or yielding, and when you need to stop for now. Understanding of your problems is so often found in what you aren’t saying – what you can’t, or won’t, put into words. Your human therapist draws on years of training, intuition, lived experience, and real-time emotional cues. They read verbal and non-verbal signals, guide the therapeutic process, safely manage its beginning, middle, and end, and more. These are all features of therapy that should be uniquely tailored to you. AI, by contrast, learns through incremental evolutions in its algorithms. Think what evolution has done for the subtlety of our human communication powers over millions of years!

The risks of real relationships

An important perspective on this question comes from the eminent psychotherapist Jonathan Shedler. Like many psychologists (myself included), he understands our sense of self as being constructed in relation to others. In this view, ‘who you are’ is not some isolated phenomenon that exists independently of others. Rather, a huge part of how we know ourselves is through our relationships. And here’s the thing: relationships are risky. They can and do go wrong, sometimes. They involve conflict, drama, disappointment, resentment, falling out and making up again, and more.

Becoming more psychologically resilient and self-reliant means learning to live with those risks. We observe and learn to tolerate the feelings generated in dynamic connection with others. A therapeutic relationship is a forum for doing so, a real-life dynamic relationship that you take active responsibility for. AI can never replicate this; ‘the machine’ cannot embody those risks. ‘AI’ will also never be a fellow woman, a fellow Sikh or Christian, or lose a family member and feel the depths of grief. There are aspects of life unique to us human beings that computers cannot understand. They won’t experience the quality of our human condition, its colours and textures, how it feels to be alive.

Privacy and confidentiality

Finally, there are operational concerns with AI around data privacy. It might feel anonymous online, but is it really? You may hold justifiable wariness about what is happening to the information a tech company holds on you. There are already worrying examples of just how valuable mental health data proves to be as a commodity. We therapists, in contrast, are bound by strict ethical codes enforced by our accrediting organisations. These codes, upheld in practices run by one person or a very small team, support your confidentiality and safety. Therapists are also directly accountable for what emerges in your interactions with them. With rare exceptions in the interests of your safety, what you share with them stays with them.

In closing

So, will AI replace therapists? It is most likely to complement services that support mental health and wellbeing. As I understand it, though, it remains a poor substitute for the significance of a real therapeutic relationship. Time may prove me wrong!