We explore the concerns around chatbot counselling, the unique nature of human-to-human relationships, and where AI could fit in

How would you feel having a therapy session with artificial intelligence? The idea sounds like science fiction, but it’s our current reality, with apps and chatbots signalling a new era of mental health support. Some people see this as progress – increasing accessibility and even removing the element of human error – while others are more concerned.

“There are a few things to unpack here,” says Meg Moss, head of policy and public affairs at the National Counselling and Psychotherapy Society (NCPS).

The NCPS, a professional body for counsellors and psychotherapists in the UK, has recently launched a campaign called ‘Therapeutic Relationships: Honouring the Human Connection’, with the aim of educating members of the public, commissioners of mental health support services, and the Government, about the therapeutic impact of human connection. By sharing the voices and experiences of experts in the profession, they hope to pose a counterpoint to the increasing use of digital mental health support in both the public and private sectors.


What is it they’re concerned about?

A reliance on chatbots is the first concern Meg shares.

“Reliance on chatbots may have negative, unintended consequences,” Meg explains. “For example, clients using chatbots as a crutch, without doing the difficult therapeutic work that clients of human therapists do outside of their sessions. By constantly having access to a ‘therapist’ in your pocket, a lot of that difficult and painful work that comes from striking out on your own, and learning to use the tools you’ve picked up in therapy, is negated.”

There are also several questions about how chatbots are built and how they function, including whether unintended biases have been programmed in, and what safeguarding protocols are in place.

“Does the chatbot follow one specific therapeutic perspective? Can it adapt and learn in the way a human might? What is the bot’s cultural context, how can it understand the user’s own cultural context, and what incentive does it have to do that?

“The answers are none, it can’t, and it doesn’t,” Meg says.

The crux of the concern is the lack of human connection. The therapist and client relationship has proven to be important, with one study by Sharpley, Jeffrey, and McMah (2006) revealing that more than 80% of the positive outcomes of therapy may be due to the therapeutic relationship itself.

Why is the human connection so powerful?

So, what is it that is so special about human connection in therapy? Perhaps counterintuitively, the answer is risk.

“When you enter into a relationship with your therapist, you have no idea whether they’re going to like and accept you after you share your deepest, darkest secrets with them. You walk into that room, or you look into their eyes through your monitor, and you become vulnerable to the judgements of another human being.”

This risk matters because, as social animals, we’re hard-wired to seek acceptance. We especially seek it from people in authoritative roles and, often, therapists become an extension of those types of relationships.

“Acceptance is not guaranteed from human therapists though; unlike a chatbot, for example, that’s programmed to respond to your input in a positive and accepting manner,” Meg notes.

“Therapists go through years of training and self-development, so they can show up and accept you for who you are, no matter how terribly you feel you’ve acted, or how much shame you’re carrying around with you… and this is where the magic happens.

“You’ve emptied your soul on to the metaphorical table, rummaged around in its contents with your therapist by your side… and they still like you. They still care about you. They remain warm, open, affirming, validating. They see the good in you – the human in you – and they accept you for who you are.

“That is where therapy really happens. ‘No risk, no reward’ is a common idiom, and it applies here as much as anywhere else.”

While we may be able to get a sense of relationship from technology, without this risk it can ultimately ring hollow.

Another element of human-to-human connection that’s difficult for AI to replicate is being aware of non-verbal cues, such as facial expression, body language, tone of voice, and even silence. Meg highlights that this is something human therapists are trained in.

“A therapist uses their empathy and training to decode what all of these elements together are trying to say. Sometimes we say one thing, but our body language and expressions tell another story.”

Can AI be helpful in therapy?

We are still in the early days of AI, and while there are concerns and questions, it isn’t all doom and gloom. There is a good chance the technology will become increasingly sophisticated – and potentially more helpful.

“AI is an amazing, powerful tool that will undoubtedly change our lives,” Meg reminds us. “It’s free or very cheap to access, there when you need it, and offers a way of working through things that appeals to people.”

One way of utilising it could be to use a chatbot in a similar way you would a journal, Meg suggests. “When we write our feelings to a therapy chatbot, it can be a great release of built-up thoughts and emotions – much the same as journaling, which is a well-established complement to talking therapy. Doing that with a chatbot just takes a more active approach.


“It’s worth being mindful of data protection issues, however, and people have to decide for themselves what level of risk they’re comfortable with.”

With all of this in mind, is there a way for therapy and AI to form an alliance? For Meg, the key to this is boundaries, and not blurring the line between human and machine.

“We can be really creative about how we use technology to support therapeutic work, so long as it’s clear that there is no way of coding humanity into AI, and that those looking to AI tools as a therapeutic intervention understand it can only help so much.”

Supporting this, organisations providing AI need to be more transparent and ethical in how they use and store data, Meg says, “so that people can be more confident in using them”.

Technology and AI can no doubt help us progress. As long as we’re able to spot their limitations, and recognise when a human needs to take the wheel, we can go far.


Visit NCPS at ncps.com or follow them @ncpscounselling on social media to learn more about their campaign.