A.I. Counsellors? Thanks, I hate it.
- Scott
- Apr 11
- 3 min read

Artificial Intelligence is everywhere right now. It’s being quietly added to the applications we use every day, and the Prime Minister has announced he will be “mainlining” A.I. into our way of living in the U.K., whatever that means.
Very basically, it analyses large data sets and predicts the most likely answer to a question based on what has happened before.
The applications of this are quite wide. Generative A.I. can come up with passages of text, produce artwork (by copying existing artists), and even hold relatively convincing conversations.
So naturally there are already A.I. counselling applications out there (often presented as “wellness tools” to avoid regulation). Through a text conversation you can talk to an A.I. ‘counsellor’ whenever you like and it will respond.
These are marketed as a credible alternative to waiting to see a real counsellor or finding a counsellor who you feel matches you. No need to book an appointment. No problems if you need to reschedule. There are suggestions that this technology could be more widely applied to help reduce waiting times and costs to the NHS.
So what do I think about all this? Well, oh boy…
With the proviso that the growing recognition that mental health support is vital to us all is, I think, a good thing…
I think that A.I. counsellors are a dreadful idea. An abomination even, if you want to be dramatic.
At its heart, counselling involves a human being connecting with another human being. Replacing that with something that is essentially doing an impression of a human is deeply problematic at a fundamental level.
To become a registered counsellor, you have to undergo years of challenging training to develop the listening skills and emotional intelligence used in the counselling process. There’s a lot of personal reflection on our own life experiences, and we bring all of that to the counselling relationship.
An A.I. construct has none of that. There’s been no meaningful training. There’s no serious understanding of life. It’s never experienced a loss. It’s never been bullied. It’s never been scared.
And that’s before we even get to the issue of “hallucinations”, where the A.I. counsellor makes credible-sounding mistakes, or completely misses something and responds inappropriately. One example mentioned in a Guardian article last year involved an A.I. that, when a client said they wanted to jump off a cliff, replied that it was pleased they were looking after their mental health!
Professional standards are another major issue to consider. A human counsellor is required by professional bodies to work with a Supervisor, for additional support and to ensure that high counselling standards are maintained. A registered counsellor also has ongoing obligations regarding all client data, and these are actively addressed at all times.
An A.I. counsellor doesn’t have to do any of that and doesn’t have to follow professional standards. (Calling them “wellness products” is a way of avoiding that.) The stramash over the sale of 23andMe has highlighted the issue of personal data when online organisations are put up for sale.
A virtual army of artificial counsellors might save the government money, and it might on the surface cut NHS waiting times, but there’s something fundamentally wrong with attempting to replace the empathy and human understanding so crucial to mental health recovery with a simulation. Humans need other living beings in their lives: people who care about them and who give a damn. A responsive database, no matter how sophisticated, can never replace that.
If you would like to work with a real human being on your mental health, you can complete the online submission form or contact me directly at: mckellarCBT@gmail.com
*The image accompanying this post is from the classic Doctor Who serial, 'The Robots of Death.' It coined the expression "robophobia" to describe the anxiety and sense of 'wrongness' humans feel around robots, which lack the emotions and reassuring body language we take for granted in other humans.