Can AI-Powered Therapy Bridge the Gap in Mental Health Care?
- LJ Cadogan
Access to quality therapy is increasingly hard to come by. Shortages of trained mental health professionals, long waiting lists, and rising demand mean many people are struggling to access mental health services. With routes into traditional therapy harder to reach, a growing number of people are turning to AI-assisted therapy apps for interim support. I spoke to John Tench, Managing Director of Wysa, for an insight into the world of AI therapy.
“Wysa is designed as both a standalone self-help tool and a supplementary resource in traditional therapy,” says Tench. “Our app provides immediate, anonymous support, making mental health resources more accessible. In the context of NHS Talking Therapies, Wysa is used as a way of making self-referral into services more engaging, and is used as a self-help tool while people wait for therapy to start.”
I remember my first experience of counselling – it was at college. While it was certainly accessible, I remember feeling a great deal of shame at the thought of my friends finding out. I made detours to avoid anyone seeing me lurking in that particular corridor and piecing things together. I did eventually tell my friends, and they were incredibly supportive. But each time I left a session, I felt physically and emotionally drained. My subsequent experiences were much the same, though that isn’t to say they were unhelpful. I came to realise over time that it’s quite normal to feel that way after a therapy session.
In recent years, I’ve relied on the patient ears of my partner, siblings, and friends. When necessary, I empty my thoughts into a notebook. But lately, I’ve found myself needing something more – a spring clean for my mind, someone to talk to without worrying that I’m burdening the listener. The availability of AI-powered therapy apps like Wysa is hard to ignore. So I tapped the download button.

By providing immediate, anonymous support, the app makes mental health resources easier to reach. That support is also personalised: the AI chatbot responds to the emotions users express and, with continued engagement, adapts and refines its responses over time.
The app had a lot of features I liked, though certain words are flagged as signs that the user may be in distress. I found I had to modify my language to keep a conversation on course; otherwise the app assumed I was distressed and responded accordingly. That’s because Wysa has implemented several measures to ensure users receive appropriate guidance during moments of crisis, including an SOS function “that directs users to local and national crisis care helplines, assists in creating personal safety plans, and offers grounding exercises,” Tench says. “The AI continuously monitors user interactions for signs of distress, such as mentions of self-harm or suicidal thoughts, and proactively guides users toward appropriate urgent care resources.”
A few days in, I felt that my mood had improved. I felt very calm, though I can’t say whether that came from knowing support was available should I need it, or from no longer leaving things to rattle around in my head. Later, I had an opportunity to test Wysa when I read something that bothered me. Because I used the phrase ‘it’s choked me up a bit’, Wysa once again asked if I was okay, and directed me to its SOS button. I assured Wysa I was fine, and opted to be guided through a breathing exercise despite not feeling I needed it at the time.

During the first weekend of using Wysa, I took an assessment to find out what my boredom type is: ‘calibrating boredom’. “This type of boredom feels unpleasant but you don’t actively look for ways out of it”, Wysa told me, and offered some methods to break the cycle. I was pleased to see that some of its suggestions were things I already turned to.
During another session, I noticed that I achieved in ten minutes what would usually take an hour. The exchange with Wysa didn’t leave me feeling drained, and I had a little more time to process the session. All the things that made me squirm in a chair in a therapy setting were gone. “Research shows that people open up about their worries three times faster to Wysa than a human therapist, simply because they know the bot can’t judge them. This helps build a therapeutic bond very quickly,” Tench shares. There are some things AI can’t do though – “AI lacks the nuanced understanding and authentic empathy that human therapists provide. Human therapists can interpret complex emotions, offer deep relational support, and adapt therapeutic approaches in real-time based on verbal and non-verbal cues.”
Wysa doesn’t just rely on AI interactions to assess users. It also employs clinically validated tools like the Patient Health Questionnaire (PHQ-9) and the Generalized Anxiety Disorder Questionnaire (GAD-7). “Depending on the scores it will recommend to the user that they should seek further help via the pathways that are available in that context, which may be an EAP for employees, crisis helplines, NHS Talking Therapy support or Wysa’s in-house coaching team support,” Tench explains.
A few nights later, I decided to try another session, in which I was guided through a visualisation technique to help reduce anxiety. Afterwards, I was more relaxed, and made a note to try it the next time I couldn’t sleep. It was another tool in the kit. And so it is for others – 89% of users say “the app helps them feel better while they wait for more formal treatment”, Tench tells me when I ask whether their data shows any trends in users of Wysa going on to seek traditional therapy.
Whether they do depends on the context of the organisations Wysa works with. “For the consumer app, we have not yet created a pathway into NHS talking therapies nationally, because of the way that services commission providers on a geographical basis. The majority of our UK based users have come to Wysa via the Talking Therapy pathway, as a self-help tool while they await human therapy, rather than the other way around. In this situation, the impact on their wellbeing is very positive.”
Wysa isn't the only digital mental health tool people are turning to. There's Youper, Woebot, and Replika, for example, not to mention all the other mental health and wellbeing apps available. And there's also ChatGPT. “Following the launch of ChatGPT, we observed a huge shift in user expectations, with more people wanting to direct conversations and ask AI specific mental health questions,” Tench says, when I ask if any changes were observed in how people engage with AI therapy over time. “Users increasingly expect our chatbot to be interactive and responsive while still following evidence-based techniques. To meet this expectation responsibly, we integrated generative AI within strict clinical guardrails, ensuring safety, predictability, and fairness. This remains our priority.”
Safety is paramount. I have turned to ChatGPT once or twice during the writing of this article, but using Wysa definitely feels safer – the presence of the SOS button helps. And while it aims to be helpful to users, ChatGPT is capable of generating misleading or inappropriate advice, which I feel is less likely with an app like Wysa. “We must always balance user expectations with rigorous ethical and safety standards. If the user experience doesn’t match expectation, people will turn to AI tools that are not intended for this purpose and don’t adhere to clinical safety principles,” Tench states. “We take a multi-faceted approach to research and development, making sure that we contribute to shaping the future of mental healthcare as well as keeping up with emerging findings and the latest research… We also work closely with our industry partners, universities, healthcare organisations and regulatory bodies to make sure the latest findings are embedded into Wysa’s interventions.”
The demand for mental health support shows no sign of slowing, and AI-powered therapy apps like Wysa offer a space that is immediate, accessible, and free of judgment. My experience with Wysa was positive, and the fact that I wasn’t speaking to a human made it easier to open up. I found it to be an incredibly useful tool that I will continue to use if and when I need it.
AI therapy, like traditional therapy, isn’t one-size-fits-all. Apps like Wysa could act as a bridge for those waiting to access traditional therapy, or as a supplement to ongoing sessions. Some may find it easier to engage with, while others may prefer the nuances of human connection. But for those who feel stuck, overwhelmed, or simply in need of a space to process their thoughts, AI-assisted therapy apps might be able to help.
With thanks to Wysa for supplying me with a demo version of the app to use, and for answering my questions.