For people in crisis, finding the right support can be hard. Many use the internet as a first step in help-seeking to look up symptoms, find services or connect with people with similar experiences.
And, increasingly, they're turning to AI chatbots such as ChatGPT for emotional support.
Chatbots can offer reassurance and self-help tips, encouraging people to explore their feelings. They can be accessed instantly, removing common barriers to seeking help such as stigma, limited service availability, cost, and language or cultural differences.
We hear AI can also provide a reassuring space that feels private, safe and non-judgemental.
This instant access to support may be more critical than ever given the extremely long waiting lists for support. Last year (2025) more than half a million people were referred for NHS talking therapies but never reached the top of waiting lists to start treatment.
Are there any problems with AI help-seeking?
Although AI can enable self-help, many people are using it as a form of therapy or a substitute for other sources of emotional support. This can lead to a growing reliance on it for their mental wellbeing.
More worryingly, we know that chatbots can sometimes give harmful advice and encourage self-harm or suicide. Tragically, there have been cases where people have taken their own lives following advice provided by a chatbot.
Samaritans' Online Excellence Programme
It's clear that much more needs to be done to ensure that AI chatbots are safe and effectively regulated. That’s why we're working with tech companies through our Online Excellence Programme to provide guidance on supporting users who experience self-harm and suicidal feelings.
We're also pushing government and Ofcom to ensure that AI chatbots are covered under the Online Safety Act.
Tips on using AI safely
If you already use an AI chatbot—or are thinking about using one—here are a few tips to keep in mind:
- Think about the practical ways it can help – for example, exploring different support options, preparing what to say to a professional or someone you trust, or suggesting self-care exercises such as grounding techniques.
- Know its limits – although AI can often feel supportive, it’s not specifically designed to support your mental health. It can’t replace the power of human connection in times of crisis and shouldn’t be treated as a substitute for it.
- Protect your privacy – avoid sharing personal information you wouldn’t want stored or processed.
- Reflect on the impact it’s having on you – is it helping and how? Are there things that you don’t find so helpful?
- Consider how reliant you are on it – if you feel like you’re becoming dependent on AI for support it might be worth pausing and exploring other options.
- Consider other sources of support – whether it's friends, family, helplines or local services. Is there anyone you feel able to reach out to?
Samaritans are always here to listen
When you’re struggling, AI won’t be able to hear the crack in your voice, listen for the things you might want to say but can’t, or really understand what you’re going through. But our trained listeners can. They're here day and night to help you talk, in your own way, about what you’re going through.
We believe that human connection has the power to save lives, and we know just how important this can be in moments of crisis. Although AI can be helpful in some ways, it can never replace a real conversation with someone who gives you space and time to focus on your feelings.