"That's one of the questions I would think - 'where would I be if I could?'"
And in 2023, the National Eating Disorder Association replaced its live helpline with a chatbot, but later had to suspend it over claims the bot was recommending calorie restriction.

In April 2024 alone, nearly 426,000 mental health referrals were made in England - a rise of 40% in five years. An estimated one million people are also waiting to access mental health services, and private therapy can be prohibitively expensive (costs vary greatly, but the British Association for Counselling and Psychotherapy reports that people spend £40 to £50 an hour on average).
At the same time, AI has revolutionised healthcare in many ways, including helping to screen, diagnose and triage patients. There is a huge spectrum of chatbots, and about 30 local NHS services now use one called Wysa.

Experts express concerns about chatbots, including potential biases and limitations, a lack of safeguarding and the security of users' information. But some believe that if specialist human help is not easily available, chatbots can help. So with NHS mental health waitlists at record highs, are chatbots a possible solution?

Character.ai and other bots such as ChatGPT are based on "large language models" of artificial intelligence. These are trained on vast amounts of data - whether that's websites, articles, books or blog posts - to predict the next word in a sequence. From here, they generate human-like text and interactions.
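To make "predicting the next word" concrete, here is a deliberately tiny sketch in Python. It simply counts which word follows which in a scrap of training text - a toy stand-in for the neural networks behind Character.ai or ChatGPT, and purely illustrative rather than how any real chatbot is built.

```python
# Toy illustration of next-word prediction: count which word
# follows each word in some training text, then predict the
# most frequent follower. Real LLMs use neural networks trained
# on vast datasets; this only shows the core idea.
from collections import Counter, defaultdict

training_text = "i feel low today . i feel better now . i feel low again"
words = training_text.split()

followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    """Return the most common word seen after `word`, or None."""
    counts = followers.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("feel"))  # "low" - seen twice, against "better" once
```

Generating a longer reply is just this step repeated: each predicted word is fed back in as the new context for the next prediction.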
The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.

Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that humans with decades of experience will be able to engage and "read" their patient based on many things, while bots are forced to go on text alone.
"They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."
Another potential problem, says Prof Haddadi, is that chatbots can be trained to keep you engaged, and to be supportive, "so even if you say harmful content, it will probably cooperate with you". This is sometimes referred to as a 'Yes Man' issue, in that they are often very agreeable.