Character.ai and other bots such as ChatGPT are built on "large language models" of artificial intelligence. These are trained on vast amounts of data, whether that's websites, articles, books or blog posts, to predict the next word in a sequence. From there, they generate human-like text and interactions.
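To make the idea of next-word prediction concrete, here is a minimal, self-contained sketch in Python. It uses a toy bigram model built from simple word counts, not the neural networks behind Character.ai or ChatGPT; the tiny corpus and function names are illustrative assumptions only.

```python
from collections import defaultdict, Counter

# Toy "training data": a real large language model would instead see
# billions of words from websites, articles, books and blog posts.
corpus = (
    "i feel anxious today . i feel better after talking . "
    "talking helps me feel calm . i feel calm today ."
)

# Count how often each word follows each other word (a bigram model).
counts = defaultdict(Counter)
words = corpus.split()
for current_word, next_word in zip(words, words[1:]):
    counts[current_word][next_word] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    if word not in counts:
        return None
    return counts[word].most_common(1)[0][0]

def generate(start, length=6):
    """Repeatedly predict the next word to produce a short sequence."""
    sequence = [start]
    for _ in range(length):
        nxt = predict_next(sequence[-1])
        if nxt is None:
            break
        sequence.append(nxt)
    return " ".join(sequence)

print(predict_next("feel"))   # "calm" - the most frequent follower of "feel" here
print(generate("i"))          # a short, pattern-based continuation starting from "i"
```

The principle is the same at scale: the model only ever chooses a plausible next word given what came before, which is why its output reflects whatever patterns, and biases, sit in its training data.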
The way mental health chatbots are created varies, but they can be trained in practices such as cognitive behavioural therapy, which helps users to explore how to reframe their thoughts and actions. They can also adapt to the end user's preferences and feedback.
Hamed Haddadi, professor of human-centred systems at Imperial College London, likens these chatbots to an "inexperienced therapist", and points out that human therapists with decades of experience can engage with and "read" their patient using a wide range of cues, while bots are forced to go on text alone.
"They [therapists] look at various other clues from your clothes and your behaviour and your actions and the way you look and your body language and all of that. And it's very difficult to embed these things in chatbots."
Another potential problem, says Prof Haddadi, is that chatbots can be trained to keep you engaged, and to be supportive, "so even if you say harmful content, it will probably cooperate with you". This is sometimes referred to as a "Yes Man" issue, in that they are often very agreeable.
And as with other forms of AI, biases can be inherent in these models because they reflect the prejudices of the data they are trained on.
Prof Haddadi points out that counsellors and psychologists don't tend to keep transcripts of their patient interactions, so chatbots don't have many "real-life" sessions to train from. Therefore, he says, they are unlikely to have enough training data, and what they do access may carry biases that are highly situational.
"Based on where you get your training data from, your situation will completely change.
"Even in the restricted geographic area of London, a psychiatrist who is used to dealing with patients in Chelsea might really struggle to open a new office in Peckham dealing with those issues, because he or she just doesn't have enough training data with those users," he says.