More people are turning to mental health AI chatbots. What could go wrong? – National Geographic

Chatbots replace talk therapy

"The accessibility and scalability of digital platforms can significantly lower barriers to mental health care and make it available to a broader population," said Nicholas Jacobson, who researches the use of tech to enhance the assessment and treatment of anxiety and depression at Dartmouth College.

Swept up by a wave of generative AI, tech companies have been quick to capitalize. Scores of new apps, like the WHO's digital health worker Sarah, offer automated counseling, where people can engage in cognitive behavioral therapy sessions (a psychotherapeutic treatment that's proven to assist users in identifying and changing negative thought patterns) with an AI chatbot.

The arrival of AI, Jacobson adds, will enable adaptive interventions and allow healthcare providers to continuously monitor patients, anticipate when someone may need support, and deliver treatments to alleviate symptoms.

It's not anecdotal, either: a systematic review of mental health chatbots found AI chatbots could dramatically reduce symptoms of depression and distress, at least in the short term. Another study used AI to analyze more than 20 million text conversations from real counseling sessions and successfully predicted both patient satisfaction and clinical outcomes. Similarly, other studies have detected early signs of major depressive disorder from unguarded facial expressions captured during routine phone unlocks and from people's typing patterns.

Most recently, Northwestern University researchers devised a way to identify suicidal behavior and thoughts without psychiatric records or neural measures. Their AI model estimated self-harm likelihood in 92 out of 100 cases, drawing on data from 4,019 participants: simple questionnaire responses and behavioral signals, such as ranking a random sequence of pictures on a seven-point like-to-dislike scale.

Two of the study's authors, Aggelos Katsaggelos and Shamal Lalvani, expect that once the model clears clinical trials, specialists will use it for support, such as scheduling patients according to perceived urgency, and that it will eventually be rolled out to the public for at-home use.

But as was evident in Smith's experience, experts urge caution against treating tech solutions as a panacea, since they lack the skill, training, and experience of human therapists. That is especially true of generative AI, which can be unpredictable, make up information, and regurgitate biases.

When Richard Lewis, a Bristol-based counselor and psychotherapist, tried Woebot (a popular script-based mental health chatbot that can only be accessed via a partner healthcare provider) to help with a topic he was also exploring with his therapist, the bot failed to pick up on the issue's nuances, suggested he "stick to the facts" while removing all the emotional content from his replies, and advised him to reframe his negative thoughts as positive ones.

As a therapist, Lewis said, correcting or erasing emotions is "the last thing I would want a client to feel and the last thing I would ever suggest."

"Our job is to form a relationship that can hold difficult emotions," Lewis added, "and feelings for our clients to make it easier for them to explore, integrate, or find meaning in them, and ultimately know themselves better."

Read the rest here:

More people are turning to mental health AI chatbots. What could go wrong? - National Geographic
