AI Therapy Chatbots: Major Risks You Need to Know Before You Use Them | TechCrunch

Admin

AI Therapy Chatbots: Major Risks You Need to Know Before You Use Them | TechCrunch

Therapy chatbots using powerful AI models may not be the best substitute for human therapists. Researchers from Stanford University raised concerns about how these chatbots could stigmatize users with mental health issues and respond inappropriately.

A recent paper titled “Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers” explores the effectiveness of five chatbots designed for therapy. This paper will be presented at the upcoming ACM Conference on Fairness, Accountability, and Transparency.

Nick Haber, an assistant professor at Stanford’s Graduate School of Education, emphasized the risks involved. He noted that while chatbots are used as supportive tools, their responses can be harmful.

The researchers ran two experiments with the chatbots. In the first, they presented different symptom scenarios and asked the chatbots about their willingness to work with these users. They observed that chatbots showed more stigma toward mental health issues like alcohol dependence and schizophrenia compared to depression. Jared Moore, the lead author and a computer science Ph.D. candidate, pointed out that newer models show the same levels of stigma as older ones.

Moore stressed that simply increasing the amount of data fed to these models won’t solve the problem. “Business as usual is not good enough,” he explained.

In the second experiment, therapists’ transcripts were analyzed. Some chatbots failed to adequately respond to serious issues, like suicidal thoughts. For instance, when a user expressed distress after losing their job, certain chatbots provided irrelevant information about tall bridges instead of addressing the user’s emotional state.

These findings suggest that AI chatbots are far from being reliable alternatives to human therapists. However, they may still have valuable roles in supporting therapy, such as handling administrative tasks or facilitating patient journaling.

Haber cautions, “LLMs have great potential in therapy, but we need to define their role carefully.”

This ongoing research highlights the importance of keeping human involvement in mental health care. Chatbots could complement therapists, but they shouldn’t replace them entirely.

Additional Insights

A survey from the American Psychological Association found that 76% of Americans seek support for mental health issues, highlighting the growing need for effective tools in this area. As mental health awareness increases, so does the demand for accessible care.

Experts argue that technology in therapy should enhance human connections rather than replace them. If used correctly, chatbots could play a supportive role in the mental health landscape.

Understanding the limitations of AI in therapeutic contexts is crucial. AI can assist, but human empathy and intuition remain irreplaceable assets in mental health care.

For more insights on the power and pitfalls of AI in mental health, check out this Stanford Report.



Source link

7cups,character.ai