Topeka, Kan. – As artificial intelligence (AI) becomes a bigger part of our daily lives, experts in mental health are warning against relying on it for support. A troubling case from California highlights these concerns: a family is suing, claiming that ChatGPT played a role in their son’s tragic suicide.
Dr. Abby Callis, a clinical psychologist at Stormont Vail Behavioral Health in Topeka, shares insight into the issue. She says that while AI can help reduce feelings of loneliness in the short term, over time, it may actually make loneliness worse. This is especially true for those who interact with voice-based chatbots.
Research supports her view. Studies indicate that the more we rely on AI, the less we engage our own critical thinking skills. “We start to trust it and follow its suggestions without thinking for ourselves, which is worrying,” Dr. Callis explains. Critical thinking helps us make informed decisions based on our unique experiences, something AI simply cannot mimic.
AI uses algorithms and the information we provide to generate responses. Unfortunately, this means it might offer advice that sounds good but doesn’t consider the full context of our situation. For example, an AI might identify the keyword “suicide” and direct someone to resources without any follow-up support. “It can make an initial connection, but it can’t provide the deeper understanding that humans can,” notes Dr. Callis.
In her practice, Dr. Callis emphasizes the difference between human interaction and AI. If a person asks about the tallest bridge in their city, she would explore the reason behind the question. In contrast, AI would simply give an answer. “If someone shares a struggle and gets praised for trying something challenging, AI misses the context of why that action might not have been the best choice,” she adds.
Dr. Callis reminds us that AI is not a substitute for human connection. “It lacks the ability to truly understand you,” she says. Some therapists may use AI tools for journaling or developing coping strategies, but these sessions should always be followed by in-person discussions to ensure a deeper understanding.
The debate around AI in mental health continues to grow. A 2023 survey from the American Psychological Association found that 70% of mental health professionals believe AI could help improve client outcomes, though many stressed that it must be used with caution. While AI can be a helpful resource, it cannot replace the vital nuances of human interaction.
For those seeking mental health resources, visit wibw.com/HearMeSeeMe.
As we move forward, it’s essential to balance technology’s advantages with keeping our connections with others strong.

