Introducing AI Therapists
Would you trust an AI therapist? Mental health chatbots are becoming increasingly common in the field of psychology.
By Adam Westin
March 12, 2020
According to research conducted by the National Alliance on Mental Illness, nearly 20 percent of US adults experience some form of mental illness and less than half of them receive treatment.
With suicide rates skyrocketing in this country over the past decade, to the point that suicide is now recognized as the second-leading cause of death among Americans between the ages of 10 and 34, the question arises: what can be done?
AI for mental health
Enter mental health chatbots.
Artificial intelligence (AI) therapy arose in response to an unfortunate truth: stigma and judgment are among the top reasons that people do not seek the help they need to cope with mental illness.
Many people fear that opting to regularly see a licensed professional means that they are sick—and that sick people are, in some way, “less than.”
Therefore, people continue to suffer in silence, neglecting their mental health and perpetuating their condition—all to prevent discomfort for those around them.
It’s an epidemic.
Need-based innovation
This issue has led to impressive innovation from technology developers across the country.
To make treatment more approachable and attainable from the patient's perspective, companies are creating mental health chatbots: virtual therapists that offer inexpensive, convenient, and highly accessible treatment recommendations for people suffering from mild-to-moderate anxiety and depression.
These treatments typically begin with an automated assessment of a person's current mental state, which yields a customized treatment plan and suggestions for improving their mental health through cognitive behavioral therapy (CBT).
CBT is a form of psychotherapy designed to retrain a person's thought processes and reverse patterns of doubt and rumination. Solutions range from brief recommendations for breathing and meditation exercises to in-depth, extensive therapy plans.
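To make this concrete, here is a minimal sketch in Python of how an automated assessment might map to tiered CBT-style recommendations. Everything in it is an assumption for illustration: the scoring is loosely inspired by standard screening questionnaires, and the thresholds and suggestions are invented, not clinical guidance or any vendor's actual algorithm.

```python
# Hypothetical sketch: map a self-reported check-in score to a tiered
# CBT-style recommendation. Thresholds and wording are illustrative only.

def assess(responses: list[int]) -> int:
    """Sum item scores from a short check-in questionnaire (0-3 each)."""
    return sum(responses)

def recommend(score: int) -> str:
    """Return a treatment suggestion tier based on the assessment score."""
    if score < 5:
        return "Brief exercise: 5 minutes of guided breathing or meditation."
    elif score < 10:
        return "Daily CBT module: thought-record journaling to spot rumination."
    else:
        return "Extended plan: structured CBT program plus a referral prompt."

# Example: a user reports moderate symptoms across four check-in questions.
print(recommend(assess([2, 2, 1, 2])))  # -> daily CBT module
```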
Virtual therapy
Connecting patients with effective therapy and comprehensive care plans without human interaction is no easy feat, and therapy chatbot technology is still relatively new. It faces a few key issues that are inhibiting its success.
For example, therapy chatbots send automated check-in notifications prompting the user to report on their current mental state, often several times a day at regular intervals.
Over time, the software learns patterns of behavior and thought from these check-ins and can provide more intelligent suggestions.
These reminders, although intended to be helpful and to promote consistency, can lead to higher levels of anxiety if users are unable to log in to the app at that precise moment.
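One plausible way such check-ins could feed pattern learning is a simple running score over recent entries. The sketch below is purely illustrative, with an invented data model, window size, and threshold rather than any real product's logic.

```python
# Hypothetical sketch: store timestamped mood check-ins and flag a
# worsening trend with a simple moving average. The 7-entry window and
# threshold are illustrative; real systems would use richer models.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MoodLog:
    entries: list[tuple[datetime, int]] = field(default_factory=list)

    def check_in(self, mood: int) -> None:
        """Record a mood rating (1 = very low, 5 = very good)."""
        self.entries.append((datetime.now(), mood))

    def trend_worsening(self, window: int = 7, threshold: float = 2.0) -> bool:
        """Flag if the average of the most recent ratings falls below the threshold."""
        recent = [mood for _, mood in self.entries[-window:]]
        return len(recent) == window and sum(recent) / window < threshold

log = MoodLog()
for rating in [3, 2, 2, 1, 2, 1, 1]:  # a week of low check-ins
    log.check_in(rating)
print(log.trend_worsening())  # -> True: escalate to a more intensive suggestion
```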
Additionally, several apps include open-response questions rather than preset suggested responses. This open format can be overwhelming and may result in users abandoning the app.
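The design trade-off between the two formats is easy to see in a toy sketch; the prompt wording and options below are entirely hypothetical.

```python
# Hypothetical sketch: a preset-response prompt keeps check-ins quick and
# bounded, while an open-ended prompt asks more of the user.

PRESET_OPTIONS = ["Great", "Okay", "Struggling", "Skip for now"]

def preset_prompt() -> str:
    """Render a check-in with tappable preset responses."""
    lines = ["How are you feeling right now?"]
    lines += [f"  [{i}] {opt}" for i, opt in enumerate(PRESET_OPTIONS, 1)]
    return "\n".join(lines)

def open_prompt() -> str:
    """An open-ended version: flexible, but easier to abandon."""
    return "Describe how you are feeling right now:"

print(preset_prompt())
```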
Removing human judgment
AI therapy chatbots have a propensity to sound, well, like a bot. Although this isn’t surprising, it’s problematic for patients who crave human interaction without human judgment.
Where consistency is concerned, patients are much less likely to skip a scheduled appointment with a human being than to dismiss a round of automated questions on their phone.
In spite of these hurdles, AI therapists represent a massive stride in the right direction for treating the growing mental health epidemic by essentially eliminating barriers between patients and treatment.
The industry is working through a process of continuous improvement and, though it has a long way to go, applications have seen notable success in treating mild-to-moderate cases of depression and anxiety. Mental health chatbot technology continues to improve as more users adopt it and provide essential feedback.
The information provided in this article is for general informational purposes only. It is not a supplement to, and NOT a substitute for, the knowledge, skill, and judgment of qualified psychiatrists, psychologists, physicians, and other health care professionals.