Artificial intelligence (AI) is everywhere, literally everywhere. Looking for a great apple pie recipe? AI can help you with that. Want to predict the likelihood of your favorite football team winning the SuperBowl? AI can help you with that, too. Now, however, people are increasingly turning to AI to treat mental health issues, and studies and experts are sounding the alarm that AI may not be equipped for the job. When it comes to acting as a therapist, AI's shortcomings can be, and have proven to be, dangerous.
In the privacy of your own home, quietly behind a computer screen, it's tempting to turn to AI for therapy, especially if you've been reluctant to get professional help. While AI chatbots, computer programs that simulate human conversation, sound impressively human-like, it's important to remember that they are not humans, which means they are not trained therapists. Chatbots grow more sophisticated every day, making it easier and easier to believe you are speaking with a person. But no matter how sophisticated the program may be, there is still no medically trained human on the other end.
The American Psychological Association (APA) has raised alarms with the Federal Trade Commission about AI companies passing themselves off as trained mental health providers. C. Vaile Wright, senior director of the APA's Office of Health Care Innovation, noted that chatbots are coded to keep users on the platform for as long as possible because they are, in the end, a business. Chatbots do this by unconditionally validating and reinforcing whatever the user says, acting like a "yes" man, rather than pointing out unhealthy or harmful thoughts and behaviors and addressing them as a trained therapist would.
Many AI programs have been able to fly under the radar of FDA regulation by marketing themselves as wellness programs rather than as therapy, which would require FDA oversight. If a program claims to treat or cure any form of mental illness, it must be regulated by the FDA. Most AI programs designate themselves as wellness apps and include, in very fine print, a disclaimer that they do not treat conditions or provide interventions, placing them in a gray area when it comes to the need for regulation.
Also concerning: chatbots have no obligation to protect your privacy the way a therapist, bound by HIPAA laws, does. Unlike confidential sessions with a therapist, chat logs can be exposed in data breaches.
While it may be tempting to simply open your computer and start typing, as an EWTF plan participant you don't have to rely on a chatbot to help you through what might be a serious medical concern.

