A meeting of the minds: AI and employee mental health

Over the last few years, the shortage of mental health professionals has become a national crisis. As of March 2023, 160 million Americans live in areas where the supply of mental health practitioners is less than half of what is needed. The result: overburdened therapists, patient delays and frustration, and a loss of confidence in a failing mental health system. Everyone deserves mental health care when they need it, not three months later, as is often the case.

Employee Assistance Programs (EAPs), the first stop for many workforce members grappling with emotional and work-life challenges, are seeking new resources to address this shortage. More recently, that search has led them to artificial intelligence (AI). But to what extent can AI replace the functions performed by EAP counselors? This topic was explored at the 2023 Spring Conference of the National Behavioral Consortium (nbcgroup.org), a trade association of thought leaders from top-tier EAPs, behavioral health firms and partner companies.

The promise and pitfalls of AI
On the positive side, AI is already enabling EAPs to streamline operations, making it easier to find and recruit counselors, schedule appointments, research work-life issues, manage billing and greet new website visitors. In short, AI allows EAPs to work smarter, not harder. 

Some apps use AI algorithms to match individuals with therapists. But the process isn't foolproof. For one, AI is not yet a master of nuance, which is critical when screening employees for a match. Experienced therapists skillfully navigate the sensitivities of matching clients and therapists based on personality, diversity and personal circumstances. Getting this right matters all the more because an individual whose initial match doesn't work out may give up on therapy entirely.

"AI engines cannot duplicate the human soul," says NBC member Susan Skinner, CEO of Personal Assistance Services (PAS) in St. Louis. "Nor can they accurately read emotions or express empathy. People who are distressed need a higher level of handholding." 

To avoid these problems, top-tier EAPs conduct initial assessments and follow up via email, text or phone. This personal touch sends the message that someone cares, is standing by and is prepared to help. While AI platforms may offer a similar "check-in" function, can an algorithm get it right without human involvement? Not likely — at least not today.

"How can (A)I help?" — Enter the chatbot
Although AI may flounder at matchmaking, new AI-based chatbots such as Woebot, Wysa and Pyx Health are supporting people with mental health struggles by helping them build coping skills and reframe negative thinking, and by serving as a "concerned friend" on call 24/7.

However, some chatbots are admittedly works in progress when it comes to crisis intervention and management. In fact, they warn users of their limitations and point them to other channels of help. Chatbot developers are also testing the effectiveness of AI for conditions like postpartum depression, as well as training AI to account for cultural differences in language and personality.

Moreover, AI solutions can be beneficial to therapists themselves, not only as a complement to live therapy, but also as a tool for researching diagnoses, conditions, resources, and patient symptoms. In addition, therapists can turn to chatbots for their own support when dealing with stress, burnout, substance issues and other personal challenges. 

These benefits and others were discussed during the NBC meeting, where members analyzed chatbot pros and cons. In the plus column? Chatbots are immediately available around the clock and can offer short-term emotional support while someone waits for a therapist. They are also low- or no-cost and may be used indefinitely. Most important of all, as a stopgap measure or an adjunct to therapy, they can be lifesavers.

Nevertheless, in serious mental health cases, chatbots alone aren't enough. They require oversight. They are also less apt than a trained EAP counselor to detect emotional subtleties (and the associated dangers). As stand-alone solutions, neither chatbots nor therapists work for everyone. Therefore, a combination of the two is recommended. 

Teaching AI to learn
To expand its knowledge base, AI must continually draw on enormous amounts of data, and that learning takes time. This is particularly true when the task is identifying and addressing subtle, intricate mental health problems.

Ethically speaking, these data sets should be gathered from public sources, not from unaware or non-consenting users. In any therapy, creating a safe space for patients is crucial. Users of mental health services must feel that their information is protected and secure, and that they control how it is shared. This is one reason people may still be reluctant to have AI replace human intervention.

This reluctance is likely to fade, however, as security measures tighten and people become more comfortable with AI as part of daily life. Consequently, AI "therapists" that are sensitive, secure and well trained may be met with less resistance in the future.

Smarter together
By uniting the benefits of AI with the knowledge and experience of mental health professionals, the EAP industry is making great strides toward freeing up therapists, minimizing patient frustration, addressing non-critical mental health issues and easing the impact of the provider shortage. Employers, employees, therapists and EAPs must stay vigilant about the potential perils of AI while making the most of its promise to help.
