JEA Q3 2019

Tech Trends

Chatbots: Opportunities for EAPs?

By Marina London, LCSW, CEAP

This past February, I came across an article in Fast Company by Ciara Byrne, “Could talking to a bot help you feel better?” It provided a good summary of the history and current state of the art of chatbots, as well as their promise, but said little about their downsides.
To summarize the article’s key points: One of the world’s first chatbots was a therapist. Built in 1964, the program, called ELIZA, was designed to mimic techniques from Rogerian psychotherapy, in which the therapist prompts the patient to examine their own thoughts and feelings. ELIZA had no memory or understanding of the conversation. It merely searched for a keyword in the last sentence typed in by the user and calculated an answer using a rule associated with that keyword.

Here is an example, from a famous ELIZA transcript (ELIZA’s replies appear in capitals, as in the original):

User: Well, my boyfriend made me come here.
ELIZA: YOUR BOYFRIEND MADE YOU COME HERE
User: He says I’m depressed much of the time.
ELIZA: I AM SORRY TO HEAR YOU ARE DEPRESSED
User: It’s true. I am unhappy.
ELIZA: DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE UNHAPPY
User: I need some help. That much seems certain.
ELIZA: WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP

Even though ELIZA’s responses seem almost comically Rogerian, many users became convinced that ELIZA understood them. Users would often demand to be permitted to converse with the system in private. This phenomenon became known as “the ELIZA effect.”
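ELIZA’s keyword-and-rule mechanism is simple enough to sketch in a few lines. The rules and pronoun reflections below are invented for illustration; they are not Weizenbaum’s originals:

```python
import re

# Each rule maps a keyword pattern to a response template; "{0}" is filled
# with the (pronoun-reflected) text captured after the keyword.
RULES = [
    (re.compile(r"\bmy (.+)", re.I), "Tell me more about your {0}."),
    (re.compile(r"\bi am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bi need (.+)", re.I), "What would it mean to you if you got {0}?"),
]
DEFAULT = "Please go on."

# Simple pronoun reflection so echoed text reads naturally.
REFLECT = {"my": "your", "i": "you", "am": "are", "me": "you"}

def reflect(text: str) -> str:
    return " ".join(REFLECT.get(word.lower(), word) for word in text.split())

def respond(sentence: str) -> str:
    # Scan for the first matching keyword rule -- no memory, no understanding.
    for pattern, template in RULES:
        match = pattern.search(sentence)
        if match:
            return template.format(reflect(match.group(1).rstrip(".!?")))
    return DEFAULT

print(respond("I need some help."))  # What would it mean to you if you got some help?
```

The striking point is how little machinery is involved: a handful of pattern-to-template rules and a word-substitution table are enough to produce exchanges that users experienced as understanding.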

Today we are surrounded by chatbots and voice analysis apps, a growing number of which are geared toward improving how we feel. Aimed at users who suffer from anxiety, depression, bipolar disorder, PTSD, or simply from stress, chatbots claim to be able to identify the mood or condition of the user, and in many cases can also offer advice or suggest therapeutic exercises.

Chatbots for Substance Use Disorders
There are even chatbots for substance use disorders. In 2017, in response to the opioid epidemic, the media conglomerate Viacom launched a corporate responsibility initiative called “Listen,” which aimed to change the national addiction conversation. Kodi Foster, a senior vice president of data strategy at Viacom, thought that “Listen” could do more than make public service announcements.

Foster knew that messaging apps were continuously increasing in popularity and that therapy over text message had been shown to encourage people to share uncomfortable information. Research from the University of Southern California’s Institute for Creative Technologies had found that U.S. veterans returning from Afghanistan were more willing to disclose symptoms of PTSD to a virtual interviewer.

So Foster partnered with a tech company called Stndby to build a chatbot for people struggling with addiction and their supporters, accessible via a website. The chatbot, which interacted with users via text message, was designed to detect long-term personality traits, measure short-term psychological states, and offer support and therapeutic exercises accordingly.

As with ELIZA, many users had emotional interactions with this chatbot and thanked it for its help. One participant struggling with domestic problems and opioid abuse even sent the bot photos of her vacation at Disneyland with her children. “Hey, I know you are not real but I just wanted to send these pictures of my family out at Disneyland having a great time,” the user told the bot. “I’m doing better now. Thank you.”

Rauws Unveils Tess
At EAPA’s 2018 Conference and EXPO, Michiel Rauws, the CEO of X2AI, demonstrated “Tess,” a chatbot initially created to provide mental health support in parts of the world where professional counselors are in short supply or even nonexistent. Rauws now wants to partner with EAPs to offer “Tess” to employees and their dependents.

Tess is not designed to replace EAP counseling but rather to be offered as an adjunct service, whether to tide EAP clients over with support while they wait for their face-to-face sessions or even as they wait for a referral. Tess could even help employees self-identify as needing employee assistance services.

For all of the supposed benefits of mental health and counseling bots, critics have questioned their safety and pointed to a lack of regulation. Others have wondered whether a reliance on bots and screens might deprive people of the benefits of real-life communication and connection. These concerns about connection coincide with a rise in loneliness. Recent research on the placebo effect suggests that the effect may actually be a biological response to an act of caring.

A recent study explains that human beings “evolved in an environment which did not require them to distinguish between authentic and simulated relationships.” So when people interact with a non-human listener, they may feel as though they are dealing with a sentient being who cares about them. Although Tess users are explicitly reminded that she is not a real person, they act as if she is.

At the end of the Fast Company article, the author notes: “In a society where people seek constant validation via social media, yet feel chronically lonely, can non-human listeners ease our sense of isolation and the problems that result from it, or could these listeners become the ultimate ‘online only’ friend, addressing our basic human need for connection and caring in a way that ultimately leaves us even more alone?”

Food for thought given that other forms of artificial reality are right around the corner.

Marina London is the Director of Communications for EAPA and author of iWebU, a weekly blog for mental health and EA professionals who are challenged by social media and Internet technologies. She previously served as an executive for several national EAP and managed mental health care firms. She can be reached at


Journal of Employee Assistance, Q1 2019: “X2AI Tess: Working with AI Technology Partners,” by Michiel Rauws, MScBA; John Quick, PhD; Nancy Spangler, PhD, OTR/L; and Angela Joerin, MS, LLP.

Breakout sessions from EAPA’s 2018 Conference and EXPO, “Artificial Intelligence: Delivering Affordable, On-Demand, and Evidence-Based Support to Millions of Employees,” parts 1 and 2, can be viewed here: