
Chatbots: The Future of Psychotherapy?

by Linnea Sepe-Forrest, Indiana University Bloomington

In an ideal world, there would be enough therapists to meet everyone’s needs across the world. Therapists would not experience burnout or require high compensation for their services. While this sounds like an impossible feat, many companies are attempting to address unmet therapeutic needs by providing psychotherapy through artificial intelligence (AI) programs known as chatbots. A chatbot uses natural language processing to interpret messages and respond to clients, thereby simulating human conversation (Dale, 2016). You have likely encountered chatbots while shopping online, changing travel reservations, or submitting requests for technical assistance. Although chatbots have primarily been used in business settings, they are now being employed within the mental health field to deliver various forms of psychotherapy, including cognitive behavioral therapy (CBT), motivational interviewing, and motivational enhancement therapy (Bendig, Erb, Schulze-Thuesing, & Baumeister, 2019; Dilmegani, 2021; Fitzpatrick, Darcy, & Vierhile, 2017; Park et al., 2019).
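To make the underlying mechanics concrete, here is a minimal, purely illustrative sketch of the simplest kind of rule-based chatbot; every pattern and reply below is hypothetical, and real therapeutic agents such as Woebot rely on far more sophisticated natural language processing than this.

```python
# Purely illustrative sketch of a rule-based chatbot loop (hypothetical;
# real systems use far richer natural language processing than keyword rules).
import re

# Each rule pairs a keyword pattern with a scripted, therapy-flavored reply.
RULES = [
    (re.compile(r"\b(sad|down|depressed)\b", re.I),
     "I'm sorry you're feeling low. What do you think is contributing to that?"),
    (re.compile(r"\b(anxious|worried|stressed)\b", re.I),
     "That sounds stressful. Can you tell me more about what's worrying you?"),
]
FALLBACK = "I want to make sure I understand. Could you say more about that?"

def reply(message: str) -> str:
    """Return the first matching scripted reply, or a generic fallback."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return FALLBACK

print(reply("I've been feeling really anxious about work."))
```

Even this toy example hints at why early chatbots can feel scripted: the reply is selected by surface pattern matching, not by genuine understanding of the message.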

Chatbots have the potential to overcome barriers to mental health care by increasing accessibility in multiple ways. In addition to delivering therapy on a near-infinite scale, chatbots offer the option of anonymity. This may be especially important for reducing barriers globally, as mental health is heavily stigmatized in certain regions of the world (Lovejoy, 2019). Research has shown that individuals who are hesitant to share their concerns with another human being may feel more comfortable communicating with a robot (Oracle & Workplace Intelligence, 2021). Studies using the CBT-based application “Woebot” found that, despite clients’ awareness that they were speaking with a robot, messages within the chat reflected bonding similar to a traditional therapeutic alliance (Bickmore, Gruber, & Picard, 2005). Furthermore, prior studies indicate that users disclose more deeply to a chatbot that itself self-discloses, and that users are willing to create conversation logs with the chatbot that can later be shared with their mental health provider (Lee, Yamashita, & Huang, 2020). For these reasons, many argue that chatbots would be most successful as a tool paired with traditional therapy (Kretzschmar et al., 2019).


Although chatbots have the capability to improve many aspects of traditional therapy, there are hazards that both developers and users may need to consider, especially before replacing human-delivered psychotherapy. Many argue these tools are being implemented prematurely, as AI cannot currently understand the subtleties of human language and interaction (Kretzschmar et al., 2019). For example, one client reported that the application Woebot responded to a complaint about an unappreciative boss by saying, “That sounds difficult. Does this usually happen in the morning or night?” (Brown, 2021). Such inadequate robotic responses have the potential to make someone who is already in a vulnerable state feel worse. Indeed, a primary complaint cited across many studies is the chatbots’ inability to understand users’ responses, resulting in shallow conversations (Abd-Alrazaq et al., 2021). This lack of sophistication may be especially dangerous if chatbots are used as the primary intervention in high-risk situations. Although many applications are programmed to identify suicidal language and connect individuals with human counselors, computer programs may misinterpret nuanced language or fail to detect physical cues that an individual is at risk (Miner et al., 2016).
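To illustrate why such screening is brittle, consider a deliberately simplified, hypothetical keyword screen of the kind a chatbot might use to flag crisis language for human follow-up; this is not the logic of any real application, and the terms below are assumptions for the example.

```python
# Hypothetical keyword screen for crisis language (illustrative only; not
# the logic of any real application). Its brittleness shows the concern:
# surface matching misreads negation and misses indirect phrasing entirely.
RISK_TERMS = ("suicide", "kill myself", "hurt myself", "end my life")

def needs_escalation(message: str) -> bool:
    """Flag a message for human counselor review if it contains a risk term."""
    text = message.lower()
    return any(term in text for term in RISK_TERMS)

print(needs_escalation("I would never hurt myself"))   # True: a false positive
print(needs_escalation("I can't see a way forward"))   # False: a possible miss
```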

Whether chatbots can effectively replace or complement traditional therapy will depend on each user’s specific needs and on technological advancements in the AI field. If these programs become advanced enough to form therapeutic alliances and deliver evidence-based therapy on a near-infinite scale, they could transform the mental health field entirely. Chatbots could also increase the efficiency of clinical practice by taking on certain routine duties. This could change the training of future psychologists, who may learn how and when to transfer tasks such as conveying basic information to chatbots while devoting most face-to-face therapeutic hours to addressing the client’s individual needs. Despite these advantages, critics argue that increased precision-based care through algorithmic technology may create additional hurdles, such as racial and gender biases that will be difficult to monitor without human intervention (Brown, 2021; Schlesinger, O’Hara, & Taylor, 2018). Therapists who prescribe chatbots may therefore be liable for these behaviors and responsible for ensuring that the treatment delivered is consistent with the ethics code that governs their own practice. It is therefore important to tread with caution while continuing to expand the capabilities of technological applications that may help address the global mental health crisis.

________________________________________________________________

References

Abd-Alrazaq, A. A., Alajlani, M., Ali, N., Denecke, K., Bewick, B. M., & Househ, M. (2021). Perceptions and opinions of patients about mental health chatbots: Scoping review. Journal of Medical Internet Research, 23(1), e17828. https://doi.org/10.2196/17828

Bendig, E., Erb, B., Schulze-Thuesing, L., & Baumeister, H. (2019). Die nächste Generation: Chatbots in der klinischen Psychologie und Psychotherapie zur Förderung mentaler Gesundheit – Ein Scoping-Review [The next generation: Chatbots in clinical psychology and psychotherapy for the promotion of mental health – A scoping review]. Verhaltenstherapie, 29(4), 266–280. https://doi.org/10.1159/000499492

Bickmore, T., Gruber, A., & Picard, R. (2005). Establishing the computer–patient working alliance in automated health behavior change interventions. Patient Education and Counseling, 59(1), 21–30. https://doi.org/10.1016/j.pec.2004.09.008

Brown, K. (2021). Something bothering you? Tell it to Woebot. The New York Times.

Dale, R. (2016). The return of the chatbots. Natural Language Engineering, 22(5), 811–817. https://doi.org/10.1017/S1351324916000243

Dilmegani, C. (2021). Top 36 chatbot applications / use cases in 2021: In-depth guide. Retrieved from https://research.aimultiple.com/chatbot-applications/

Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): A randomized controlled trial. JMIR Mental Health, 4(2), e19. https://doi.org/10.2196/mental.7785

Kretzschmar, K., Tyroll, H., Pavarini, G., Manzini, A., Singh, I., & NeurOx Young People’s Advisory Group. (2019). Can your phone be your therapist? Young people’s ethical perspectives on the use of fully automated conversational agents (chatbots) in mental health support. Biomedical Informatics Insights, 11, 1178222619829083. https://doi.org/10.1177/1178222619829083

Lee, Y., Yamashita, N., & Huang, Y. (2020). Designing a chatbot as a mediator for promoting deep self-disclosure to a real mental health professional. Proceedings of the ACM on Human-Computer Interaction, 4(CSCW1), 1–27. https://doi.org/10.1145/3392836

Lovejoy, C. A. (2019). Technology and mental health: The role of artificial intelligence. European Psychiatry, 55, 1–3. https://doi.org/10.1016/j.eurpsy.2018.08.004

Miner, A., Kuhn, E., Hoffman, J. E., Owen, J. E., Ruzek, J. I., & Taylor, C. B. (2016). Feasibility, acceptability, and potential efficacy of the PTSD Coach app: A pilot randomized controlled trial with community trauma survivors. Psychological Trauma: Theory, Research, Practice, and Policy, 8(3), 384–392. https://doi.org/10.1037/tra0000092

Oracle & Workplace Intelligence. (2021). Mental health at work requires attention, nuance, and swift action. AI@Work Study, 2.

Park, S., Choi, J., Lee, S., Oh, C., Kim, C., La, S., . . . Suh, B. (2019). Designing a chatbot for a brief motivational interview on stress management: Qualitative case study. Journal of Medical Internet Research, 21(4), e12231. https://doi.org/10.2196/12231

Schlesinger, A., O’Hara, K. P., & Taylor, A. S. (2018). Let’s talk about race: Identity, chatbots, and AI. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (Paper No. 315). Association for Computing Machinery.

Disclaimer: The views and opinions expressed in this newsletter are those of the authors alone and do not necessarily reflect the official policy or position of the Psychological Clinical Science Accreditation System (PCSAS).

