OPINION

Can Chatbots Provide Trauma-Informed Survivor Care?

[Image: medical chatbot (Irina Usmanova/Dreamstime.com)]

By Wendy L. Patrick | Saturday, 11 November 2023 05:09 AM EST

Confiding in Artificial Assistants

Most people are familiar with the use of website chatbots to assist consumers with questions that arise when navigating online, such as booking travel, researching issues, and other common virtual tasks.

But . . . what about when an individual in crisis needs "someone" to talk to?

Researchers have already been actively exploring the use of artificial intelligence (AI) chatbots within digital mental health interventions.

There has also been a recent increase in both the use and acceptance of chatbots to help survivors of abuse, disaster, and other calamities work through the painful consequences of trauma.

Confiding in an Artificial Confidant

Trauma victims experience a wide range of physical and emotional effects that fortunately can be improved through trauma-informed counseling and therapy in both individual and group settings.

Most people envision trauma-informed care as involving a face-to-face session or a compassionate support group meeting with a circular gathering of other survivors.

Yet a growing number of people are finding support from artificial counselors — to whom some of them feel more comfortable disclosing explosive, embarrassing, shameful, or other highly sensitive details.

Research explains:

In a 2020 study, "AI-Powered Mental Health Chatbots," Yang Cheng and Hua Jiang examined the motivations for using chatbot support after mass shootings.

They began by recognizing that despite the increasing number of mental health issues linked to mass shootings, most people in the United States remain reluctant to seek assistance, whether formally, through social support, or through clinical medical treatment.

They note that the use of AI has sparked innovation in mental healthcare services as well as in crisis and emergency management.

In their research, Cheng and Jiang (ibid.) surveyed 1,114 participants in the United States who used chatbot services from top healthcare companies, investigating how AI shapes individual motivation, communication, and engagement behaviors aimed at ameliorating mental health issues.

Unlike the sometimes lengthy process of booking an office visit or finding a support group close enough to attend, they found that AI-powered chatbots let users interact with virtual therapists instantly and anywhere, and can offer new online methods of treating mental illness through "effective and tailored dialogues."

As an example, Cheng and Jiang (supra) discuss the automated conversational chatbot "Woebot," which was found to significantly reduce symptoms of depression among college students and to be met with more receptivity than traditional therapies.

They also noted that, according to a social network analysis of a Twitter data set and qualitative content analysis of expert interviews, chatbots can be widely applied by emergency management organizations to interact with members of the public on social media platforms.

Within the context of mass shootings in the United States, Cheng and Jiang's (supra) survey results revealed that customers' motivations to use chatbot services from top healthcare companies after such events included social presence, perceived enjoyment, information browsing, and the media appeal of AI-powered online tools.

They additionally found that customers' motives of self-gratification and of protecting others similarly affected by mental illness had a positive impact on their own active communicative action.

Effectively Talking Through Trauma Requires a Human Touch

AI support can provide nonjudgmental, easily accessible trauma care in some circumstances traditionally addressed solely through in-office contact.

As we continue to follow the evolution of alternative therapies, we are reminded that nothing can replace the importance of human contact, because AI cannot replicate the chemistry, connection, and rapport developed through interpersonal interaction.

However, because other people are not always immediately available to persons in crisis, we continue to explore how to use AI as supplemental assistance.

This article was originally published in Psychology Today, and is used with the permission of its author.

Wendy L. Patrick, JD, MDiv, Ph.D., is an award-winning career trial attorney and media commentator. She is host of "Live with Dr. Wendy" on KCBQ, and a daily guest on other media outlets, delivering a lively mix of flash, substance, and style. Read Dr. Wendy L. Patrick's Reports — More Here.
