Mental health chatbots can effectively engage people with depression, study shows


Clinician scientists from Nanyang Technological University, Singapore (NTU Singapore) have found that mental health chatbots are able to effectively engage people with depression in empathetic conversations and assist in the treatment of their symptoms.

Chatbots or conversational agents are computer programs that simulate human conversations. They are increasingly used in healthcare, for example, to help manage mental health conditions such as depression and anxiety and for general well-being.

A 2021 survey by Woebot Health, one of the leading therapeutic chatbot companies in the US, found that 22 per cent of adults have used a mental health chatbot, with nearly half (47 per cent) saying they would be interested in using one if needed.

This study by doctors from NTU’s Lee Kong Chian School of Medicine (LKCMedicine) is among the first to analyze user–chatbot dialogues to evaluate the effectiveness of these apps.

The researchers analyzed nine mental health chatbots from leading app stores, of which five had at least 500,000 downloads, to see whether they offered self-help for people with depression.

Of the nine chatbots, four (Marvin, Serenity, Woebot, and 7 Cups) are free to use, while the other five (Happify, InnerHour, Wooper, Wysa, and Tomo) require a subscription or a one-time purchase.

The chatbots were evaluated by the NTU research team through scripted user personas created to reflect different cultures, ages, and genders. The personas also presented behaviors reflecting varying degrees of depressive symptoms.

The study, published in December in the peer-reviewed Journal of Affective Disorders, found that all the chatbots engaged in empathetic and non-judgmental conversations with users and offered support and guidance through psychotherapeutic exercises commonly used by psychologists and counselors.

Through examination of the app interfaces and their privacy policy statements, the researchers observed that all the chatbots maintained the confidentiality of users’ personal information, such as chat history, names, or addresses that users might divulge during chat sessions, and did not transfer or store any of it.

Depression affects 264 million people globally and is undiagnosed and untreated in half of all cases, according to the World Health Organisation. In Singapore, the COVID-19 pandemic has led to an increase in mental health concerns, which include depression.

“There is still a lot of stigma surrounding mental health disorders, and the COVID-19 pandemic has significantly increased the number of people affected by mental health issues. Worldwide, healthcare systems are struggling to cope with the increased demand for mental health services. Digital health tools, including chatbots, could assist in providing timely care to individuals who may be unwilling or unable to consult a healthcare provider. Through this study, we have shown how chatbots are being used and how they engage in therapeutic conversations.”


Professor Josip Car, Director of the Centre for Population Health Sciences at NTU’s LKCMedicine and Study Leader

Chatting up chatbots to test their effectiveness

Although international research has shown that chatbots could help people, previous studies have not evaluated the dialogues between chatbots and users.

The NTU team’s content analysis evaluated the quality and effectiveness of the chatbots’ responses and looked at the level of personalization, appropriateness in supporting self-management in users with depression, and how they conveyed empathy to users.

The study also examined how the chatbots guided users to engage in or complete mood-boosting activities, how they monitored mood, and how they managed suicide risk.

The researchers said that all the chatbots displayed a “coach-like” personality that is encouraging, nurturing, and motivating. However, their analysis showed that while the chatbots could engage in empathetic conversations with users, they were not able to deliver personalized advice. This in-depth analysis of the conversational flow may be useful in helping app developers design future chatbots.

First author Dr Laura Martinengo, a research fellow from LKCMedicine, said: “Chatbots are not yet able to provide personalized advice and do not ask enough personal questions – possibly to avoid breaching user anonymity. However, these chatbots could still be a useful alternative for individuals in need, especially those who are not able to access medical help. For some people, it’s easier to talk to a machine than a human being.”

While chatbots may support the self-management of depression and other mental health disorders, the researchers said that further research is needed to improve chatbots for individuals at risk of suicide and to evaluate the long-term effectiveness of chatbot-led interventions for mental health.

The researchers will conduct further studies to expand the scope, quality, and safety of their research, including examining the effectiveness of other digital tools for mental well-being.
