AI not ready for prime-time use in mental health care, experts say

Although several uses of artificial intelligence in mental health are seeing some success, experts say it’s still unclear whether it can be used on a larger scale.

Therapists use AI to review large amounts of patient data, including family history, patient behaviors, and treatment response, to aid in diagnosis and treatment identification, as well as to select therapists who can best connect with individual patients, according to a paper published by the World Economic Forum.

A study by researchers at New York University showed that AI was useful in identifying post-traumatic stress disorder in veterans.

Mental health professionals use wearable devices, such as Fitbits, to monitor sleep patterns, physical activity, and changes in heart rate and rhythm that are used to assess users’ mood and cognitive state. The devices alert patients and healthcare providers when interventions may be needed and help users change their behavior and seek help.

AI chat programs using natural language processing are used to examine therapist reports and notes, as well as conversations during patient interactions, to look for useful patterns. Researchers hope to help therapists develop better relationships with patients and identify warning signs in patients’ choice of topics and words, the World Economic Forum reported.

With the success of AI comes the risk of misuse. The forum released comprehensive guidelines and potential AI implementation strategies.

The Global Governance Toolkit for Digital Mental Health: Building Trust in Disruptive Technologies for Mental Health recommends goals, standards, ethical considerations, governance structures, and ways to encourage innovation.

The forum recognizes the current gaps and challenges for developing AI in mental health. The use of AI chat in therapy, for example, raises the question of whether the technology is optimized for consumers’ mental health outcomes or developers’ profitability, the toolkit’s authors said.

“Who ensures that information about a person’s mental health is not used unscrupulously by advertising, insurance or criminal justice systems?” the authors wrote. Such questions are troubling in light of the current regulatory structure.

A study by researchers at the University of California San Diego, La Jolla, warned that differences between traditional health care and mental health care created complications for AI systems.

While AI technology is becoming more prevalent in medicine for physical health applications, the mental health discipline has been slower to adopt AI, according to the study published in the medical journal Current Psychiatry Reports. Mental health practitioners are more hands-on and patient-centered in their clinical practice than most non-psychiatric practitioners, relying more on softer skills, including building rapport with patients and observing patients’ behaviors and emotions directly. Clinical data on mental health often comes in the form of subjective, qualitative patient reports and written notes.

While these researchers and others from the World Health Organization were optimistic that technology could address current gaps, the WHO report, Artificial Intelligence for Mental Health and Mental Illness: An Overview, concludes that it is too early to predict the future of AI in mental health care.

“We found that the use of AI applications in mental health research is unbalanced and is primarily used to study depressive disorders, schizophrenia, and other psychotic disorders. This indicates a significant gap in our understanding of how they can be used to study other mental health problems,” Dr. Ledia Lazeri, regional advisor for mental health at WHO/Europe, wrote in the report.

An article titled “Is AI the Future of Mental Healthcare?”, published in May in the European scientific journal Topoi, concluded:

“The question of whether and to what extent AI should be adopted in mental health care cannot be answered. There is too much information missing about its potential benefits and potential harms. However, it would make sense to use AI to support mental health care delivery if there is good reason to believe that AI performs better or can significantly assist human therapists.”

