University of Colorado Pursues Role for AI in Psychiatry

Researchers are striving to apply machine learning to psychiatry through a speech-based mobile app that can categorize a person’s mental health status as well as, or even better than, a human clinician can.

(TNS) -- In the prevailing there’s-an-app-for-that culture, perhaps it should not be surprising that researchers are exploring machine learning that could bring artificial intelligence to the practice of psychiatric diagnosis.

Peter Foltz, a research professor at the Institute of Cognitive Science at the University of Colorado Boulder, is co-author on a new paper in Schizophrenia Bulletin that lays out the potential payoffs and possible pitfalls of AI in psychiatry.

And, with co-author Brita Elvevåg, a cognitive neuroscientist at the University of Tromsø, Norway, Foltz is striving to apply machine learning — a subset of AI — to psychiatry through a speech-based mobile app that can categorize a person’s mental health status as well as, or even better than, a human clinician can.

“The goal is not to replace what a clinician does,” Foltz said. “The goal is to be able to give them better tools to help access more information about their patients.”

Foltz said he and Elvevåg met a number of years ago at a conference where he was giving a talk about how to analyze coherence in language using technology, and she was doing research in schizophrenia at the National Institute of Mental Health.

Referencing Foltz’s work, she asked, “Hey, could this be applied to detecting issues of serious mental illness?” Foltz recalled. “And I said, ‘I always wanted to do this, but I don’t have the data.’ And she said she had the data. Since then, we have collaborated.”

According to the National Alliance on Mental Illness, 1 in 5 adults in the U.S. experiences mental illness each year. And for many of those people, a clinician might not be accessible, because they live in a remote location or face some other barrier to in-person care.

Also, when a patient does make it to a face-to-face appointment, therapists base their diagnosis and treatment plan largely on listening to a patient talk — which can be subjective and unreliable, according to Elvevåg.

“Humans are not perfect. They can get distracted and sometimes miss out on subtle speech cues and warning signs,” Elvevåg said in a statement. “Unfortunately, there is no objective blood test for mental health.”

Together, Foltz and Elvevåg are developing machine learning technology able to detect day-to-day changes in speech that can suggest mental health deterioration, such as disjointed speech or sentences that don’t follow a logical pattern, which can be symptomatic of schizophrenia. Shifts in tone, or even pace, can suggest mania or depression.
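To make the idea concrete: one very simple way to quantify “sentences that don’t follow a logical pattern” is to score lexical overlap between adjacent sentences and watch for consistently low scores. The sketch below is an illustration only, with invented function names and example sentences; it is not the CU Boulder team’s actual model, which would rely on far richer language analysis.

```python
# Hypothetical illustration: flag "disjointed" speech by measuring
# how much each sentence overlaps with the one before it.

def tokenize(sentence: str) -> set[str]:
    """Lowercase a sentence and return its set of word tokens."""
    return {w.strip(".,!?;:").lower() for w in sentence.split() if w}

def coherence_scores(sentences: list[str]) -> list[float]:
    """Jaccard overlap between each pair of adjacent sentences."""
    scores = []
    for a, b in zip(sentences, sentences[1:]):
        ta, tb = tokenize(a), tokenize(b)
        union = ta | tb
        scores.append(len(ta & tb) / len(union) if union else 0.0)
    return scores

sample = [
    "I went to the store this morning.",
    "The store was out of bread this morning.",
    "Purple clocks argue with the ocean.",
]
print(coherence_scores(sample))  # the abrupt topic shift yields a much lower score
```

A real system would use semantic similarity rather than raw word overlap, but the intuition is the same: a sharp, sustained drop in sentence-to-sentence coherence is a measurable warning sign.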

“In our case, it’s learning how to characterize language and how people say things, when comparing the language of people with mental illness versus the language of people who don’t have mental illness,” Foltz said.

“One of the things we’re looking at is, how could you set up ways where you could do better remote monitoring of patients, where they don’t have to come into the office, and they could still alert a clinician when there are indications that a patient might need help, or follow-up?”

Collaborating with Chelsea Chandler, a computer science graduate student at CU Boulder, as well as other colleagues, Foltz and Elvevåg developed an AI system that assesses speech samples and compares them to previous samples by the same patient, as well as the broader population. It then rates the patient’s mental state, according to a news release.
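The comparison step described above can be sketched as simple anomaly detection: score a new speech sample against the patient’s own history and flag large deviations for clinician follow-up. The feature names, values, and threshold below are invented for illustration; the published system’s features and statistics differ.

```python
# Hypothetical sketch: compare today's speech features to a patient's
# baseline using z-scores and flag sharp deviations.
from statistics import mean, stdev

def z_score(value: float, history: list[float]) -> float:
    """Standard score of `value` relative to prior measurements."""
    if len(history) < 2:
        return 0.0
    s = stdev(history)
    return (value - mean(history)) / s if s else 0.0

def flag_sample(new_features: dict, patient_history: dict,
                threshold: float = 2.0) -> list[str]:
    """Return feature names that deviate sharply from this patient's baseline."""
    flags = []
    for name, value in new_features.items():
        if abs(z_score(value, patient_history.get(name, []))) > threshold:
            flags.append(name)
    return flags

history = {"coherence": [0.71, 0.68, 0.70, 0.69],
           "speech_rate": [2.1, 2.0, 2.2, 2.1]}
today = {"coherence": 0.45, "speech_rate": 2.1}
print(flag_sample(today, history))  # the coherence drop is flagged
```

Comparing against the patient’s own history, as well as the broader population, lets the system distinguish a genuine change in state from a person’s ordinary speaking style.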

In one recent study, the researchers asked human clinicians to listen to and assess speech samples from 225 participants — half with severe psychiatric issues, the other half healthy volunteers — in rural Louisiana and Northern Norway. They then compared those assessments to the machine learning system’s.

What researchers discovered is that the computer’s AI models can be at least as accurate as clinicians, Foltz said.

Being conservative, Foltz said, “I believe there’s a lot to be done to determine how to best get an app that is generalizable to a wide population and also fits in well with clinical practice. And so, I am saying it’s on the order of five or more years” before a clinical use of the app might be ready.

“But I think we’re pointing in the direction,” he added. “We’re showing we can move in that direction now.”

The Chandler-Foltz-Elvevåg paper in Schizophrenia Bulletin, titled “Using Machine Learning in Psychiatry: The Need to Establish a Framework that Nurtures Trustworthiness,” emphasizes that more work needs to be done before AI can have a role in everyday psychiatry.

“We are working on setting up some ways to develop, and to evaluate the effectiveness and appropriateness, of applying artificial intelligence in the field of psychiatry,” Foltz said.

©2019 the Daily Camera (Boulder, Colo.), Distributed by Tribune Content Agency, LLC.