As suicide rates spike, new AI platform could ‘fill the gap’ in mental health care, say Boston researchers

After a two-year decline, U.S. suicide rates spiked again in 2021, according to a new report from the Centers for Disease Control and Prevention (CDC).

Suicide is now the 11th leading cause of death in the nation, and the second among people between 10 and 34 years of age and the fifth among those aged 35 to 54, per the report. 

As the need for mental health care escalates, the U.S. is grappling with a shortage of providers. To help fill this gap, some medical technology companies have turned to artificial intelligence as a means of potentially making providers' jobs easier and patient care more accessible. 


Yet there are caveats. Read on. 

The state of mental health care

More than 160 million people currently live in "mental health professional shortage areas," according to the Health Resources and Services Administration (HRSA), an agency of the U.S. Department of Health and Human Services.  

By 2024, the total number of psychiatrists is expected to reach a new low, with a projected shortage of between 14,280 and 31,091 individuals. 

"Lack of funding from the government, a shortage of providers, and ongoing stigma regarding mental health treatment are some of the biggest barriers," Dr. Meghan Marcum, chief psychologist at AMFM Healthcare in Orange County, California, told Fox News Digital. 

Some medical tech companies have turned to artificial intelligence as a means of improving providers' jobs and making patient care more accessible. (iStock)

"Wait lists for therapy can be long, and some individuals need specialized services like addiction or eating disorder treatment, making it hard to know where to start when it comes to finding the right provider," Marcum also said. 

Elevating mental health care with AI

A Boston, Massachusetts-based medical data company called OM1 recently built an AI-based platform, called PHenOM, for physicians. 

The tool pulls data from over 9,000 clinicians working in 2,500 locations across all 50 states, according to Dr. Carl Marci, chief psychiatrist and managing director of mental health and neuroscience at OM1.

More than 160 million people live in "mental health professional shortage areas."

Physicians can use that data to track trends in depression, anxiety, suicidal tendencies and other mental health conditions, the doctor said.

"Part of the reason we're having this mental health crisis is that we haven't been able to bring new tools, technologies and treatments to the bedside as quickly as we'd like," said Dr. Marci, who has also been running a small clinical practice through Mass General Brigham in Boston for 20 years.

Eventually, artificial intelligence could help patients get the care they need faster and more efficiently, he said.

Can AI help reduce suicide risk?

OM1's AI model analyzes thousands of patient records and uses "sophisticated medical language models" to identify which individuals have expressed suicidal tendencies or actually attempted suicide, Dr. Marci said. 

"We can look at all of our data and begin to build models to predict who is at risk for suicidal ideation," he said. "One approach would be to look for particular outcomes — in this case, suicide — and see if we can use AI to do a better job of identifying patients at risk and then directing care to them."

In the traditional mental health care model, a patient sees a psychiatrist for depression, anxiety, PTSD, insomnia or another disorder. 

The doctor then makes a treatment recommendation based solely on his or her own experience and what the patient says, Dr. Marci said. 


"Soon, I'll be able to put some information from the chart into a dashboard, which will then generate three ideas that are more likely to be more successful for depression, anxiety or insomnia than my best guess," he told Fox News Digital.

“The computer will be able to compare those parameters that I put into the system for the patient … against 100,000 similar patients.”

In seconds, the doctor would be able to access information to use as a decision-making tool to improve patient outcomes, he said. 

‘Filling the gap’ in mental health care

When patients are in the mental health system for many months or years, it's important for doctors to be able to track how their disease is progressing, something real-world data doesn't always capture, Dr. Marci noted.

Doctors need to be able to track how a patient's disease is progressing, which real-world data doesn't always capture, said Dr. Marci of Boston.  (iStock)

"The ability to use computers, AI and data science to do a clinical assessment of the chart without the patient answering any questions or the clinician being burdened fills in a lot of gaps," he told Fox News Digital.

“We can then begin to apply other models to look and see who’s responding to treatment, what types of treatment they’re responding to and whether they’re getting the care they need,” he added.

Benefits and risks of ChatGPT in mental health care

With increasing mental health challenges and the widespread shortage of mental health providers, Dr. Marci said he believes doctors will start using ChatGPT, the AI-based large language model that OpenAI released in 2022, as a "large language model therapist," allowing doctors to interact with patients in a "clinically meaningful way."

Potentially, models such as ChatGPT could serve as an "off-hours" resource for those who need help in the middle of the night or on a weekend, when they can't get to the doctor's office, "because mental health doesn't take a break," Dr. Marci said.

"The opportunity to have continuous care where the patient lives, rather than having to come into an office or get on a Zoom, that is supported by sophisticated models that actually have proven therapeutic value … [is] important," he also said. 

But these models, which are built on both good information and misinformation, are not without risks, the doctor acknowledged.

With increasing mental health challenges in the nation and the widespread shortage of mental health providers, some people believe doctors will start using ChatGPT to interact with patients to "fill gaps." (iStock)

"The most obvious risk is for [these models] to give literally deadly advice … and that would be disastrous," he said.

To minimize these risks, the models would need to filter out misinformation or add checks on the data to remove any potentially harmful advice, said Dr. Marci.

Other providers see potential but urge caution

Dr. Cameron Caswell, an adolescent psychiatrist in Washington, D.C., has seen firsthand the struggle providers face in keeping up with the growing need for mental health care.

"I've talked to people who have been wait-listed for months, can't find anyone that accepts their insurance or aren't able to connect with a professional that meets their specific needs," she told Fox News Digital. 


“They want help, but can’t seem to get it. This only adds to their feelings of hopelessness and despair.”

Even so, Dr. Caswell is skeptical that AI is the answer.

"Programs like ChatGPT are phenomenal at providing information, research, strategies and tools, which can be useful in a pinch," she said. 

“However, technology doesn’t provide what people need the most: empathy and human connection.”

Physicians can use data from AI to track trends in depression, anxiety and other mental health conditions, said Dr. Carl Marci of medical tech company OM1. But another expert said, "Technology doesn't provide what people need the most: empathy and human connection." (iStock)

“While AI can provide positive reminders and prompt calming techniques, I worry that if it’s used to self-diagnose, it will lead to misdiagnosing, mislabeling and mistreating behaviors,” she continued. 

“This is likely to exacerbate problems, not remediate them.”


Dr. Marcum of Orange County, California, said she sees AI as a helpful tool between sessions, or as a way to offer education about a diagnosis.

"It may also help clinicians with documentation or report writing, which can potentially help free up time to serve more clients throughout the week," she told Fox News Digital.


There are ongoing ethical concerns, however, including privacy, data security and accountability, which still need to be developed further, she said. 

"I think we will definitely see a trend toward the use of AI in treating mental health," said Dr. Marcum.

“But the exact landscape for how it will shape the field has yet to be determined.”
