Suicide is a significant public health issue in the United States. Artificial intelligence (AI) and predictive analytics are promising means of identifying individuals who may be contemplating taking their own lives. So what are some current examples of AI and predictive analytics being used in suicide prevention?
One innovation in this exciting area is the Department of Veterans Affairs’ (VA) Recovery Engagement and Coordination for Health - Veterans Enhanced Treatment (REACH VET) initiative, which the VA says uses a “new ‘predictive model’ to analyze existing data from Veterans’ health records to identify those at a statistically elevated risk for suicide, hospitalization, illness or other adverse outcomes.”[1] While this does not explicitly state that REACH VET uses AI (i.e., machine learning data science methodologies), the language the VA uses to describe the initiative suggests it may be leveraging them. One of the most challenging aspects of preventing suicide in the Veteran population is sorting out which Veterans are most at risk, and REACH VET appears to be a significant innovation poised to help address this challenge, at least for Veterans who receive services from one or more VA facilities. Suicide prevention is VA Secretary David Shulkin, M.D.’s number one clinical priority, and REACH VET underscores the VA’s commitment to solving this problem. The VA says the risk of suicide is 21% greater for Veterans than for non-Veteran U.S. adult civilians.[2] In an April 2017 VA press release, the National Director of the VA’s Office for Suicide Prevention, Caitlin Thompson, Ph.D., said “REACH VET is a game changer in our effort to reduce Veteran suicide...Early intervention can lead to better recovery outcomes, lessen the likelihood of challenges becoming crises and reduce the stress that Veterans and their loved ones face.”[1]
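While the VA has not published implementation details, a risk-stratification step of the kind described might look something like the following minimal Python sketch, which flags the records with the highest model-predicted risk for clinician review. The column names, synthetic scores, and cutoff are illustrative assumptions on my part, not REACH VET’s actual design.

```python
# Hypothetical sketch: flagging the highest predicted-risk tier from the
# output of a predictive model. Column names, scores, and the cutoff
# fraction are invented for illustration only.
import pandas as pd

def flag_top_risk_tier(scores: pd.DataFrame, tier_fraction: float = 0.001) -> pd.DataFrame:
    """Return the records whose predicted risk falls in the top `tier_fraction`."""
    cutoff = scores["predicted_risk"].quantile(1 - tier_fraction)
    return scores[scores["predicted_risk"] >= cutoff]

# 'scores' stands in for one row per patient with a model-generated
# probability of an adverse outcome over some fixed time horizon.
scores = pd.DataFrame({
    "patient_id": range(1, 10001),
    "predicted_risk": [i / 10000 for i in range(10000)],
})
flagged = flag_top_risk_tier(scores)
print(f"{len(flagged)} patients flagged for clinician review")
```

The key design idea is that the model does not make decisions; it narrows a large population down to a short list that human clinicians then review.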
The U.S. Army, in partnership with the National Institute of Mental Health (NIMH), has also been a trailblazer in this important area with its Study to Assess Risk and Resilience in Servicemembers, or Army STARRS, project. The suicide rate in the U.S. Army began rising in the early 2000s, and by 2008 it exceeded the demographically matched civilian rate (20.2 suicide deaths per 100,000 vs. 19.2).[3] It was this uptick in suicide incidence among U.S. Army servicemembers that led to the collaboration between the Army and NIMH.[3] STARRS employs machine learning-based methodologies (regression trees and penalized regressions).[4] Because the U.S. Army maintains a large, rich data set on its servicemembers, and because each servicemember’s Army identification number (military ID number) functions much like an Enterprise Master Patient Index (EMPI) identifier that frequently gets captured in non-Army databases, U.S. Army servicemembers constitute an especially well-suited population within which to examine the potential merits of data science in predicting suicide risk. This is because supervised machine learning approaches require large data sets with known outcomes (in this case both soldiers who died by suicide and, importantly, soldiers who did not) in order to ‘train’ and then validate the ‘trained’ models. Ronald Kessler, Ph.D., STARRS administrative principal investigator and Harvard University Professor, described it this way: “A single soldier has a single ID number that can be used to link medical data with criminal justice data, human services data, school data and employee data...These data have an enormous richness that we can use to help target soldiers at high suicide risk.”[5] In the 2015 STARRS study published in JAMA Psychiatry, the investigators examined 53,769 active duty soldiers with the objective of predicting their risk of suicide in the 12 months following discharge from an acute inpatient hospital stay. The hospitalizations of patients included in the study occurred between 2004 and 2009, and the inputs to the predictive models were administrative data available before hospital discharge, abstracted from a wide range of data systems (sociodemographic, U.S. Army career, criminal justice, and medical or pharmacy).[4] The trained STARRS machine learning models are reported to have predicted suicide with a sensitivity and specificity of nearly 70 percent.[6] In other words, the false positive and false negative rates are fairly reasonable, but the predictions do not equate to certainty. Among the factors analyzed in the 2015 paper, the investigators found that male sex, age at enlistment greater than 26 years, criminal offenses, weapons possession, and prior suicidal ideation were among the strongest predictors.[4]
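To make the supervised learning workflow just described concrete, here is a minimal Python sketch: a penalized (L1-regularized) logistic regression, one of the model families the STARRS team reports using, is trained on labeled data and then evaluated for sensitivity and specificity on a held-out set. The features, synthetic data, and effect sizes are all invented for illustration; this is not the STARRS code or dataset.

```python
# Minimal sketch of the supervised-learning workflow: train a penalized
# (L1-regularized) logistic regression on labeled administrative data,
# then check sensitivity and specificity on held-out cases. All features
# and data below are synthetic stand-ins, not the STARRS dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 50_000
# Invented binary administrative features (e.g., flags loosely analogous
# to sex, late enlistment, prior offenses, prior suicidal ideation).
X = rng.integers(0, 2, size=(n, 4)).astype(float)
# Synthetic rare-event outcome, loosely correlated with the features.
logits = -6 + X @ np.array([0.8, 0.6, 0.9, 1.2])
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(penalty="l1", solver="liblinear", class_weight="balanced")
model.fit(X_train, y_train)

# Sensitivity = true positive rate; specificity = true negative rate.
tn, fp, fn, tp = confusion_matrix(y_test, model.predict(X_test)).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```

Note the `class_weight="balanced"` setting: suicide is a rare outcome, so without some correction for class imbalance, a naive model would simply predict “no event” for everyone and look deceptively accurate.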
While these are only two examples, both lend credence to the idea that artificial intelligence and predictive analytics may indeed prove to be extremely valuable adjuncts to clinical assessment. Given STARRS’ sensitivity and specificity profile, it may provide value both in screening (identifying who needs clinical assessment) and in confirmatory assessment (providing additional information that may tip the scales one way or the other when the clinical assessment is borderline or equivocal; i.e., rule in or rule out). What is particularly promising and notable about the Army’s STARRS is that the investigators have demonstrated that it is possible, simply by using a soldier’s military ID number, to capture and leverage data across the boundaries of multiple unaffiliated entities. In other words, they have demonstrated a potentially workable means of overcoming the limitation presented by disparate data domains, or so-called data silos. That said, the focus of the analysis was not the feasibility of automating this data collection and deploying the approach at scale in a ‘production’ environment. Nonetheless, the approach opens the door to addressing one of the biggest challenges in preventing suicide: how to identify individuals at risk for suicide who have not received healthcare services within a given healthcare entity, such as Veterans who do not obtain care at the VA. The possibility that an organization like the VA could simply use an Army Veteran’s unique military ID to reliably obtain data with significant predictive value for suicide risk, even though this data originates from a variety of unaffiliated entities, holds great promise.
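To show why a single shared identifier matters so much here, the following hypothetical Python sketch links records from several unaffiliated source systems on a military ID-style key. The tables, column names, and values are all invented; the point is only the join pattern itself.

```python
# Hypothetical sketch of identifier-based record linkage across data silos,
# analogous to the military-ID linkage the STARRS work demonstrates.
# Tables, columns, and values are invented for illustration.
import pandas as pd

medical = pd.DataFrame({"military_id": [101, 102], "last_discharge": ["2009-03-01", "2008-11-15"]})
justice = pd.DataFrame({"military_id": [102], "offense_count": [2]})
hr      = pd.DataFrame({"military_id": [101, 102], "age_at_enlistment": [27, 19]})

# Left joins on the shared identifier assemble one feature row per person,
# even though each source system knows nothing about the others.
linked = (
    medical.merge(justice, on="military_id", how="left")
           .merge(hr, on="military_id", how="left")
)
linked["offense_count"] = linked["offense_count"].fillna(0)
print(linked)
```

Without a common key, this step requires error-prone probabilistic matching on names and birth dates; a stable identifier collapses it into a simple, reliable join.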
For some of you, this thought may have crossed your mind as you have been reading along: why don’t the VA REACH VET team and the U.S. Army STARRS investigators work together? The exciting thing is that this is exactly what is happening. According to an April 2017 article in the San Diego Union-Tribune, Dr. Kessler is doing some work with the VA on its REACH VET models.[7] With these bright minds bringing their collective wisdom and experience to bear on taking REACH VET to the next level, the future is undoubtedly very exciting!
The next frontier in suicide prevention may well be problem solving around how to scale the approach used by STARRS (analytics that can use data from disparate sources and entities) in combination with the REACH VET analytics. At minimum, this would entail figuring out how to operationalize a process that i) pulls together data from disparate sources such as hospitals, criminal justice, social services, schools, and employer databases; ii) matches it to a given individual; iii) determines the individual’s risk for suicide; iv) gets the results into the hands of primary care providers, psychiatrists, psychologists, social workers, and case managers, such as those at the VA or another healthcare provider organization, who can assess the situation and reach out to the individual if necessary; and v) does all of this in real time or near real time (a simple sketch of such a process appears below). If this can be orchestrated (i.e., if the challenges of interoperability, electronic data exchange / health information exchange, data governance, privacy, etc. can be addressed), it means that someday the VA, to use one example, may be able to reach out to a Veteran who may be contemplating taking their life, even if that Veteran has never set foot in a VA facility. This would obviously be a game changer and could prove to be an invaluable methodology to help healthcare provider organizations like the VA stem the tide and reduce the number of Veterans we lose each day to suicide.
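As a rough illustration of steps i through v, here is a hypothetical Python sketch of such a pipeline. Every function is a stub standing in for a real integration, all names are invented, and the sketch deliberately omits the interoperability, privacy, and data-governance machinery a real deployment would require.

```python
# Back-of-the-envelope sketch of the five-step process outlined above.
# Every function is a hypothetical stub; a real system would also need
# consent, privacy, and data-governance controls this sketch omits.
from dataclasses import dataclass

@dataclass
class RiskResult:
    person_id: str
    risk_score: float
    needs_outreach: bool

def pull_source_records(person_id: str) -> dict:
    # Step i: gather records from hospital, criminal justice, social
    # services, school, and employer systems (stubbed with static data).
    return {"medical": {"recent_discharge": True}, "justice": {"offenses": 1}}

def match_records(records: dict) -> dict:
    # Step ii: resolve every record to one individual (EMPI-style matching)
    # and flatten the linked records into a single feature dictionary.
    return {key: val for source in records.values() for key, val in source.items()}

def score_risk(features: dict) -> float:
    # Step iii: run the trained predictive model (stubbed as a toy rule).
    return 0.95 if features.get("recent_discharge") else 0.05

def notify_care_team(result: RiskResult) -> None:
    # Step iv: route the result to the providers who can assess and reach out.
    print(f"Outreach flagged for {result.person_id} (score={result.risk_score})")

def run_pipeline(person_id: str, threshold: float = 0.9) -> RiskResult:
    # Step v: run end to end, ideally triggered in near real time as new
    # data about the individual arrives from any source system.
    features = match_records(pull_source_records(person_id))
    score = score_risk(features)
    result = RiskResult(person_id, score, score >= threshold)
    if result.needs_outreach:
        notify_care_team(result)
    return result

run_pipeline("veteran-0001")
```

The hard part, of course, is not the orchestration logic shown here but everything the stubs hide: real-world data exchange, identity matching across entities, and the governance needed to do all of it lawfully and ethically.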