Social media feeds have been awash with impassioned suicide-prevention messages recently, as several high-profile cases have drawn fresh attention to the issue.
Suicide remains one of the leading causes of death globally, claiming 800,000 lives a year. In Singapore it is the leading cause of death among 10- to 29-year-olds. Yet efforts to reduce it have made little progress compared with other preventable causes of death. Online dialogue has succeeded in getting people talking about this stigmatised subject, but it also encourages us to consider the potential role of social and digital technology in suicide prevention.
Opportunities to collect data about people's behaviours online and offline are providing new insights into those at risk, and machine-learning algorithms built to predict suicide are producing powerful results. A programme by the US Army studied 40,000 soldiers who had previously been hospitalised on psychiatric grounds. It found that more than 50% of those who subsequently took their own lives had been ranked by an algorithm in the top 5% of predicted risk.
Elsewhere, a machine-learning algorithm tested at Vanderbilt University Medical Center achieved a 92% success rate in anticipating which patients would attempt to take their own lives within the following week.
The final 10 minutes
According to the American Association for Suicide Prevention, there are approximately 25 failed attempts behind every suicide. Research further suggests that people are actively suicidal for only about an hour, and that half of those who die by suicide make the decision in the final 10 minutes. Could digital intervention in those final minutes be the difference between life and death? And could geolocation data, drawn from telecoms networks, play a role in supporting such intervention?
Mobile devices and wearables offer increasing opportunities to predict and engage those with suicidal thoughts. Facial and speech recognition, for example, provide valuable insights into changes in behaviour that may present warning signs. Our smartphones are already equipped to carry out some form of digital mental health assessment. But should we step in and intervene when we see someone at risk?
Privacy and data protection remain challenges to address in this field. While social media players, such as Facebook, may be better able to predict suicide than close friends based on online behaviours, should they act on their insights? Or should help only be offered if consent has been obtained in advance?
Virtual reality support
One reason people seek privacy in this area relates to one of the biggest challenges for suicide prevention: the taboo associated with mental illness and depression. Digital engagement gives people the opportunity to seek support anonymously. While this provides some level of positive engagement, it is unlikely to replace a psychiatrist or supportive friend. As with many uses of technology in health, it should be considered a tool to augment human care rather than replace it.
That said, many encouraging examples exist, and investment in the digital mental health space is growing year on year, believes Craig DeLarge, Founder of the Digital Mental Health Project. He has been exploring the opportunities for digital technologies to support mental health for some time. “Going back more than a decade, virtual and augmented reality have been used to provide support, and battle stigma, in the space of schizophrenic hallucinations and anxiety. Virtual reality (VR) games, using the right approach, can be facilitators of mental wellness and support prevention,” he says.
“Virtual reality games, using the right approach, can be facilitators of mental wellness and support prevention.”
One of the most remarkable anonymous uses of VR DeLarge has seen is the conduct of mental health support groups in the virtual world. The platform, called SecondLife, allows patients to “anonymously show up as their avatar and engage others from around the world in discussions that are supportive to mental health”, he explains.
In Singapore, Dr Matthew Chua, from the Smart Health Leadership Centre at the Institute of Systems Science, and Dr Joseph Leong, from the Institute of Mental Health, are collaborating to develop a VR therapist bot for mental health patients.
Dr Chua explains that the motivation for the project was to engage suicidal patients suffering from depression. “Appointment adherence for their weekly therapy sessions is low. Patients typically don’t feel like leaving home, and the cost of therapy adds to their reluctance. This may cause them to spiral deeper into depression.”
The VR therapist bot brings treatment into the patient’s home, via mobile devices synced to the hospital’s system. “Patients are ‘transported’ to a scenic landscape, where they meet their therapist avatar-bot, which is trained with counselling skills and highlights key words from the conversation to the therapist monitoring the session. It encourages the patient to communicate, while the therapist can monitor several patients remotely and more frequently”, says Dr Chua about the project, which is in early stages of development.
“Patients are ‘transported’ to a scenic landscape, where they meet their therapist avatar-bot”.
What are the risks?
Unfortunately, while the number of mental health apps is growing, evidence of their impact is not yet convincing. The quality of content and engagement is variable and unregulated. Some digital mental health applications have even been found to offer potentially damaging advice.
In some cases, patients have resisted adoption because they are not aware of the clinical research, “owing to a lack of credibility, and a heightened – real and perceived – risk of harm”, DeLarge says. Given the stigma associated with mental illness, these privacy concerns are valid, especially given the relatively low vigilance exercised in this area presently, he adds.
Progress to date suggests that analytics and digital technologies can predict suicide with greater accuracy than humans alone. We must now focus on gaining traction in leveraging these technologies to prevent this stubbornly significant cause of death.
Tamsin Greulich-Smith is Chief of the Smart Health Leadership Centre at the National University of Singapore’s Institute of Systems Science.