Paul Yip says those at risk of suicide often leave clues of their intentions on social media, so the use of AI to track patterns in their language has the potential to save lives, and more can be achieved by sharing results
Facebook is using artificial intelligence to raise awareness of suicide risk through early detection and intervention. Young people today often experience alienation and isolation, especially those spending too much time on social media. Studies have found that the more time young people spend on social media, the more likely they are to report mental health problems.
But rather than blame young people for spending too much time online, perhaps we should ask them whether there are better alternatives, or whether we have created an environment in which they prefer the virtual world to the real one. When they need help, they choose not to seek out friends and family but, rather, express distress on social media. One recent study found that about 15 per cent of students who died by suicide had posted messages containing suicidal thoughts on social media.
Unfortunately, such messages sometimes go ignored or undetected. It is also difficult to obtain timely data on suicide. Our centre is therefore working with stakeholders, including community members, service agencies, government departments, technology companies and social media platforms, to develop an early warning system that monitors suicidal posts and devises strategies to help identify and reach at-risk youth. One of our latest initiatives is a 24-hour, text-based online emotional support platform for young people, funded by the Hong Kong Jockey Club Charities Trust.
We advocate the use of online data in suicide prevention. Social media plays an indispensable role in this cause, and AI can as well. We have explored the application of machine-learning methods and developed new models to detect and classify blog and online forum posts that present suicide risks. Developing a responsive system takes time, and linguistic differences add to the challenge, especially where Chinese and English are mixed within messages. It is therefore crucial to establish and maintain close partnerships with local service agencies, frontline social workers and counsellors, who can provide insights and feedback during testing and validation.
We have identified patterns in the language used in suicide notes. The results of our preliminary studies are promising and demonstrate the potential of AI to detect suicide risk. Globally, nearly 800,000 people kill themselves every year. Suicide is the leading cause of death among youth in Hong Kong, and the second leading cause of death among young people worldwide. More than 60 per cent of these deaths occur in Asia, yet less than 10 per cent of research resources are spent in this region.
The recent spike in youth suicides in Hong Kong has attracted widespread attention and increased community and government commitment to preventing more such deaths. With concerted efforts from schools and communities, promising improvements have been made, and these measures need to be sustained.
The opportunity for intervention is huge. In this social-media-driven era, the Facebook initiative and those by other stakeholders are crucial. We would therefore like to appeal to Facebook to share the results of its suicide prevention efforts. A closer partnership can generate and identify better ideas to improve the accuracy and precision of suicide-risk identification and intervention.
We believe social media is a double-edged sword: a good servant but a bad master. Given the vulnerability of at-risk youth, it is essential that we address suicide prevention sensitively and work collaboratively to achieve the best possible outcome.
Paul Yip is director of the Centre for Suicide Research and Prevention at the University of Hong Kong