Facebook has begun using artificial intelligence to identify members who may be at risk of killing themselves.
The social network has developed algorithms that spot warning signs in users’ posts and the comments their friends leave in response.
After confirmation by Facebook’s human review team, the company contacts those thought to be at risk of self-harm to suggest ways they can seek help.
A suicide helpline chief said the move was “not just helpful but critical”. The tool is being tested only in the US at present.
It marks the first use of AI technology to review messages on the network. Founder Mark Zuckerberg announced last month that he also hoped to use algorithms to identify posts by terrorists among other concerning content.
Facebook also announced new ways to tackle suicidal behaviour on its Facebook Live broadcast tool and has partnered with several US mental health organisations to let vulnerable users contact them via its Messenger platform.
Pattern recognition
Facebook has offered advice to users thought to be at risk of suicide for years, but until now it had relied on other users to bring the matter to its attention by clicking on a post’s report button.
It has now developed pattern-recognition algorithms that can detect when someone may be struggling, trained on examples of posts that have previously been flagged as concerning.
Talk of sadness and pain, for example, would be one signal. Responses from friends with phrases such as “Are you OK?” or “I’m worried about you,” would be another.
Once a post has been identified, it is sent for rapid review to the network’s community operations team.
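Facebook has not published details of how its algorithms work. Purely as an illustration of the kind of signals described above, the sketch below shows how a post's text and friends' comments might feed a simple text classifier, with high-scoring posts routed to human review. The training examples, phrases and threshold are hypothetical, not Facebook's actual system.

```python
# Hypothetical illustration only: a toy classifier trained on examples of
# posts previously flagged as concerning, combining the post text with
# friends' comments, and routing anything above a threshold to human review.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, made-up training set: 1 = previously flagged as concerning, 0 = not.
examples = [
    ("I can't take this pain anymore", 1),
    ("Feeling so alone, nothing matters", 1),
    ("Great day at the beach with friends!", 0),
    ("Just finished a new recipe, so proud", 0),
]
texts, labels = zip(*examples)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Replies from friends are treated as an extra signal: here they are simply
# appended to the post text before scoring, plus a phrase check.
CONCERNED_REPLIES = ("are you ok", "i'm worried about you")

def risk_score(post: str, comments: list[str]) -> float:
    """Probability that the combined text resembles previously flagged posts."""
    text = post + " " + " ".join(comments)
    return float(model.predict_proba([text])[0][1])

def send_for_review(post: str, comments: list[str], threshold: float = 0.5) -> bool:
    """Return True if the post should be sent for rapid human review."""
    worried_friend = any(
        phrase in c.lower() for c in comments for phrase in CONCERNED_REPLIES
    )
    return risk_score(post, comments) >= threshold or worried_friend

# Example: a post expressing sadness plus a concerned reply from a friend.
print(send_for_review("I just feel so much sadness", ["Are you OK? I'm worried about you"]))
```

In this sketch the classifier only raises a flag; the decision to contact the user still rests with the human review team, as the article describes.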
“We know that speed is critical when things are urgent,” Facebook product manager Vanessa Callison-Burch told the BBC.
The director of the US National Suicide Prevention Lifeline praised the effort, but said he hoped Facebook would eventually do more than give advice, by also contacting those who could help.
“It’s something that we have been discussing with Facebook,” said Dr John Draper.
“The more we can mobilise the support network of an individual in distress to help them, the more likely they are to get help.
“The question is how we can do that in a way that doesn’t feel invasive.

“I would say, though, that what they are now offering is a huge step forward.”

Ms Callison-Burch acknowledged that contact from friends or family was typically more effective than a message from Facebook, but added that it would not always be appropriate for the company to inform them.
“We’re sensitive to privacy and I think we don’t always know the personal dynamics between people and their friends in that way, so we’re trying to do something that offers support and options,” she said.
The latest effort to help Facebook Live users follows the death of a 14-year-old girl in Miami, who livestreamed her suicide on the platform in January.
However, the company said it had already begun work on its new tools before the tragedy.
The goal is to help at-risk users while they are broadcasting, rather than waiting until their completed video has been reviewed some time later.

Now, when someone watching the stream clicks a menu option to declare they are concerned, Facebook displays advice to the viewer about ways they can support the broadcaster.
The stream is also flagged for immediate review by Facebook’s own team, who then overlay a message with their own suggestions if appropriate.
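The reporting flow described above can be summarised in a short sketch; the class and function names below are hypothetical and do not reflect Facebook's actual implementation.

```python
# Hypothetical sketch of the Facebook Live reporting flow described above.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LiveStream:
    broadcaster: str
    flagged_for_review: bool = False
    overlay_message: Optional[str] = None

# Advice shown to the concerned viewer (illustrative wording only).
VIEWER_ADVICE = [
    "Reach out to the broadcaster directly",
    "Share a crisis helpline with them",
]

def viewer_reports_concern(stream: LiveStream) -> List[str]:
    """Viewer picks the 'concerned' menu option: the stream stays live,
    is flagged for immediate human review, and the viewer is shown ways
    they can support the broadcaster."""
    stream.flagged_for_review = True
    return VIEWER_ADVICE

def review_team_responds(stream: LiveStream, appropriate: bool) -> None:
    """Facebook's team can overlay support suggestions on the broadcast
    if they judge it appropriate; the stream is not cut off."""
    if appropriate:
        stream.overlay_message = "Support and resources are available to you."

# Example run-through of the flow.
stream = LiveStream(broadcaster="example_user")
advice = viewer_reports_concern(stream)
review_team_responds(stream, appropriate=True)
print(stream.flagged_for_review, stream.overlay_message)
```

The key design point, as the experts quoted below explain, is that reporting never terminates the broadcast; it adds support around it.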
“Some might say we should cut off the stream of the video the moment there is a hint of somebody talking about suicide,” said Jennifer Guadagno, Facebook’s lead researcher on the project.
“But what the experts emphasised was that cutting off the stream too early would remove the opportunity for people to reach out and offer support.

“So, this opens up the ability for friends and family to reach out to a person in distress at the time they may really need it the most.”

The new system is being rolled out worldwide.
A new option to contact a choice of crisis counsellor helplines via Facebook’s Messenger tool, however, is limited to the US for now. Facebook said it needed to check whether other organisations would be able to cope with demand before it expanded the facility.
“Their ongoing and future efforts give me great hope for saving more lives globally from the tragedy of suicide,” said Dr Dan Reidenberg, executive director of Save.org, which is involved in the initiative.
“The opportunity for prevention, even with Facebook Live, is better now than ever before.”