

A Human Therapist's Take on Facebook's Suicide-Prevention Tool

"The issue brought to my mind Ned, a 30-year-old compulsive Facebook user whom I have been seeing for several years. I have become concerned for the first time that he could kill himself."

Last week Facebook announced that it was taking steps to address the spate of suicides that have occurred on its site, especially on its streaming feature, Facebook Live. Social media platforms have apparently become seductive tools for individuals to communicate their suffering and suicidal intentions, or even to broadcast the ghastly acts themselves. Incidents on Facebook Live in the last two months have included a 14-year-old girl in Florida and a 12-year-old girl in Georgia who hanged themselves, and a 33-year-old man in California who shot himself.


The phenomenon is posing important questions for Facebook, as well as for its users and their families. Is Facebook simply a community like any other, in which self-destructive and suicidal behavior takes place? Or is there something about this particular community, and perhaps about Facebook Live, that fosters it? What can Facebook do about it, and how can it bring safety to the user experience?

In response to the problem, Facebook is expanding suicide-prevention initiatives that have been in place since 2016. At the time, it introduced features that allowed users to flag concerning posts, prompting Facebook to reach out to members in distress. Based on the posts that were flagged, the company has now developed AI algorithms to identify concerning content. After review by a community operations team, Facebook will suggest that the at-risk user contact a friend, even offering a text introduction. Users can also chat directly with various organizations, including the National Suicide Prevention Lifeline. Users of Facebook Live are a group of particular focus.

Continue reading on Tonic.