FYI.

Facebook’s New Algorithm Combs Posts to Identify Potentially Suicidal Users

The site needs to strike a delicate balance between privacy and safety, but experts say it’s the right move.

In January, a 14-year-old Miami girl broadcast her suicide on Facebook Live, to the horror of her friends and family. Just three weeks earlier, a 12-year-old in Georgia had done the same thing on a site called Live.me, and that video began circulating on Facebook. Multiple other suicides have been preceded by a goodbye post on Facebook.

While Facebook has long had protocols in place to identify and reach out to potentially suicidal users, it recently upped the ante. This week, Facebook announced beefed-up suicide prevention tools that use algorithms to scan posts for potentially suicidal language (posts about sadness, pain, and not being able to take it anymore, for example) and to take note of comments that may signal a problem, such as "are you okay?" and "I'm here for you." When a post is flagged, both the user and his or her friends are offered a resources page, with options to message crisis hotlines over Facebook Messenger and tips for reaching out to friends. Facebook has also integrated these tools into Facebook Live, specifically, for the first time. Read more on Motherboard
