Post Mortem

The Internet Is Still Teaching People How to Kill Themselves

Search engine algorithms and laws have made it harder to find suicide-related content online, but explicit instructions are still just a quick Google search away.

Image by Flickr user Mike Licht

On March 23, Germanwings co-pilot Andreas Lubitz safely flew an empty jet from Düsseldorf, Germany, to Berlin. He then went home, where he used his iPad to search for terms like "living will," "suffering," and "dying." The next day, Lubitz deliberately flew an Airbus A320 carrying 149 other people into the French Alps, killing everyone on board.

This was not the first time the 27-year-old Lubitz had used the internet to look up suicide-related content. Information later gleaned by prosecutors showed that while on medical leave from the airline earlier in March, Lubitz had looked up other, more specific ways to kill himself that didn't involve flying a passenger aircraft into a mountain.

Worldwide, online content relating to suicide is not only controversial but often banned outright. In 2013, Russia started blocking sites that provided explicitly detailed instructions for how to die, as part of a new law ostensibly aimed at protecting children. That same year, the United Kingdom required Internet Service Providers (ISPs) to block suicide-related content as part of the country's broader "porn filter" legislation. Australia's crackdown began with the Suicide Related Material Offences Act 2005, which made it a criminal offense to "directly or indirectly counsel or incite committing or attempting to commit suicide" via the internet, though foreign websites are not blocked under the law. South Korea and Japan also enforce filtering by ISPs.

These laws aim to minimize suicide contagion, a documented phenomenon with very real public health consequences. But legislating the internet is tricky, and the restrictions often sweep up harmless sites. In Russia, YouTube went to court to argue that a blocked video showing how to make a fake wound was not the kind of content lawmakers intended to target. In the UK, the filters have even blocked suicide prevention and other helpline websites.

Despite legislative attempts to restrict suicide-related content online, search engines like Google, Bing, and Yahoo don't restrict what keywords someone can enter. Indeed, they generally suggest additional keywords to help users narrow their searches. It's not hard to see how this might become problematic if left unchecked. If enough people look up specific methods or considerations for suicide, like ways to avoid pain or detection, the algorithm registers those searches as popular and suggests them to people typing more general queries. The consequence: the engine can plant ideas about suicide before the user has even clicked on a search result.

In the case of Google, the company does appear to make an effort to curb this effect for suicide-related keywords. According to its site, the autocomplete feature (which can't be turned off) excludes a "small set of search terms" that would otherwise appear as search predictions. There is also a link to report "offensive predictions." A 2014 version of the same page explicitly mentions suicide, alongside better-known examples like pornography and hate speech. When I Googled common keywords relating to suicide methods, the autocomplete predictions didn't show up. But once I hit "enter" on my suicide-related search term, I found that the autocomplete predictions did appear when I left my cursor in the search bar.
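
To make the mechanism concrete, here is a minimal sketch of popularity-ranked autocomplete with an exclusion list. Everything in it is hypothetical: the query log, the frequencies, and the exclusion set are invented for illustration, and neither the data structures nor the matching logic reflect Google's actual system.

```python
# Hypothetical sketch: popularity-ranked autocomplete with an exclusion list.
# The query log, frequencies, and exclusion set below are invented; a real
# engine aggregates billions of searches and documents only that an
# exclusion list exists, not its contents.

QUERY_LOG = {
    "how to tie a tie": 90_000,
    "how to train your dragon": 75_000,
    "how to take a screenshot": 60_000,
}

EXCLUDED = {"some excluded phrase"}  # stand-in for the "small set" of banned terms

def suggest(prefix: str, limit: int = 5) -> list[str]:
    """Return the most frequent logged queries starting with `prefix`,
    skipping anything on the exclusion list."""
    candidates = [
        (freq, query)
        for query, freq in QUERY_LOG.items()
        if query.startswith(prefix.lower()) and query not in EXCLUDED
    ]
    return [query for _, query in sorted(candidates, reverse=True)[:limit]]

print(suggest("how to t"))
# ['how to tie a tie', 'how to train your dragon', 'how to take a screenshot']
```

The point of the sketch is the feedback loop: whatever users search most becomes what the engine suggests next, unless a term is explicitly excluded.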

Microsoft's Bing, which also powers Yahoo's search, was much less restrictive. The company lists what type of content it restricts in its search results, but its page on search suggestions (similar to Google's autocomplete) doesn't mention anything being excluded. So when I tried the same searches I had typed into Google, Bing brought up specific methods and considerations as search suggestions, indicating that no serious effort had been made to remove them. Yahoo's search engine, on the other hand, seemed to exclude most additional keywords related to suicide methods on the same searches, with the exception of a few news headlines.

What about the actual search results users find when asking about suicide methods? As with any other keyword, these are determined partly by the search engine's relevance algorithms and partly by the user's search history, so not everyone sees identical results. Search engines don't interfere directly with which sites show up in the results for specific keywords (though regular updates to their underlying algorithms can influence what pops up). Instead, for suicide-related terms and other terms like "depression help," they prominently place a PSA in the form of a large box above the search results.

It features the phone number and website for the toll-free, 24-hour National Suicide Prevention Lifeline, which puts people in touch with a trained counselor who can help them find crisis services in their area. In addition to the toll-free number, the site also offers an online chat option. The Substance Abuse and Mental Health Services Administration (SAMHSA), which funds the service, reported about 2,200 calls per day as of 2011. By comparison, approximately 3,000 suicide attempts occur daily. Lifeline claims to have conducted 16,000 life-saving rescues for veterans alone; veterans account for about 20 percent of its callers.
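
As a rough illustration of how such a box might be wired up, here is a hypothetical sketch. The trigger phrases and rendering logic are assumptions invented for illustration; Google has said it uses an algorithm plus locale-specific terminology, not a simple static list like this.

```python
# Hypothetical sketch: pinning a helpline PSA above organic search results.
# The trigger list and substring matching below are assumptions; real
# engines reportedly use an algorithm tuned per country and language.

PSA_BANNER = "[PSA] Need help? National Suicide Prevention Lifeline: 1-800-273-8255"

# Stand-in trigger phrases; a production system would cover far more
# terminology, including locale-specific phrasing.
TRIGGER_PHRASES = ("suicide", "depression help")

def render_page(query: str, organic_results: list[str]) -> list[str]:
    """Prepend the helpline PSA when the query matches a trigger phrase."""
    page = []
    if any(phrase in query.lower() for phrase in TRIGGER_PHRASES):
        page.append(PSA_BANNER)
    page.extend(organic_results)
    return page

print(render_page("depression help near me", ["result 1", "result 2"]))
# ['[PSA] Need help? National Suicide Prevention Lifeline: 1-800-273-8255',
#  'result 1', 'result 2']
```

The weakness of any phrase-matching approach is coverage: as described below, more refined queries often fail to trigger the box at all.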

If call volume is any indicator, the little grey box has been effective. According to a company blog post, the National Suicide Prevention Lifeline saw a 9 percent increase in calls after Google introduced the box in April 2010. Similar helpline PSAs are now in place on local versions of Google in an additional 21 countries, and the company advises partners elsewhere to expect a roughly 10 percent increase in call volume when the feature launches.

The placement of the Lifeline box could probably be expanded to cover more queries: with more refined searches, neither Google nor Bing consistently shows it. According to a company representative I spoke to via email, Google uses an algorithm to determine which terms bring up the Lifeline, and it works with partners on terminology, such as specific techniques particular to a country or culture.

While more specific searches are generally used far less often than broader ones, their results are often considerably more likely to be harmful. A 2014 study by Benedikt Till in the Journal of Clinical Psychiatry examined Google and Yahoo/Bing search results for suicide-related queries in three categories: basic, method-related, and help-related. The researchers then tabulated protective characteristics (like providing alternatives) as well as harmful characteristics (like details of a suicide method). Overall, Till found that protective characteristics on websites outnumbered harmful ones by two to one.

However, this varied significantly with the type of search. For basic searches the ratio was seven to one, and for help-related searches it was as high as 20 to one (in the US versions of the search engines). But for specific, method-related searches (e.g. "the best way to…"), the harmful sometimes outnumbered the protective, with one method showing a protective-to-harmful ratio of one to four. Till recommends that websites with protective characteristics make a more concerted effort to rank higher for these method-related searches by improving their social media presence and even purchasing ads.

Consistent with my earlier tests, when I entered one of these method-related queries into Google, the Lifeline number popped up. But when I selected the more specific autocomplete options, it didn't. On Bing and Yahoo, the searches yielded many more autocomplete suggestions than on Google, and the Lifeline number didn't show up at all. (I'm leaving out the specific terms here so as not to provide a how-to on searching suicide methods.)

Given that precise data on internet searches is closely guarded, it's hard to know exactly how often a specific keyword is typed into a search engine. Google's Trends service shows the general public the relative popularity of two or more terms, but not absolute figures. The only publicly available data set of raw search logs comes from AOL, which released the mostly anonymized logs of 657,000 of its users on its site for research purposes. Paul Wai-Ching Wong's 2013 study in the Journal of Medical Internet Research examined those logs and found that "users generally accessed webpages in the search results that provided entertainment, scientific information, news, and resource information." The researchers also found a small subset of suicide-related searches, about 1 to 2 percent of which concerned specific methods.
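
For a sense of what that public-facing Trends data looks like programmatically, here is a brief sketch using pytrends, an unofficial third-party Python wrapper around Google Trends (Google offers no official API, so the wrapper's interface may change). The two benign help-seeking terms are chosen purely for illustration.

```python
# Sketch using pytrends, an unofficial wrapper around Google Trends.
# Values are relative popularity on a 0-100 scale; Google never exposes
# absolute search counts, as noted above.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(["depression help", "anxiety help"], timeframe="today 5-y")

# interest_over_time() returns a pandas DataFrame indexed by date, with one
# column per term, scaled so the highest point across the compared terms
# equals 100.
df = pytrends.interest_over_time()
print(df.tail())
```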

Rather than censorship, Wong recommends that more attention be paid to the actual online behavior of vulnerable individuals. He cited research by Lucy Biddle, published in the Journal of Affective Disorders, in which researchers interviewed 22 individuals who had attempted suicide between 2004 and 2009 but were no longer suicidal. Of these, five had consciously used the internet for research. One person used a search engine to find the "best" method after previous failed attempts with different methods, though she ended up trying a different method than the one she researched online. Others sought ways to try again using the same method they had used in the past. Suicide-specific sites were cited less frequently than general-interest sites like Wikipedia or news sites.

This kind of thoughtful examination of individual users' browsing history will be essential to developing more effective harm-minimization approaches, which, as of right now, still have a long way to go.

If you are feeling suicidal, visit the website of the National Suicide Prevention Lifeline or call toll-free 1-800-273-TALK (8255) at any time. A listing of similar helplines in other countries can be found here.

Follow Simon Davis on Twitter.