
Censoring Self-Harm on Facebook Might Do More Harm Than Good

Social media companies are grappling with how to identify and censor forms of self-harm beyond suicide, such as cutting.
Image: Shutterstock

This article covers self-harm and includes images of self-injury scars.

Mason Marks is a research scholar at the Information Law Institute at NYU and a visiting fellow at the Information Society Project at Yale Law School. Find him on Twitter @MasonMarksMD.

For years, social media companies have used artificial intelligence to predict and prevent suicide. Now they are grappling with how to identify and censor other types of self-harm such as cutting, which is part of a condition called non-suicidal self-injury disorder (NSSID).

On February 7, Facebook and Instagram banned graphic images of self-injury, and they promised to censor images of healed or fading scars by removing them from search results. The father of British teen Molly Russell prompted these changes in January when he claimed Instagram contributed to his daughter’s 2017 suicide by allowing her to view graphic images of self-harm. His allegations made international news and drew the attention of British officials who threatened social media platforms with regulation. In response, and out of concern that children like Molly might be influenced to imitate self-injury, Facebook and Instagram issued their new self-harm policies.

Some research suggests that censoring images of active self-harm may be warranted, but the evidence is not conclusive. Moreover, experts interviewed for this article agreed that censoring healed scars is likely an excessive measure, and it may cause harm. Though well-intentioned, such policies could exacerbate the stigma felt by people with NSSID and contribute to their social isolation. Even worse, they may reinforce the very emotions, such as shame, that might drive people to harm themselves, according to Seth Axelrod, a psychiatry professor at Yale who specializes in NSSID and related disorders. Finally, by using AI to censor search results and blur the scars of people with NSSID, Facebook and Instagram are practicing algorithmic disability discrimination, a process I describe in one chapter of a forthcoming book on redefining disability for the 21st century. Algorithmic disability discrimination, such as the systematic censorship of people with NSSID, could violate Title III of the Americans with Disabilities Act (ADA).

Non-suicidal self-injury is surprisingly common and mostly affects teens and young adults. US lifetime prevalence is about 15–20 percent, which means up to one in five Americans may engage in self-harm at some point in their lives. According to Axelrod, people often engage in self-harm to cope with emotions and life experiences that feel unbearable. Though maladaptive, the behaviors can temporarily dull emotional pain. Despite its high prevalence, NSSID has only provisional status in the DSM-5, the standard diagnostic manual of psychiatry. Nevertheless, the condition can be severely disabling.

Because NSSID is so heavily stigmatized by society, “for many individuals, the internet serves as a preferred medium to communicate about self-injury,” Stephen Lewis, a psychology professor at the University of Guelph, told me in an email interview. Lewis is the president of the International Society for the Study of Self-Injury (ISSS), and his research focuses on NSSID and how people discuss it online. “Banning imagery related to scars (or related hashtags) may take away from a needed means to share one’s story (perhaps how one has overcome self-injury) and thus further perpetuate stigma and marginalization,” he said.

Dr. Axelrod agreed. He runs a center that treats people with NSSID through group therapy. “Some participants have their whole arms covered with healed scars, and at least in our practice, we don’t ask people to cover up their scars and ask group members to tolerate that and receive such people who are in recovery,” he said. In therapy, Axelrod encourages people to speak openly and proudly about their experiences. He says posting images online may be an effective way to do it.

In February, a Twitter user named Sabian Join (@SabianK23) posted an image of her healed scars next to an image of the tattoo that now covers them, accompanied by the inspirational statement “#selfharm to #selflove.” Under Facebook and Instagram’s new policy, the images could still be posted, but they would be blurred behind a content warning and filtered out of search results and Instagram’s Explore feature, a Facebook spokesperson said in an email to Motherboard.

“I believe that people who have or do self-harm should not be censored by media, because posts like mine help show that there is a problem, that people do need help, and for me to share a picture like that took a lot of stress and even self-doubt away,” Join told me in a Twitter direct message. “It made me able to reach out to others like me even if it was maybe one or two people.”

“I would see that [Join's post] as potentially very empowering both for the person showing it and for others who might be inspired by it,” Axelrod said. He emphasized that people with NSSID often experience shame as a prominent emotion, and censoring images of healed or fading scars could “either create shame or at the very least not be consistent with people getting past the shame.”

Regarding images of healed scars on Instagram, he said “these images can’t be searched, these images get buried, and that is a rejection that could validly be interpreted as being ostracized . . . theoretically putting people at greater risk of having trouble controlling these behaviors.”

Marissa Barber, a mental health advocate from Ontario, Canada, who blogs openly about self-harm, was disturbed by Instagram’s policy. Barber says she finds it empowering to see women display their healed scars in public. “I am covered in scars, so seeing other people’s scars in a really pretty dress, it’s really nice to see. So I do think that has a place online,” she said by phone.

Barber suggested that sensitivity screens, which blur images of scars and function as content warnings that viewers can click past, might be an appropriate compromise if the blurred images remained searchable and hashtaggable. Instagram introduced sensitivity screens in 2017 to mask content that people might find objectionable. Earlier this month, Instagram’s Adam Mosseri told the Telegraph his platform would apply sensitivity screens to all images of cutting while still allowing the images to be posted. However, a few days later, Mosseri announced a modified policy in which images of active self-harm “such as cutting” would be banned, and sensitivity screens might be applied to non-graphic images of self-harm “such as healed scars.” A Facebook spokesperson confirmed to Motherboard by email that sensitivity screens will be applied to images of healed scars on Facebook and Instagram.

Though blurring images may have superficial appeal, Barber is concerned that some people might be perpetually blurred online if they have numerous visible scars. Oftentimes the scars’ appearance may be incidental to what the poster is trying to portray. As an example, she wondered whether an image showing off a new bracelet would be censored if it also showed scars on the user’s wrist and forearm. “That’s like blocking out and silencing a whole human being and I hate that,” she said. There are also concerns that censoring images of self-harm may affect other populations, such as injured veterans, people with burn injuries, and people recovering from surgery.

"I am aware of a support group of people who show scars from surgery," said Axelrod. "It’s a support group, and they commonly show pictures of their scars." Facebook and instagram's policy could inadvertently censor these people in addition to those with NSSID.

Dr. Lori Hilt, a psychology professor at Lawrence University, studies self-injury and has written about it extensively. In a phone interview, Hilt said, “I wouldn’t want to censor the image of the healed scar with a tattoo. I could see negative effects of increasing stigma by trying to erase and deny the lived experiences of those who have recovered.”

Instagram’s policy sets a dangerous precedent, and squeamishness should not excuse discrimination. By systematically removing even healed and fading scars from searches, hashtags, and promotions, Instagram and Facebook are censoring people with a recognized medical condition, which may qualify as a federally protected disability. Imagine what would happen if the policy were applied to other groups, such as people with motor disabilities: there would be near-universal outrage, and rightfully so. Yet there has been little or no mainstream outrage in response to Instagram and Facebook’s new policies.

People use Instagram to express and promote themselves. Some users maintain accounts that feature their disabilities. For instance, Kaitlyn Dobrow is a fashion blogger who also happens to be a quadruple amputee. Marimar Quiroa is a beauty blogger who happens to have a rare facial tumor. These bloggers make valuable contributions to the Instagram community, and it would be unjust and discriminatory if either were barred from search results based on a disability. Systematically censoring the images of people with NSSID from searches bars them from benefiting fully from Instagram’s services; it prevents them from promoting themselves, gaining and retaining followers, and potentially even from earning an income if they choose to monetize their posts.

Title III of the Americans with Disabilities Act (ADA) prohibits discrimination against people with disabilities in places of public accommodation, which include private businesses that are generally open to the public. Title III lists twelve categories of businesses that provide public accommodations, such as hotels, restaurants, and movie theaters. As businesses that are generally open to the public, Instagram and Facebook should be categorized as places of public accommodation under the ADA. However, Title III does not mention online businesses because it was adopted in 1990, when the internet was in its infancy. As a result, places of public accommodation under Title III were typically brick-and-mortar businesses, and few online businesses fit that mold.

Federal courts have been divided on the issue of whether internet businesses are covered by Title III. In National Federation of the Blind v. Scribd Inc., an internet-based document repository was deemed a place of public accommodation. The Vermont court reasoned it wouldn’t make sense for people to be treated differently by the law depending on whether they bought an insurance plan from a brick-and-mortar store or a similar policy from an online vendor. However, in Cullen v. Netflix, a Northern California court disagreed, finding that because Netflix’s streaming services are not connected to any “actual, physical place,” Netflix is not subject to the ADA. In National Association of the Deaf v. Netflix, a Massachusetts court reached the opposite conclusion. It said Congress did not intend to limit places of public accommodation to the examples listed in Title III, and excluding companies that deliver services online from the definition would frustrate Congress’s intent that people with disabilities “fully enjoy the goods, services, privileges and advantages, available indiscriminately to other members of the general public.”

The Massachusetts court’s argument is compelling. When Congress adopted the ADA, it could not have envisioned how dependent society would become on internet platforms for commercial and personal communication. Including internet businesses within Title III’s scope, and granting people with disabilities equal access, is consistent with the spirit of the ADA. Facebook and Instagram’s new policy is a good example of why Congress should amend Title III to clearly include internet businesses. Barring an amendment of the ADA, the issue could be resolved through further litigation, potentially at the US Supreme Court.

When asked whether Facebook’s new policy might amount to disability discrimination, a Facebook spokesperson said, “Self-harm is not a mental illness and people who self-harm do not necessarily have a mental illness.” To be fair, NSSID is designated a “condition needing further study” by the DSM-5. That means its diagnostic criteria are subject to change, and further research is needed to support its inclusion as an official diagnosis.

Nevertheless, NSSID’s status as a distinct disorder is supported by clinical research. Regardless, the ADA does not require people to have a DSM-5-recognized mental illness to qualify as disabled. Instead, the law defines disability more functionally, as “a physical or mental impairment that substantially limits one or more major life activities.” Moreover, the World Health Organization defines disability broadly as “an umbrella term, covering impairments, activity limitations, and participation restrictions.” According to the WHO, “disability is thus not just a health problem.” Instead, “it is a complex phenomenon, reflecting the interaction between features of a person’s body and features of the society in which he or she lives.” Through its self-harm policies, Facebook is shaping the contours of society in ways that may reinforce and contribute to people’s impairments and degree of disability.

Topics such as suicide and self-harm can be difficult to talk about, and it may be more comfortable to sweep them under the rug and pretend they don’t exist. But relegating these conditions to the far reaches of the internet is unlikely to help people and will likely make things worse. We must talk openly about these conditions because they affect millions of Americans who deserve to be treated fairly and given equal opportunity to participate in all aspects of society, including social media.

If you are in need of help with a mental health issue, you can visit SAMHSA’s National Helpline’s official website for treatment information or call 1-800-662-HELP (4357) for confidential help.