When Facebook and Instagram Think You're Depressed

Ads for wellness apps and bipolar treatments start to appear in your feed.
Kari Paul
illustrated by Lia Kantrowitz

In spring 2016, I had a depressive spiral that drove me just short of a mental breakdown. Today—more than a year, three doctors, two antidepressants, and one therapist later—Facebook won't let me forget it.

Mental illness has been a part of my life as long as I can remember—so long, in fact, that I thought it was normal. I had my first depressive episode in high school and initially got help for it in college. Since then, I have been diagnosed with Major Depressive Disorder, Generalized Anxiety Disorder, and Premenstrual Dysphoric Disorder (with a few misdiagnoses along the way), but for years I didn't fully believe in any of these illnesses. After briefly trying Zoloft and not seeing a huge improvement, I decided the mood swings and underlying general feeling of deep sadness were just parts of my personality I should embrace.

I went "natural" my senior year of college: off my SSRIs, birth control, and sleeping pills completely. I took a dive into what I would now liken to slowly drowning without realizing it. Since I never really felt an improvement on Zoloft, waking up to a dull feeling of wanting to die was essentially my neutral state—but that doesn't mean it was normal, or sustainable. I finally hit a wall in March 2016 when professional struggles—combined with my usual seasonal depression, a major injury, and a few personal relationship issues—finally pulled me under. I woke up sobbing every day for weeks. My body ached, and I couldn't leave bed. I had panic attacks when I left the house and felt like strangers were following me.

In short, I felt truly crazy. The paranoia and emptiness finally got to be too much, and I decided to seek help, making a call to a therapist to ask about sliding scale prices for sessions, my voice shaking. It was around then that I started getting my first sponsored ads on Facebook and Instagram for anxiety.

The ads started out innocuously enough: an app for meditation that can also help with social anxiety, likely targeting many people in my demographic and age group. But as I continued to seek treatment, the ads got more specific—and more intense, often reinforcing my own confusion about my conditions as I worked through accepting them myself. At the urging of my therapist, who said she rarely recommends medication but felt I needed it, I finally saw a doctor. Having had negative experiences with Zoloft, I requested a different kind of antidepressant, Wellbutrin, and got a prescription. Coupled with therapy and the nice weather of the summer, the meds started working (or so I thought). I saw mild improvements, felt more in control, started exercising regularly, and got a new job. Still, the depression and anxiety ads persisted, usually at a rate of one or two a week.

Facebook draws on information—including places you check into on Facebook, pages you and your friends like, your browsing history, and information compiled by data companies, such as purchases at retail stores—to make ads "useful and relevant to you." This week, the Australian reported, citing a leaked 23-page document, that the company was targeting depressed teens and that the social network can determine "moments when young people need a confidence boost"; in response, Facebook released a statement explaining that, while it does monitor the emotional state of users, it does not allow advertisers to target them based on that information.

As a journalist who researches everything from internet policy to the stock market to Kylie Jenner, I can't say exactly why I see the ads I see. Around the time I started being served these ads, I interviewed the author and VICE columnist Melissa Broder about her book. With the increasingly messy web of online data kept on us, it's hard to say whether posting about "So Sad Today" multiple times, googling my own depressive symptoms (and clicking through to related pages), or purchasing a number of mental-health-related vitamins on Amazon led to the onslaught of mental health ads that persists to this day. What can be said is that their effect on my mental health is likely negative.

"Facebook user experience and usage have huge implications for our mental health and well-being," said Emilio Ferrara, an assistant research professor at the USC Department of Computer Science who studies social networks. "The exposure to these messages hasn't been extensively studied because it's challenging for researchers to access what ads people experience and are served. Companies are reluctant to share this information because the secret sauce is their business model."

According to my own therapist, the ads are likely making me more depressed—especially as they get more specific. As winter set in last year, my depression was nearly back to full force, and I realized my meds might not have been working as well as I had thought. With a new job and new insurance, I finally sought help from an actual psychiatric professional, and it made a world of difference. She switched me to Effexor, which involved an exhausting process of slowly tapering off Wellbutrin and suffering through a few weeks of withdrawal, throughout which Facebook was there to tell me I was mentally ill. This time, I was bipolar. (See the ad in the timeline below for April 22, 2017.)

The timeline of some of the ads appearing in the author's Facebook and Instagram feeds. Image by Lia Kantrowitz.

If the Australian's report is to be believed, it's possible Facebook knows I'm depressed from my online behavior and is preying on it to sell ad space, but the company has repeatedly denied that this tool is accessible to advertisers. Facebook also says the content of what we post on the site or message to friends is not factored into the ads we see. Caroline Sinders, a machine-learning designer, said it is more likely that I clicked through an ad on Facebook or Instagram and that, noting its success, advertisers started showing me more of the same. Either way, Sinders said, the company has an ethical obligation to allow users to opt out.

"Having these ads go from meditation apps to 'are you bipolar?' is really dangerous and invasive," she said. "While algorithmically they may seem related to what was served up before, there is a lot of harm in the causal effects of how these things manifest." In fact, it's also against Facebook advertising policy, which does not allow ads to "assert or imply personal attributes," including medical illnesses. After being alerted to these advertisers this week, Facebook removed ads from companies Joyable and askaboutbipolardepression.

In December, Facebook also started testing a feature that allows users to block sensitive topics from their feeds. For now, it only lets users filter content about alcohol, a change prompted by complaints from recovering alcoholics, but the company is taking suggestions from users for a wider array of themes that could be triggering.

"We're always working to give people a better ads experience on Facebook and Instagram," a spokesman from Facebook told me. "We think letting support groups and health-related organizations reach people with important topics they might be interested in can be a good experience. However, we know we don't always get this right. People can hide and report certain ads they don't want to see, and we have policies in place to try to prevent bad advertising experiences."

As I work to straighten out what ads I see, it feels like the algorithm is undermining the progress I feel I've made. But as Sinders notes, I have essentially taught it to do so—and it's difficult for programmers to predict outcomes like this or fix them until they actually happen.

"There is no preventative care for harmful outcomes in machine learning," she said. "You just wait and hope and you can design out biases."

I've spent my entire life being depressed, and despite all the help I've gotten and the progress I've made in the past four months, often even I forget that it's OK to not feel like garbage all the time—how could I expect a machine to come to the same conclusion so quickly?

As I left a recent appointment, I heard my psychiatrist—a Russian woman in her 70s who once made me print out an article to read because she "doesn't do the internet"—shout at me to "stay off the Twitter and Facebook."

"I'm serious—that stuff is terrible for your mental health," she said. "People take what they are feeling and project it onto what they see online." I couldn't begin to tell her how right she was.

Follow Kari Paul on Twitter.