
Do Streaming Services Need Better Trigger Warnings?

After a recent episode of 'Crazy Ex-Girlfriend' aired with no content warning or offer of a helpline, how we frame triggering TV needs addressing.
Hannah Ewens
London, GB
Image: 'Crazy Ex-Girlfriend' / Netflix

Whatever darkness Rachel Bloom was going to pull Crazy Ex-Girlfriend through, she was always going to do it well.

The recent move to explore suicide was intelligently and thoughtfully done – and, on some level, shouldn't have come as a surprise. In the first season of the musical comedy, which streams on Netflix in the UK, references are made to a past suicide attempt and hospitalisation, and throughout this third season Bloom's character Rebecca has been falling out with friends and making increasingly regrettable choices as she feels unhappier and less in control than ever before.


In the first half of the most recent episode, "I Never Want To See Josh Again", we're alerted to her suicidal thoughts. Yet in the second half, when she makes the decision to end her life, it's a tonal shock. There's no imaginary musical number, Rebecca's usual method of working through her anxieties; no snap back to a gag, a device the show otherwise uses in excess. Nothing about the show up to this point would make you think they'd go there – at least, not in this way. When the episode thuds to a halt as Rebecca blacks out (importantly, the writers gave her a change of heart and a last-minute reach for help from a passerby, rather than leaving it on a purely sensationalist cliffhanger), I couldn't help but wonder whether some people would have chosen to avoid the episode had they known its content.

On UK Netflix there was no mention of distressing subject matter in the ironically comforting episode description, no warning at the opening, no offer of helplines at the end.

Rachel Bloom understood the power of this episode to move and trigger, and gave her followers a heads-up on her social media platforms that it would be a difficult one to watch. Someone on Twitter asked how she thought the episode would affect people in terms of their pasts and triggers, and what she had done to combat that. She replied: "We tried to give people the head's up on social media today and the end title card also addresses that." In the US, at least, there was a black card with charity information.


But why the disparity? If this were on the BBC, the episode would have been topped and tailed with warnings and helplines. Some soaps invite mental health charities to advise them on everything from developing storylines, to on-set acting, to appropriate warnings.


The primary issue with streaming services is that they're accessed in countries all over the world, each with its own set of guidelines or rules. Ofcom has no specific rules for these services, just guidelines, which means they are free to make their own editorial decisions over issues such as warnings. Catherine Anderson, Head of Communications at the British Board of Film Classification (BBFC), says Netflix submits its shows and films to the board for classification for a UK audience on a voluntary basis. "Most of their original content, they'll send to us," she says. "So you'll find a mix of their own ratings and BBFC ratings." Whether a show gets an episode-specific content description or a more general one for the series depends on whether it's sent in bulk or episode by episode.

Crazy Ex-Girlfriend was one show not submitted, says Anderson; it was instead given a G rating by Netflix itself, i.e. suitable for general audiences.

"The broadcaster has a responsibility to triggered individuals," says Clare Keeling at mental health charity Rethink Mental Illness, over the phone. "If TV broadcasters have upheld that for years, why can’t streaming services? Seeing the switch to online TV and streaming services, warnings and black cards just aren’t being transferred." This is particularly perplexing given that the discussion about trigger warnings has developed online and been met with relative derision from older generations offline.


Proof that these measures are needed came earlier this year with the Netflix original 13 Reasons Why. The show was enormously popular with teens but slammed by mental health professionals and many critics for glamorising suicide. "A lot of people found that graphic and affecting, and originally it had no warning or topping and tailing," says Keeling. "This was particularly bad, since the depiction of suicide wasn't sensitive. Understandably, there was a big pushback."

This separation of content and responsibility is no doubt due to streaming's role in television now being treated as an elevated art form. Solo Netflix binges are practically a celebrated cultural night in, whether you're watching The Crown or a load of Australians trying to make blancmange for prize money. As with films, there seems to be a desire on the part of creators and streaming services to provide minimal explanation, giving critics and viewers more to debate. Does Lars von Trier put a Samaritans helpline at the end of his films? No. Plus, episode content warnings flag what's coming, and if there's anything TV hates, it's a spoiler.

Like a trip to the cinema, there is an expectation with a streamed show that you're actively choosing to watch it, and so understand what you're in for, rather than stumbling across something stressful while Channel 4 plays in the background, say. But that's not how our low-energy, high-consumption viewing habits work with streaming services. We watch stuff because it's there. Sometimes we don't know what a show will contain until the scenes play out in front of us – but we should be given that information if we might need it. Online, there are greater opportunities for interactivity and episode descriptions: innovative ways of protecting viewers that don't exist at the cinema or on terrestrial television.


"A single word in an episode description gives that viewer the opportunity to turn away or wait to watch it with a friend, and a single line at the end of a show won't bother the average viewer."

But perhaps a secondary part of this problem is that having a sensitive and useful conversation around suicide is incredibly difficult. When I contacted Samaritans for this article, they didn't want to get involved, since drawing attention to suicide in the media is seen as dangerous – the thinking being that more people could end up watching an episode of triggering television than would have otherwise.

Keeling thinks the initial onslaught of online articles wrongly praising 13 Reasons Why was dangerous. "The second lot of pieces, which critiqued how it portrayed suicide, were really important to discuss," she says. "We should be able to talk about suicide because it affects so many people, and we should be able to talk about how seeing it affects us, and how portrayals of it affect us. In fact, those 13 Reasons Why discussions were what meant Netflix eventually added in some support for viewers."

This doesn't mean episodes like this one of Crazy Ex-Girlfriend shouldn't exist – quite the contrary. Mental illness and suicide will continue to be mined for storylines in these increasingly nuanced ways. That’s a crucial part of developing our understanding of mental health and increasing representation onscreen. The framework all this happens within, though, must be developed.


At the end of next year, the BBFC will conduct an Online Attitudes survey that will inform the way it classifies shows for Netflix. "We'll ask 10,000 people if they want the same level of regulation online as offline," the BBFC's Catherine Anderson says. "We'll definitely be touching on 13 Reasons Why and [Netflix's anorexia film] To The Bone, as they've been the subject of such debate."

Lying in bed when you're unwell as a potentially triggering scene fades to black is dangerous. The same interactivity of laptops that could revolutionise trigger warnings also means you're left alone in that state with a waiting search bar. However, a single word in an episode description gives that viewer the opportunity to turn away or wait to watch it with a friend, and a single line at the end of a show won't bother the average viewer. If the creator of the show thinks it's important to flag potentially distressing content, it should remain important all the way down the chain to us.

If you're affected by this article or the themes of the TV discussed, please contact Samaritans.

Netflix did not respond to requests for comment.

@hannahrosewens