This Disturbing TikTok 'POV' Trend Is All About Domestic Abuse

Teens aren't just miming along to pop songs – they're simulating abuse and violence.
Photo: Jennifer Jordan

One video begins with a teenager cheerfully applying blush and putting on jewellery – before cutting to her staring despondently past the camera, the makeup on her face now simulating bruises and a black eye. The caption: "You go on a date with your abusive boyfriend and it doesn’t end well."

In another, a smiling girl enters a dark room. The description tells us she’s been pushed there to "hook up" with the jock, but soon realises there are multiple boys waiting for her. She lip-syncs with faux nervousness to a song playing over the clip, then acts out trying to escape the room before the video cuts to black. This is life on the unsettling side of "POV" (point of view) videos, one of social video platform TikTok’s most popular trends.

While many other posts in this category are harmless – for example, boys pretending to ask the viewer out on a date, or someone comforting the viewer after noticing they’re crying – these darker videos make for a difficult watch. They’re also hugely popular. One POV video, in which a boy plays a "psychotic boyfriend" who berates you for going out without telling him, has almost 230,000 likes. Another, of a girl who pretends to have shot her boyfriend out of jealousy, has 270,000.

So why is this happening?

TikTok user Jennifer Jordan says raising awareness of domestic abuse was the motivation behind her video, which is the one referenced at the start of this piece.

“Even though a lot of people were upset, I was flooded with even more comments defending and supporting me,” says Jennifer. “People with personal stories were coming out and thanking me for raising awareness of the topic, and that’s exactly what I posted the video for.”

According to 2019 data from the Crime Survey for England and Wales, women aged 20 to 24 are significantly more likely to be victims of domestic abuse than women aged 25 and over – and TikTok's user base skews young, squarely into that higher-risk bracket. Could it be that young people aren't receiving adequate education about consent and abuse, and are taking the job of educating each other into their own hands?

One TikTok user, who asked not to be named, says her video – in which she pretends to have murdered the family of her unfaithful boyfriend – was inspired by “the creativity of other POV makers”, who link the plots of their videos to the lyrics of songs.

“To anyone that may argue POVs could normalise this type of behaviour, I would say that the videos actually help to spread awareness,” she says. “Since people going through abuse may feel alone, having a POV dedicated to their situation prevents them from feeling so isolated.”

Nadia Mahmud, board member of domestic abuse charity Woman’s Trust, believes it's important to teach people from a young age about the types of behaviour that are healthy and unhealthy in relationships. "This," she says, "is so they can spot signs early on and keep themselves, and their friends, safe.”

But while a lack of education might explain why this content exists, it could also be triggering – especially for users who aren't looking out for it. TikTok’s format makes it difficult to protect users – a large portion of whom are children and teenagers – from harmful or potentially triggering content. You can view content on the TikTok app by searching hashtags, keywords, and users, but there is also the For You page, where videos recommended for you are displayed. This functions like a carousel; a new video begins playing as soon as the previous one has finished, and there’s no way of knowing what video might appear next.

After viewing some of the videos in question, a spokesperson for TikTok said that because they’re fictional, the posts do not breach community guidelines. The spokesperson added that a Restricted Mode can be activated in order to filter inappropriate content out of users’ feeds, but would not explain how the app identifies videos as being inappropriate or not.

“The platforms may not be responsible for the actual footage or message, but they facilitate the rapid spread of unmoderated content,” says Nadia. “Perhaps more crucially, by the time it’s been flagged, the damage has already been done.”

Social media platforms have a responsibility to regulate content and give users the option to protect themselves from harmful posts. Perhaps if more young people were able to identify abuse in all of its forms, and were aware of its prevalence in the real world, such disturbing videos wouldn’t be making their way online in the first place. Then again, maybe teens are taking education into their own hands, in their own very online way.