In August 2015, 24-year-old reporter Alison Parker was murdered on live television, along with her cameraman, Adam Ward. Video from the broadcast, as well as footage from a GoPro worn by the gunman, has been shared across YouTube thousands of times by conspiracy theorists and by users who want to draw views with the shocking, gruesome footage.
Today, Alison's father, Andy Parker, filed a complaint with the Federal Trade Commission against Google and YouTube, claiming that the way the platform has handled videos of his daughter's death—and many other videos of brutal shootings and murders—deceives consumers.
Although these videos clearly violate YouTube's community guidelines, which forbid images of graphic violence and death, many of them remain on YouTube for years.
"YouTube claims that it polices its platform for these violent and disturbing videos, when in truth it requires victims and their families to do the policing—reliving their worst moments over and over in order to curb the proliferation of these videos," the filing states. "In Mr. Parker’s case, even videos of his daughter’s murder that were uploaded on the day of her death—nearly five years ago—and have been reported repeatedly since then, remain on the site to this day."
“We specifically prohibit videos that aim to shock with violence, or accuse victims of public violent events of being part of a hoax," Google told Motherboard in a statement. "We rigorously enforce these policies using a combination of machine learning technology and human review and over the last few years, we’ve removed thousands of copies of this video for violating our policies.”
According to the filing, Andy Parker can't bear to watch the hundreds or thousands of videos of his daughter's moment of death. YouTube requires flaggers to document specific timestamps noting where in a video the violence happens, along with written descriptions of that violence. A group of volunteers, led by Lenny Pozner, the father of one of the children murdered in the Sandy Hook school shooting, helps scan the platform for these videos and report them to YouTube.
Much like Pornhub's moderation practices, YouTube requires victims of abuse, harassment, and violence to seek out and flag the content depicting the most traumatic moments of their lives in order to try to get it taken down. Also like Pornhub, YouTube removes videos that violate copyright—lest the company get sued for infringement. For grieving parents trying to scrub the site of their children's deaths, the process seems to be much more cumbersome.
This article originally appeared on VICE US.