His Daughter's Murder Keeps Circulating on YouTube, So He's Filing an FTC Complaint

Reporter Alison Parker's father Andy is filing a complaint with the FTC for the way YouTube mishandled and misrepresented the process of getting videos of his daughter's murder scrubbed from the platform.
Alison Parker and Adam Ward memorial.
Image via Getty Images

In August 2015, 24-year-old reporter Alison Parker was murdered on live television, along with her cameraman, Adam Ward. Video from the broadcast, as well as video from a GoPro worn by the gunman, has been shared across YouTube thousands of times by conspiracy theorists and by users hoping to draw views with the shocking, gruesome footage.

Today, Alison's father, Andy Parker, filed a complaint (warning: contains disturbing images) with the Federal Trade Commission against Google and YouTube, claiming that the way the platform has handled videos of his daughter's death—and many other videos of brutal shootings and murders—deceives consumers.


Despite clearly violating YouTube's community guidelines, which forbid images of graphic violence and death, many of these videos remain on YouTube for years.

The complaint alleges that YouTube misrepresents how much violent content is on the site, and fails to tell users that the responsibility to get videos of victims' deaths taken down falls on them—and most often, on the families of the victims themselves. Even when someone goes through the long, painstaking process of finding and reporting one of these videos, YouTube often doesn't remove it. The filing calls these practices "deceptively burdensome" and says the site "utterly fails" to follow through on promises to take down content in clear violation of its own terms of use.

"YouTube claims that it polices its platform for these violent and disturbing videos, when in truth it requires victims and their families to do the policing—reliving their worst moments over and over in order to curb the proliferation of these videos," the filing states. "In Mr. Parker’s case, even videos of his daughter’s murder that were uploaded on the day of her death—nearly five years ago—and have been reported repeatedly since then, remain on the site to this day."

“We specifically prohibit videos that aim to shock with violence, or accuse victims of public violent events of being part of a hoax," Google told Motherboard in a statement. "We rigorously enforce these policies using a combination of machine learning technology and human review and over the last few years, we’ve removed thousands of copies of this video for violating our policies.”


According to the filing, Andy Parker can't bear to watch hundreds or thousands of videos of his daughter's moment of death. YouTube requires flaggers to document specific timestamps noting where in the video the violence happens, along with written descriptions of that violence. A group of volunteers, led by Lenny Pozner, the father of one of the children murdered in the Sandy Hook school shooting, helps scan the platform for these videos and report them to YouTube.

Much like Pornhub's moderation practices, YouTube requires victims of abuse, harassment, and violence to seek out and flag the content depicting the most traumatic moment of their lives in order to try to get it taken down. Also like Pornhub, YouTube removes videos that violate copyright—lest the company get sued for infringement. For grieving parents trying to scrub the site of their children's deaths, the process seems to be much more cumbersome.

Update: After we published this story, a Google spokesperson emailed us, taking issue with our claim that "Despite these videos clearly being against YouTube's community guidelines, which forbids images of graphic violence and death, many of these videos remain up on YouTube for years."

The spokesperson said that videos that violate its Community Guidelines are removed from YouTube, but that the company makes exceptions for material with sufficient "EDSA" (educational, documentary/news, scientific, or artistic) value—for example, news coverage that includes such footage.

We then sent Google three links to YouTube videos included in the FTC complaint: One conspiracy video alleging the shooting was a hoax, and two others that showed the video of the shooting. All have been hosted on YouTube since 2015.

Shortly after, YouTube told us it removed those videos for violating its Community Guidelines. When we asked why the videos have been allowed on YouTube until we sent the company the links, the spokesperson referred us back to its original statement, included above.