As some subreddits reopen following protests against Reddit's new API fees, mods are allowing porn in their communities.
'Take It Down,' a new initiative by the National Center for Missing and Exploited Children, lets minors create hashes of non-consensual images on their own devices, without uploading the images themselves.
Researchers have found that racist tweets targeting footballers have been allowed to remain on the platform despite being reported to Twitter.
As hate speech surges and the future of the platform seems to be in the balance, the people who helped advise Twitter on user safety say they don’t know what lies ahead.
A new report examining the YouTube consumption of the Buffalo shooter shows how integral the video streaming site was to his preparations.
Twitter removed an account dedicated to spreading fake nudes of cosplayers, but it's a longstanding problem for the community.
A list of more than 160 companies shows who reported the most child sexual abuse content, including Facebook, Twitter, and Pornhub's parent company, MindGeek.
Following bans from payment processors and the mass deletion of most of its content, Pornhub announced new details of its improved trust and safety policies.
A former contract content moderator at YouTube is bringing a class action lawsuit against the platform, claiming the job gave her severe PTSD, anxiety, and depression.
The Reddit co-founder and former board member is in the same position as the rest of us now: reporting abuse from the outside and hoping for the best.
A conservative mindset primed for conspiracy and Twitter's opaque moderation methods are a recipe for a scandal.