A former contracted content moderator at YouTube is bringing a class action lawsuit against the platform, claiming the job gave her severe PTSD, anxiety and depression.
The Reddit co-founder and former board member is in the same position as the rest of us now: reporting abuse from the outside and hoping for the best.
A conservative mindset primed for conspiracy and Twitter's opaque moderation methods are a recipe for scandal.
Moderation can help you reach your goals, but only if you're precise about what it means to be moderate.
A small community of people is dedicated to reviving DeepNude, a program that creates non-consensual nudes of women.
The artist behind this week's first Zuckerberg deepfake has made another, in protest of the platform suppressing his work.
Live Action posted "misinformation related to conspiracies and health," Pinterest said. The group continues to thrive on Facebook, Instagram, and YouTube.
Minds is home to neo-Nazis, and wants its users to help decide what content stays on the site.
YouTube Livestream of Congressional Hearing About ‘Rise of White Nationalism’ Is Filled With White Nationalism
The live comment section of the YouTube live stream of the House Judiciary hearing on hate crimes and white nationalism was filled with hateful trolls.
Google saw an issue with moderating the Christchurch terrorist’s so-called manifesto in part because of its length, telling moderators to mark potential copies or sections as “Terrorist Content” if they were unsure.
“Tune” is an app that uses machine learning to filter what it thinks is “toxic” content, but it’s far from perfect.