Timothy J. Aveni explicitly linked his resignation to Facebook’s refusal to act on posts from Trump that called for violence.
"Is it the right approach to deplatform these individuals? Is the right approach to try and engage with these individuals? How should we be thinking about this? What actually works?"
After streams of Valve’s game Artifact drew almost no viewers, some users started their own meme streams. Then other content seeped in.
YouTube left the video online for over two days, allowing it to generate tens of thousands of views and spread to other channels.
The videos on Facebook and Instagram show sections of the raw Christchurch attack footage, and variations continue to thwart Facebook's moderators and technology.
It took 29 minutes for a Facebook user to first report the Christchurch terrorist’s livestream. Now a machine learning system can spot weapons in a stream with over 90 percent confidence.
Following a Motherboard investigation, Facebook banned white nationalism and white separatism. But Twitter and YouTube, two platforms with their own nationalism problems, won’t commit to following Facebook’s lead.
After a civil rights backlash, Facebook will now treat white nationalism and separatism the same as white supremacy, and will direct users who try to post that content to a nonprofit that helps people leave hate groups.
Google struggled to moderate the Christchurch terrorist’s so-called manifesto in part because of its length, telling moderators to mark potential copies or sections as “Terrorist Content” if they were unsure.
On Friday, at least 49 people were killed in terror attacks in New Zealand. Documents, sources, and interviews with senior Facebook employees show how difficult it is for social media companies to moderate live footage.
Facebook's former head of training talks about how the company decides whether a person is cut out to look at hateful, violent, and graphic content all day.
Internal Facebook documents obtained by Motherboard show specific steps and strategies taken by the company to fight content moderation issues that may spike during an election season.