Facebook has admitted for the first time that its plans to roll out end-to-end encryption across all its platforms will be a boon for child abusers.
For months, children’s charities, lawmakers, and even Facebook’s own shareholders have been warning about the dangers of the company’s plan, but until now the social media giant has refused to admit it would cause a problem.
That changed this week, when UK lawmaker Yvette Cooper asked Facebook’s head of global policy management to estimate the number of cases that would “disappear” under the planned move to end-to-end encryption.
“I don’t know the answer to that,” Monika Bickert told the House of Commons Home Affairs committee. “I would expect the numbers to go down. If content is being shared and we don’t have access to that content, if it’s content we cannot see, then it’s content we cannot report.”
Facebook currently accounts for 94% of all online child abuse reports, according to the U.S. National Center for Missing and Exploited Children (NCMEC). The agency has estimated that 70% of its reporting — some 12 million reports globally — would disappear under Facebook’s new rules.
Rolling out end-to-end encryption to products like Facebook Messenger and Instagram would make it impossible for the company to automatically detect and report child abuse content as it does today.
“Why on earth—why, seriously, why is Facebook trying to introduce something that will put more children at risk, that will make it harder to rescue vulnerable children? Why are you doing this?” a disbelieving Cooper asked Bickert after the Facebook executive admitted the move would make it easier for child abusers to share content.
When Facebook CEO Mark Zuckerberg announced the plan in 2019, he called it “a privacy-focused vision for social networking” and framed it as a victory for privacy advocates after years of criticism about Facebook’s failure to protect its users’ information.
Bickert repeated those talking points when trying to defend the decision.
“I spent my background as a prosecutor working on cases like violent offenses to children and human trafficking offenses, but I also want to be mindful of all the different types of abuse that we see online,” Bickert said.
“I don’t think there’s a very clear answer on how to keep people the most safe most of the time. This is something also that governments have struggled for as long as I have studied or been aware of it.”
But those working to protect children online say Facebook’s plan to forge ahead with encrypting all its platforms is very dangerous.
“Facebook has repeatedly insisted that their ability to detect child abuse material and grooming will be unaffected — until this week,” Anne Longfield, Children’s Commissioner for England, told VICE News. “Last month I called on any company looking to go end-to-end encrypted to call a halt to its plans until it can guarantee that its ability to protect children will not be impacted. With this admission, Facebook now has no excuse not to pause and urgently revise its plans.”
Facebook defended its plan, telling VICE News that “end-to-end encryption is essential to protect everyone’s privacy and security, including children.”
It also pointed out that even though WhatsApp is fully encrypted, it bans “around 250,000 accounts each month suspected of sharing child exploitative imagery and makes many thousands of proactive reports to NCMEC each year.”
However, the scale of the problem is growing exponentially.
Between 2018 and 2019, the number of files — images, videos, and other material related to child sexual exploitation — reported to the NCMEC’s CyberTipline jumped from 45 million to 69.1 million, an increase of over 50% in just 12 months.
Critics are not convinced by the company’s claims it can combine encryption with continuing to detect child abuse content and instead say that Facebook’s “platforms and design choices put children at risk.”
“[Bickert’s] comments are a remarkable admission of where Facebook’s priorities lie, and it’s abundantly clear child safety is not at the top of the list,” Andy Burrows, who heads up online child safety policy at the National Society for the Prevention of Cruelty to Children (NSPCC), told VICE News.
“Their seemingly cavalier approach to child safety underlines precisely why we need a legal duty of care on tech firms that repeatedly fail to protect their young users.”