
Apple Introduces Parental Control Feature That Scans Messages for Nudity

The feature will use machine learning to detect nudity in images sent to and by children, but it has some experts concerned.
Apple logo. Image: Drew Angerer/Getty Images

Apple plans to introduce a new feature that would scan messages sent to and by child users of iPhones to determine if the images contain nudity, the company announced on Thursday. The move is a major development in the ongoing debate around privacy and the inspection of communications.

Whereas previous features from other tech giants designed to detect child abuse images compare the cryptographic fingerprint of an image to a list of known child abuse material, the new feature in the Messages app uses machine learning to determine whether any photo likely contains sexual material. Some tech companies already aim to detect nudity in other forms of content: Facebook scans public posts for nudity, with varying degrees of success. Apple's planned feature concerns private messages sent between devices, not publicly shared material.
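To make that distinction concrete, the sketch below contrasts the two approaches in Swift. It is illustrative only: the hash check uses a plain SHA-256 digest as a stand-in for the perceptual fingerprints such systems actually rely on, and `matchesKnownMaterial`, `NudityClassifier`, and `isLikelyExplicit` are hypothetical names, not part of any Apple API.

```swift
import Foundation
import CryptoKit

// Fingerprint matching: an image only triggers a match if its hash appears in a
// list of already-known material. A plain SHA-256 digest stands in here for the
// perceptual fingerprints real systems use.
func matchesKnownMaterial(_ imageData: Data, knownFingerprints: Set<String>) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownFingerprints.contains(fingerprint)
}

// Machine-learning classification: a model scores every image, so pictures the
// system has never seen before can still be flagged. `NudityClassifier` is a
// hypothetical stand-in for Apple's unreleased on-device model.
protocol NudityClassifier {
    func explicitScore(for imageData: Data) -> Double  // 0.0 ... 1.0
}

func isLikelyExplicit(_ imageData: Data,
                      classifier: NudityClassifier,
                      threshold: Double = 0.9) -> Bool {
    return classifier.explicitScore(for: imageData) >= threshold
}
```

The practical difference is that hash matching can only rediscover images that have already been catalogued, while a classifier makes a probabilistic judgment about any image it is shown.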


"Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages," a technical document explaining Apple's new features reads. Apple said in the document that the new feature "will enable parents to play a more informed role in helping their children navigate communication online."

Do you know anything else about a company scanning user content? We'd love to hear from you. Using a non-work phone or computer, you can contact Joseph Cox securely on Signal on +44 20 8133 5190, Wickr on josephcox, OTR chat on jfcox@jabber.ccc.de, or email joseph.cox@vice.com.

The document adds that this image scanning can be applied to images both received and sent by a child. When a suspected sexually explicit image is received, the iPhone will blur the photo and warn the child that it may be sensitive to view, according to screenshots included in the document. The system can also notify the child's parents if the child decides to view the image, as well as if the child tries to send sexually explicit images themselves, the document adds.
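Put in code, the flow the document describes looks roughly like the sketch below. The types and function names here are invented for illustration; Apple has not published an API for this feature.

```swift
// Illustrative only: `ScanResult`, `Direction`, and `handleFlaggedImage` are
// hypothetical names, not Apple's implementation.
enum Direction { case incoming, outgoing }

struct ScanResult {
    let isExplicit: Bool
}

func handleFlaggedImage(_ result: ScanResult,
                        direction: Direction,
                        childChoseToProceed: Bool,
                        blurImage: () -> Void,
                        warnChild: () -> Void,
                        notifyParents: () -> Void) {
    guard result.isExplicit else { return }  // non-explicit images pass through untouched

    switch direction {
    case .incoming:
        blurImage()   // the photo is blurred on the child's device
        warnChild()   // and the child is warned it may be sensitive
        if childChoseToProceed {
            notifyParents()  // parents can be told if the child views it anyway
        }
    case .outgoing:
        if childChoseToProceed {
            notifyParents()  // ...or if the child sends an explicit image
        }
    }
}
```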


The document says this change is coming in an update later this year to accounts set up as families in iCloud, on iOS 15, iPadOS 15, and macOS Monterey.

A screenshot of the new feature. Image: Apple

The move has concerned some security experts.

"Obviously the application presented is a good thing. But this is still an incredibly powerful technology demonstration, showing that even end to end encrypted photos can be subject to sophisticated scanning," Matthew Green, who teaches cryptography at Johns Hopkins University, told Motherboard in an online chat. "This scanning is benign, and when a picture is detected only the parent of the affected individual will be notified. But it shows that Apple is willing to build and deploy this technology. I hope that they will never be asked to use it for other purposes."

An oft-repeated concern from cryptographers and technologists is that tools developed for one purpose may be redeployed or tweaked for something else entirely. Although not directly comparable, in 2016 Reuters reported that Yahoo scanned incoming emails for specific information provided by U.S. intelligence officials; tech giants have scanned content stored in cloud services for known child abuse material for years.

Worries about Apple's new features first arose on Wednesday, when Green tweeted some details about another of Apple's plans, which will compare the fingerprints of images to a list of known child abuse material. The Financial Times then reported some specifics of the planned feature, but did not report on the Messages app scanning in detail.

Nicholas Weaver, senior researcher at the International Computer Science Institute at UC Berkeley, told Motherboard in an online chat that Apple's new Messages feature "seems like a VERY good idea."

"Overall Apple's approach seems well thought through to be effective while maximizing privacy," he added.

Apple did not respond to a specific set of questions about the Messages scanning feature.