Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, the Financial Times reports. The new system could help law enforcement in criminal investigations, but it could also open the door to increased legal and government demands for user data.
The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. NeuralMatch, which was trained using 200,000 images from the National Center for Missing and Exploited Children, will roll out first in the United States. Photos will be hashed and compared against a database of known images of child sexual abuse.
“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are flagged as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
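The flow described above (hash each photo, compare it against a database of known images, and escalate only after a certain number of matches) can be illustrated with a simplified sketch. This is not Apple's actual algorithm: the hash function, the threshold value, and the function names below are all hypothetical placeholders, and a real system would use a similarity-preserving perceptual hash rather than a cryptographic one.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for the hashing step. A cryptographic hash is used here only
    # for illustration; it matches exact bytes, not visually similar images.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images: list[bytes], known_hashes: set[str],
                 threshold: int = 2) -> bool:
    # Count how many photos match the database; the account is flagged only
    # once the match count reaches the threshold. The real threshold is not
    # public; 2 is an arbitrary placeholder.
    matches = sum(1 for img in images if image_hash(img) in known_hashes)
    return matches >= threshold
```

The threshold step matters for the design: requiring several matches before any decryption, rather than acting on a single hit, reduces the impact of an occasional false-positive hash collision.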
Matthew Green, a professor and cryptographer at Johns Hopkins University, raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
“Even if you believe Apple won’t allow these tools to be misused [crossed fingers emoji] there’s still a lot to be concerned about,” he added. “These systems rely on a database of ‘problematic media hashes’ that you, as a consumer, can’t review.”
Apple already checks iCloud files against known child abuse imagery, as do all the other major cloud providers. But the system described here would go further, allowing access to local storage. It would also be trivial to extend the system to crimes other than child exploitation, which is particularly worrying given Apple’s extensive business in China.
The company briefed some US academics on the plan this week, and Apple may say more about the system “as soon as this week,” according to two security researchers who were told about Apple’s earlier meeting, the Financial Times reports.
Apple has previously touted the privacy protections built into its devices, and it famously opposed the FBI when the agency wanted it to build a backdoor into iOS to access an iPhone used by one of the shooters in the 2015 San Bernardino attack. The company did not respond to a request for comment on the Financial Times report.