Craig Federighi clarifies the details of Apple’s child safety scanning system


Apple’s Craig Federighi says the system that will scan iCloud Photos for child sexual abuse material (CSAM) includes “multiple levels of auditability.” In an interview with The Wall Street Journal, Federighi, Apple’s senior vice president of software engineering, offered new details about the company’s controversial child safety features, including the claim that performing the scan on iPads and iPhones themselves will help security experts verify that Apple is using the system responsibly.

Like many companies with cloud storage services, Apple checks iCloud Photos against a list of known CSAM from the National Center for Missing and Exploited Children (NCMEC), looking for exact matches with known CSAM images. But unlike many services, it performs the search on the device rather than entirely remotely. “Imagine someone was scanning images in the cloud. Well, who knows what’s being scanned for?” Federighi said, referring to remote scans. “In our case, the database is shipped on the device. People can see, and it’s a single image across all countries.”
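To make that distinction concrete, the following is a minimal, hypothetical Swift sketch of the kind of on-device check Federighi describes: a database of known hashes shipped with the operating system and consulted locally before upload. Apple’s actual system uses its NeuralHash perceptual hash and private set intersection rather than the SHA-256 stand-in used here, and the type and function names below are illustrative assumptions, not Apple’s code.

import Foundation
import CryptoKit

// Hypothetical sketch: the list of known hashes ships with the OS, so the
// comparison happens on the device before a photo is uploaded to iCloud.
// SHA-256 is only a stand-in; Apple's real system uses a perceptual hash
// (NeuralHash) combined with private set intersection.
struct OnDeviceMatcher {
    // Hash database bundled with the operating system, identical in every country.
    let knownHashes: Set<String>

    // Returns true if the photo's hash appears in the bundled database.
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}

// Example with placeholder data standing in for a real photo.
let matcher = OnDeviceMatcher(knownHashes: [])
print(matcher.matches(photoData: Data("example photo bytes".utf8)))   // false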

Federighi elaborated a little on how this is meant to reassure people that Apple won’t dramatically expand the database to include material beyond illegal CSAM, particularly in countries with restrictive censorship policies.

“We ship the same software in China with the same database that we ship in America, as we ship in Europe. If someone were to come to Apple [with a request to scan for data beyond CSAM], Apple would say no. But let’s say you aren’t confident. You don’t want to just rely on Apple saying no. You want to be sure Apple couldn’t get away with it if we said yes,” he told the Journal. “There are multiple levels of auditability, and so we’re making sure that you don’t have to trust any one entity, or even any one country, as far as what images are part of this process.”

Apple has previously said it will roll out the system only in the United States and that it will consider launching in other countries on a case-by-case basis. The company confirmed to The Verge that it will ship the hash database of known CSAM with the operating system in all countries, but that it will only be used for scanning in the US. The Journal further clarifies that an independent auditor will be able to verify the images involved.

Federighi also offered more detail about when the scanning system will notify an Apple moderator of potentially illegal content. Apple has previously said that a single match won’t trigger a red flag, a measure designed to prevent false positives. Instead, the system generates “safety vouchers” for each match and alerts Apple only if their number reaches a certain threshold. Apple has declined to disclose the exact threshold, saying that could allow abusers to evade detection, but Federighi said it is “something on the order of 30 known child pornographic images.”
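As a rough illustration of that threshold behaviour, here is a hypothetical Swift sketch in the same spirit: each match produces a voucher, and nothing is surfaced for review until the count crosses the roughly-30-image threshold Federighi cited. In Apple’s real design the vouchers are cryptographic (threshold secret sharing), so the company cannot read them below the threshold; the plain counter and names below are assumptions made only to show the idea.

// Hypothetical sketch of the threshold described above. Apple's actual
// system uses cryptographic "safety vouchers" and threshold secret sharing;
// a plain in-memory counter is used here only to illustrate the behaviour.
struct SafetyVoucher {
    let photoIdentifier: String
}

struct ThresholdReporter {
    let threshold: Int            // Federighi cited "on the order of 30"
    var vouchers: [SafetyVoucher] = []

    // Records a voucher for a matched photo and reports whether the account
    // has now crossed the review threshold.
    mutating func record(matchFor photoIdentifier: String) -> Bool {
        vouchers.append(SafetyVoucher(photoIdentifier: photoIdentifier))
        return vouchers.count >= threshold
    }
}

// Example: the first 29 matches trigger nothing; the 30th crosses the threshold.
var reporter = ThresholdReporter(threshold: 30)
for i in 1...29 {
    _ = reporter.record(matchFor: "photo-\(i)")
}
print(reporter.record(matchFor: "photo-30"))   // true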

Some security experts have cautiously praised Apple’s system, acknowledging the importance of finding CSAM online. But many have criticized the abrupt rollout and the lack of clarity about how the system works. In his interview with the Journal, Federighi acknowledged the confusion. “It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” he said.
