Apple shares more information about its iCloud Photos child safety scanning tools


Apple has released more information about its upcoming plans to scan iCloud Photos for child sexual abuse material (CSAM) via users’ iPhones and iPads. The company published a new paper detailing the safeguards it hopes will increase user trust in the initiative. That includes a rule to only flag images found in multiple child safety databases with different government affiliations, which in theory stops one country from adding non-CSAM content to the system.

Apple’s upcoming iOS and iPadOS releases will automatically match US-based iCloud Photos accounts against known CSAM from a list of image hashes compiled by child safety groups. While many companies scan cloud storage services remotely, Apple’s device-based scanning strategy has drawn sharp criticism from some encryption and privacy experts.

The paper, called “Security Threat Model Review of Apple’s Child Safety Features,” hopes to allay privacy and security concerns around the rollout. It builds on a Wall Street Journal interview with Apple software engineering chief Craig Federighi, who outlined some of the information this morning.

In the document, Apple says it will not rely on a single government-linked database, such as that of the US-based National Center for Missing and Exploited Children (NCMEC), to identify CSAM. Instead, it will only match images that appear in the databases of at least two groups with different national affiliations. The goal is that no single government could have the power to secretly insert unrelated content for censorship purposes, since that content would not match hashes in any other database.

[Image: Venn diagram illustrating that only hashes present in multiple child safety databases are used for matching. Image: Apple]
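
As a rough illustration of that overlap rule, here is a minimal Swift sketch. It is not Apple’s actual implementation, which operates on blinded, encrypted NeuralHash databases; the types, provider names, and plain string hashes below are all hypothetical.

```swift
// Simplified sketch of the "multiple databases" rule: only hashes vouched for
// by at least two child safety databases with different national affiliations
// are eligible for on-device matching. Data and types are illustrative only.

struct HashDatabase {
    let provider: String      // e.g. "NCMEC" (hypothetical labels)
    let jurisdiction: String  // e.g. "US"
    let hashes: Set<String>   // perceptual hashes, shown here as plain strings
}

/// Keep only hashes present in databases from at least two distinct jurisdictions.
func eligibleHashes(from databases: [HashDatabase]) -> Set<String> {
    var jurisdictionsPerHash: [String: Set<String>] = [:]
    for db in databases {
        for hash in db.hashes {
            jurisdictionsPerHash[hash, default: []].insert(db.jurisdiction)
        }
    }
    return Set(jurisdictionsPerHash.filter { $0.value.count >= 2 }.keys)
}
```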

Apple has previously mentioned the possible use of multiple child safety databases, but until today it had not explained the overlap system. In a call with reporters, Apple said it is only naming NCMEC because it has not yet finalized agreements with other groups.

The paper confirms a detail Federighi mentioned: initially, Apple will only flag an iCloud account if its system identifies 30 images as CSAM. This threshold was chosen to provide a “steep margin of safety” against false positives, the paper says, and as Apple evaluates the system’s performance in the real world, “we can change the threshold.”
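
To make the threshold rule concrete, here is a much-simplified sketch. The function name and plain string hashes are illustrative, and it ignores the cryptographic machinery Apple has described (safety vouchers, private set intersection, threshold secret sharing), under which the server learns nothing about matches until the threshold is crossed.

```swift
// Very simplified illustration of the 30-match threshold described above.
// The real system does not count matches in the clear on a server like this.

let matchThreshold = 30  // initial value Apple cites; the paper says it may change

/// Hypothetical helper: counts how many of an account's photo hashes appear in
/// the eligible CSAM hash set and reports whether human review should trigger.
func shouldFlagAccount(photoHashes: [String], csamHashes: Set<String>) -> Bool {
    let matchCount = photoHashes.filter { csamHashes.contains($0) }.count
    return matchCount >= matchThreshold
}
```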

It also offers more information on the auditing system Federighi mentioned. Apple’s list of known CSAM hashes will be baked into iOS and iPadOS worldwide, even though the scanning system will initially launch only in the United States. Apple will provide a full list of hashes that auditors can check against the child safety databases, another way of making sure it is not secretly matching additional images. Furthermore, it says it will “reject all requests” for moderators to report “any non-CSAM material” found in flagged accounts, a reference to the possibility of this system being used for other kinds of surveillance.
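
One way to picture that audit is a digest check: derive a single fingerprint over the hash list that ships with the OS and compare it with a value Apple publishes. The derivation below (SHA-256 over a sorted, newline-joined list) is purely an assumed format for illustration, not something Apple has specified.

```swift
import CryptoKit
import Foundation

// Sketch of the kind of check a third-party auditor could run on the shipped
// hash list: reduce it to one digest and compare against a published value.
// The canonical encoding used here is an assumption made for this example.

func digestOfShippedHashList(_ hashes: [String]) -> String {
    let canonical = hashes.sorted().joined(separator: "\n")
    let digest = SHA256.hash(data: Data(canonical.utf8))
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Usage (hypothetical values):
// let matches = digestOfShippedHashList(shippedHashes) == publishedRootHash
```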

Federighi acknowledged that Apple had created “confusion” with its announcement last week. But Apple has stood behind the update itself; it told reporters that while it is still finalizing and iterating on details, it has not changed its launch plans in response to last week’s criticism.
