Apple reveals new efforts to combat child abuse


Apple confirmed at a briefing on Thursday afternoon its previously reported plans to introduce new technology in iOS, macOS, watchOS, and iMessage that detects potential child abuse imagery, and clarified important details of the ongoing project. New versions of iOS and iPadOS, due out in the U.S. this fall, will include “new applications of cryptography to help limit the spread of CSAM [child sexual abuse material] online, while designing for user privacy.”

The project is also detailed on a new “Child Safety” page on Apple’s website. Its most invasive and potentially controversial element is a system that scans images on-device before they are backed up to iCloud. According to the description, the scan does not take place until a file is being backed up to iCloud, and Apple only receives match information if the cryptographic safety vouchers for a particular account (uploaded to iCloud along with the image) cross a threshold of known CSAM matches.
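
As a rough illustration of that threshold mechanism, the sketch below models a per-account gate that accumulates match vouchers and reveals nothing until a preset count is crossed. It is a toy model only: the type names (SafetyVoucher, AccountMatchGate) are hypothetical, the plaintext isMatch flag stands in for information that Apple’s design keeps cryptographically hidden, and nothing here reflects Apple’s actual implementation.

```swift
import Foundation

// Toy model of threshold-gated reporting: the server accumulates per-account
// match vouchers but reveals nothing until the number of matches crosses a
// preset threshold. Apple's real design relies on cryptographic techniques;
// this only illustrates the gating behaviour, not the cryptography.

struct SafetyVoucher {
    let imageID: String
    let isMatch: Bool          // hypothetical plaintext flag; hidden in the real design
    let encryptedPayload: Data
}

final class AccountMatchGate {
    private let threshold: Int
    private var matchedVouchers: [SafetyVoucher] = []

    init(threshold: Int) { self.threshold = threshold }

    /// Record a voucher; return payloads only once the match threshold is met.
    func ingest(_ voucher: SafetyVoucher) -> [Data]? {
        if voucher.isMatch { matchedVouchers.append(voucher) }
        guard matchedVouchers.count >= threshold else { return nil } // below threshold: learn nothing
        return matchedVouchers.map { $0.encryptedPayload }
    }
}

let gate = AccountMatchGate(threshold: 3)
for i in 1...4 {
    let voucher = SafetyVoucher(imageID: "img-\(i)", isMatch: true, encryptedPayload: Data())
    let revealed = gate.ingest(voucher)
    print("after img-\(i): revealed \(revealed?.count ?? 0) payload(s)")
}
```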

For years, Apple has used hash systems to scan for child abuse imagery sent over email, in line with similar systems at Gmail and other cloud providers. The program announced today applies the same scanning to user images stored in iCloud Photos, even if those images are never sent to another user or otherwise shared.
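
The matching pattern itself is conceptually simple: compute a hash for each image and look it up in a set of known hashes. The Swift sketch below uses plain strings as stand-in hashes purely to show that lookup; a real system would use a perceptual hash so that re-encoded or slightly altered copies of the same image still match, and none of the names here are Apple API.

```swift
// Stand-in for an image whose (hypothetical) perceptual hash has already been computed.
struct HashedImage {
    let filename: String
    let hash: String
}

/// Return the subset of images whose hashes appear in the known-CSAM hash set.
func matches(in images: [HashedImage], against knownHashes: Set<String>) -> [HashedImage] {
    images.filter { knownHashes.contains($0.hash) }
}

let knownHashes: Set<String> = ["a1f3", "77c0"]   // stand-in blocklist of known hashes
let library = [
    HashedImage(filename: "holiday.jpg", hash: "9b2e"),
    HashedImage(filename: "flagged.jpg", hash: "77c0"),
]
print(matches(in: library, against: knownHashes).map(\.filename)) // ["flagged.jpg"]
```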

In a PDF published alongside the briefing, Apple justified its image-scanning measures by describing several restrictions that are built in to protect privacy:

• Apple does not learn anything about images that do not match the known CSAM database.

• Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.

• The risk of the system incorrectly flagging an account is extremely low. In addition, Apple manually reviews all reports made to NCMEC to ensure reporting accuracy.

• Users can’t access or view the database of known CSAM images.

• Users can’t identify which images the system has flagged as CSAM.

The new details build on concerns that leaked earlier this week, but they also add a set of safeguards meant to guard against the privacy risks of such a system. In particular, the threshold system ensures that isolated errors will not generate alerts, allowing Apple to target an error rate of one false alert per trillion users per year. The hashing system is also limited to material flagged by the National Center for Missing and Exploited Children (NCMEC), and to images uploaded to iCloud Photos. Once an alert is generated, it is reviewed by Apple and NCMEC before law enforcement is notified, providing an additional safeguard against the system being misused.
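
To see why a threshold drives the account-level false alert rate so far down, one can treat per-image false matches as independent events and compute the chance that a single account accumulates enough of them. The per-image false match rate, library size, and threshold below are made-up numbers chosen for illustration; the article only states Apple’s one-in-a-trillion per-account target.

```swift
import Foundation

// Back-of-the-envelope illustration: probability that an account with n images,
// each falsely matching with probability p, reaches at least t matches.
// All parameters are hypothetical, not Apple's.

func probAtLeast(_ t: Int, trials n: Int, p: Double) -> Double {
    func logChoose(_ n: Int, _ k: Int) -> Double {
        lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
    }
    return (t...n).reduce(0.0) { acc, k in
        acc + exp(logChoose(n, k) + Double(k) * log(p) + Double(n - k) * log(1 - p))
    }
}

let p = 1e-6        // hypothetical per-image false match rate
let n = 100_000     // hypothetical photo library size
print(probAtLeast(1, trials: n, p: p))  // ~0.095: a single false match is not that rare
print(probAtLeast(10, trials: n, p: p)) // ~2e-17: ten independent false matches effectively never occur
```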

Apple commissioned technical assessments of the system from three independent cryptographers (PDFs 1, 2, and 3), who found it to be mathematically sound. “I think this system will likely significantly increase the likelihood that people who own or traffic in such images (harmful users) are found; this should help protect children,” said professor David Forsyth, chair of computer science at the University of Illinois, who wrote one of the assessments. “The accuracy of the matching system, together with the threshold, makes it very unlikely that images that are not known CSAM images will be revealed.”

However, Apple said other child safety groups are likely to be added as hash sources as the program expands, and the company did not commit to making the list of partners publicly available going forward. That is likely to heighten concerns about how the system might be exploited by the Chinese government, which has long sought greater access to iPhone user data in the country.

Warning messages to children and parents when sexually explicit images are detected.
Photo: Apple

In addition to the new measures in iCloud Photos, Apple added two further systems to protect young iPhone owners at risk of child abuse. The Messages app now performs on-device scanning of image attachments in children’s accounts to detect potentially sexually explicit content. When such content is detected, the image is blurred and a warning is displayed. A new setting that parents can enable on their family iCloud account will also trigger a message telling the child that if they view (incoming) or send (outgoing) the detected image, their parents will receive a notification about it.
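
A minimal sketch of that flow is below, with placeholder types for the classifier verdict and the family setting (none of this is Apple API): the flagged image is blurred and the child is warned, and the parent notification applies only if the family setting is enabled and the child chooses to view or send the image anyway. The same logic covers both incoming and outgoing attachments.

```swift
// Placeholder for the parental controls applied to a child's iCloud account.
struct FamilySettings {
    let isChildAccount: Bool
    let notifyParentsOnExplicitContent: Bool
}

// What the Messages UI should do with a given attachment.
struct AttachmentDecision {
    let blur: Bool
    let warnChild: Bool
    let notifyParentsIfChildProceeds: Bool
}

/// Decide how to present an attachment given a (hypothetical) on-device classifier result.
func handleAttachment(isSexuallyExplicit: Bool, settings: FamilySettings) -> AttachmentDecision {
    guard isSexuallyExplicit, settings.isChildAccount else {
        return AttachmentDecision(blur: false, warnChild: false, notifyParentsIfChildProceeds: false)
    }
    // Flagged content: blur and warn; parents are notified only if the setting is on
    // and the child views or sends the image anyway.
    return AttachmentDecision(blur: true,
                              warnChild: true,
                              notifyParentsIfChildProceeds: settings.notifyParentsOnExplicitContent)
}

let settings = FamilySettings(isChildAccount: true, notifyParentsOnExplicitContent: true)
print(handleAttachment(isSexuallyExplicit: true, settings: settings))
// AttachmentDecision(blur: true, warnChild: true, notifyParentsIfChildProceeds: true)
```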

Apple is also updating how Siri and Search respond to queries about child abuse imagery. Under the new system, the apps will “explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”
