Policy groups are asking Apple to abandon plans to scan devices for child abuse imagery

An international coalition of political and civil rights organizations issued an open letter on Thursday asking Apple to “abandon its recently announced plans to build surveillance capabilities for the iPhone, iPad and other Apple products.” The groups include the American Civil Liberties Union, the Electronic Frontier Foundation, Access Now, Privacy International, and the Tor Project.

Earlier this month, Apple announced plans to use new technology in iOS to detect potential child abuse imagery, with the goal of limiting the spread of child sexual abuse material (CSAM) online. Apple also announced a new “communication safety” feature that uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app. Parents of children aged 12 and under can opt to be notified if the child views or sends such an image.
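For illustration only, the flow Apple described could be modeled roughly like the sketch below. Every name in it (`classify_image`, `blur`, `notify_parent`) is a hypothetical placeholder rather than an Apple API; the real feature runs inside Messages with an on-device machine-learning model.

```python
# Conceptual sketch of the "communication safety" flow described above.
# All names are hypothetical stand-ins, not Apple APIs.

def classify_image(image_bytes: bytes) -> bool:
    """Stand-in for an on-device ML classifier that flags sexually
    explicit images. Always returns False in this sketch."""
    return False

def blur(image_bytes: bytes) -> bytes:
    """Stand-in for blurring the image before it is displayed."""
    return image_bytes

def notify_parent(child_id: str) -> None:
    """Stand-in for the opt-in parental notification."""
    print(f"parent of {child_id} notified")

def handle_incoming_image(image_bytes: bytes, child_id: str, child_age: int) -> bytes:
    # The classifier runs entirely on the device; nothing leaves it.
    if classify_image(image_bytes):
        image_bytes = blur(image_bytes)   # shown blurred, behind a warning
        if child_age <= 12:               # notification only for younger children
            notify_parent(child_id)
    return image_bytes
```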

“While these features are intended to protect children and reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the groups wrote in the letter.

Apple’s new “Child Safety” page details the plans, which call for on-device scanning before an image is backed up to iCloud. The scanning does not occur until a file is being backed up to iCloud, and Apple says it only receives data about a match if the cryptographic vouchers (uploaded to iCloud along with the image) for a given account meet a threshold of matches against known CSAM. Apple and other cloud providers have used hash systems to scan for CSAM sent via email, but the new program would apply the same scans to images stored in iCloud, even if the user never shares or sends them to anyone else.
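To make that distinction concrete, here is a deliberately simplified, hypothetical model of threshold-based hash matching. It is not Apple’s actual protocol, which uses a perceptual “NeuralHash” and cryptographic safety vouchers so that matches below the threshold stay unreadable; plain SHA-256 hashes and a counter stand in here purely for illustration, and the threshold value is illustrative.

```python
import hashlib

# Simplified, hypothetical model of threshold-based hash matching.
# NOT Apple's protocol: the real system uses a perceptual hash and
# cryptographic vouchers instead of plain hashes and a counter.

KNOWN_CSAM_HASHES: set[str] = set()   # hash list provided by child-safety organizations
MATCH_THRESHOLD = 30                  # illustrative value only

def image_hash(image_bytes: bytes) -> str:
    # A cryptographic hash for illustration; a perceptual hash would
    # also match visually similar (e.g., resized) images.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(images: list[bytes]) -> bool:
    """Return True only if the number of matching images for this
    account meets the threshold; below it, nothing is revealed."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

In Apple’s design, the comparison happens on the device against a blinded hash database, and the “reveal only above a threshold” property is enforced cryptographically rather than by trusting a counter like the one above.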

In response to concerns about possible misuse of the technology, Apple followed up by saying it would limit its use to CSAM detection and would “not accede to any government’s request to expand it.”

Much of the opposition to the new measures has focused on the on-device scanning feature, but the civil rights and privacy groups said the plan to scan and blur images in children’s iMessages could endanger children and would break iMessage’s end-to-end encryption.

“Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit,” the letter states.
