Apple VP acknowledges concerns about new scanning feature in internal memo


Apple's upcoming feature that will scan iOS devices for child abuse images is an “important task,” an Apple software vice president wrote in an internal memo. First reported by 9to5Mac, the memo from Sebastian Marineau-Mes acknowledges that some people are “concerned about the consequences” of the new protections, but says the system “maintains Apple’s deep commitment to user privacy.”

As part of its expanded child protections, Apple plans to scan images on iPhones and other devices before they are uploaded to iCloud. If the system finds an image that matches one in the database maintained by the National Center for Missing and Exploited Children (NCMEC), a human reviewer at Apple will check the image to determine whether it contains child sexual abuse material. If it does, NCMEC will be notified and the user’s account will be disabled.
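To illustrate the general shape of this kind of pre-upload hash-matching check, here is a minimal sketch in Python. Apple’s actual system reportedly uses a perceptual “NeuralHash” with a blinded on-device matching protocol, none of which is shown here; the plain SHA-256 lookup, the `KNOWN_HASHES` set, and the function names below are illustrative assumptions, not Apple’s implementation.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the NCMEC-provided database of known hashes.
# In a real deployment this would not be a plaintext set on the device.
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: Path) -> str:
    """Return a hex digest of the raw image bytes (illustrative only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def should_flag_for_review(path: Path) -> bool:
    """Before upload, flag the image if its hash appears in the database."""
    return image_hash(path) in KNOWN_HASHES
```

One limitation worth noting: a cryptographic hash like SHA-256 only matches byte-identical files, which is why systems of this kind rely on perceptual hashes that tolerate resizing and recompression.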

The announcement raised concerns among privacy advocates, who questioned how Apple could prevent the system from being exploited by bad actors. The Electronic Frontier Foundation said in a statement that “it is impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children,” and that the system, even if well designed, “breaks key promises of messenger encryption and opens the door to wider abuse.”

According to 9to5Mac, Marineau-Mes wrote in the memo that the project involved a “deep operational commitment” across the company, one that “provides tools to protect children, but also maintains Apple’s deep commitment to user privacy.”

Apple did not immediately respond to a request for comment on Friday.
