Last week, with little advance warning, Apple announced that the iPhone has a new set of tools designed to protect children from abuse. Siri now provides resources to people who search for child abuse material or want to report it. iMessage now flags nude images sent to or received by children under 13 and warns their parents. Photos backed up to iCloud Photos are now compared against a database of known child sexual abuse material (CSAM), and accounts are reported to the National Center for Missing and Exploited Children (NCMEC) if more than a certain number of images match. And that matching process doesn’t happen only in the cloud: part of it happens locally, on your phone. That’s a big change from how things normally work.
Apple argues that doing part of the scanning on your phone actually makes the process more private. But it is a very big line to cross: the iPhone’s operating system now has the ability to look at your photos and compare them against a database of illegal content, and you cannot remove that capability. And while we might all agree that adding this capability is justified in the fight against child abuse, there are huge questions about what happens when governments around the world, from the UK to China, ask Apple to match other kinds of images: terrorist content, images of protests, photos of dictators looking silly. Demands like these are made routinely around the world. And until now, none of this happened on the phone in your pocket.
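To make the "report only after a certain number of matches" idea concrete, here is a minimal sketch in Python. It is only an illustration of threshold-based matching against a database of known fingerprints: Apple's actual system uses a perceptual hash (NeuralHash) with cryptographic blinding and threshold secret sharing, none of which is implemented here. The `REPORT_THRESHOLD` value and all function names are hypothetical.

```python
import hashlib

# Hypothetical threshold: only when MORE than this many images match
# the known database does anything get flagged. The real value used
# in Apple's system is not represented here.
REPORT_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint: a plain SHA-256 of the raw image bytes.
    (A real system would use a perceptual hash robust to resizing
    and re-encoding, not an exact cryptographic hash.)"""
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(images: list[bytes], known_hashes: set[str]) -> int:
    """Count how many images match the known-fingerprint database."""
    return sum(fingerprint(img) in known_hashes for img in images)


def should_report(images: list[bytes], known_hashes: set[str]) -> bool:
    """A single match is never enough; the account is flagged only
    once the match count exceeds the threshold."""
    return count_matches(images, known_hashes) > REPORT_THRESHOLD
```

The point of the threshold design is that isolated matches (including false positives) reveal nothing; only an accumulation of matches beyond the threshold triggers a report.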
To unpack all of this, I asked Riana Pfefferkorn and Jennifer King to join the show. They are both Stanford researchers: Riana specializes in encryption policy, while Jen specializes in privacy and data policy. Jen has also worked on child safety issues at large technology companies in the past.
I think that for a company with as much power and influence as Apple, shipping a system that changes an important part of our relationship with our personal devices deserves thorough and frequent explanation. I hope the company explains more about what it is doing, and soon.
The following excerpt has been lightly edited for clarity.
It seems that one huge part of this whole controversy is that the scanning happens, at some point, on the device. That is the Rubicon being crossed: until now, the local computer did not scan local storage in any way. Once your data hits the cloud, all sorts of scanning happens. It’s contentious, but it happens.
But we haven’t yet gotten to the point where law enforcement can force a company to run local scans on your phone or computer. Is that the big bright line that’s causing all the trouble?
Riana Pfefferkorn: I see this as a paradigm shift, taking scanning that used to happen in the cloud, where you make a choice and say, “I’m going to upload these photos to iCloud.” That data is held in the hands of a third party. You know the saying, “it’s not a cloud; it’s just someone else’s computer,” right?
You assume you are taking on some kind of risk when you do that: that the data can be scanned, that it can be hacked, whatever. Whereas moving the scanning down onto the device, even though for now it only applies to photos going to iCloud, feels very different: it intrudes into a more private space that, until now, we could take for granted would stay that way. So I think it’s a really big conceptual change.
It is not only a conceptual change in how people think about this, but also a change from a legal standpoint. There is a big difference between information you hand to a third party, accepting the risk that they turn around and report it to the police, and what you keep in the privacy of your home or your pocket or wherever.
I see it as a big change.
Jen King: I would add that part of the dissonance here is that Apple had just rolled out the “Ask App Not to Track” prompt. The underlying controls already existed, but Apple made the dialog box visible, asking you, when you open an app, whether you want that app to track you. It feels a little dissonant that they just introduced that feature, and then suddenly we have this thing on the phone that looks almost more invasive.
That said, as someone who has been researching mobile privacy for almost a decade, I would say these phones were never entirely ours to begin with, especially with third-party apps siphoning off your data, which has been a feature of this ecosystem for some time. This is a paradigm shift. But maybe it’s a paradigm shift in the sense that there were areas of the phone we thought had more protections, and now they have fewer than before.
The idea that you have ever been able to control the data on your phone has been an illusion for most people for some time.
The idea of a local phone with a network stack that talks to a server and comes back, that’s almost a 1990s conception of connected devices, right? In 2021, everything in your house is talking to the internet all the time, and the line between client and server has blurred to the point where we market the networks themselves. We market 5G networks not just on speed but on capability, whether that’s true or not.
But this blurriness between client, server, and network means that consumers may expect a kind of privacy on local storage that they no longer expect of cloud storage. I wonder whether this is actually a line we crossed, or whether it’s only because Apple announced this feature that we now insist there should be a line there.
RP: That’s a great point, because a lot of people are responding the way people say, “If the election goes the wrong way, I’m moving to Canada,” by saying, “I’m just going to abandon Apple devices and switch to Android instead.” But Android devices are basically just a local version of your Google cloud. That’s no better.
And sure, you can de-Google Android, [although] I don’t want to run some de-Googled build of Android that I sideloaded from a sketchy site. But we’re talking about the possibility that people simply don’t understand the different ways the different architectures of their phones work.
I have said before that people’s privacy rights and freedom of expression should not depend on a consumer choice they made at some point in the past. Whether the information on their phone is really theirs or really lives in the cloud should not be path-dependent forever after.
But you’re right that as the boundary blurs, it becomes harder to reason about these distinctions from a distance, and harder for ordinary people to understand them and make choices accordingly.
JK: Privacy should not be left to market choice. I think this is, for the most part, a market failure across the entire industry. One of the assumptions of the internet in the early 2000s was that privacy could be a competitive value, and we do see a few companies competing on it; DuckDuckGo in search comes to mind. But the bottom line is that many aspects of privacy should not be left to the market.
Full transcript coming soon.