UPDATE December 2022
Ever since Apple announced plans to expand child protection in August 2021, the reaction has been swift and fierce. The criticism has nothing to do with protecting children – that, we can all agree, is a good thing. Instead, the reaction has everything to do with how Apple plans to do it: they’re going to scan the images on your phone and report matches to the authorities. Here’s why that should matter to you.
Apple’s Attempt to “Save the Children” Backfires
How are we supposed to decide how we feel about this news of Apple photo scanning on our personal phones?
We’re talking about sensitive stuff like child exploitation, which I’m sure we can all agree is a bad thing. But we’re also talking about a possible corporate invasion of our privacy, which is also alarming. So it’s really hard to formulate an unbiased opinion. On the one hand you have the Electronic Frontier Foundation claiming that Apple is creating a backdoor, and on the other you have Apple’s software chief saying:
In no way is this a backdoor.
Craig Federighi, Apple Senior VP
So here’s a quick recap of what’s going on, and after this I’ll share a crazy personal story that sums up my thoughts.
In early August 2021, Apple announced a child protection feature that will roll out with its next OS update. The aim is to take photos uploaded to iCloud and match them against a database of known CSAM images (CSAM = child sexual abuse material) in order to protect young people. In other words, really bad stuff. If there’s a match of at least 30 images, the case is automatically flagged for human review and then reported to authorities.
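To make the threshold idea concrete, here is a minimal sketch of flag-after-N-matches logic. This is purely illustrative: the hash values and function names are made up, and Apple’s real system uses a perceptual hashing scheme (NeuralHash) plus cryptographic protocols, not a plain set lookup like this.

```python
# Conceptual sketch of threshold-based matching, NOT Apple's actual
# NeuralHash/cryptographic protocol. All names and hashes are hypothetical.

KNOWN_CSAM_HASHES = {"hash_a", "hash_b"}  # stand-in for the known-image database
MATCH_THRESHOLD = 30  # Apple's stated threshold before human review

def count_matches(photo_hashes):
    """Count how many of an account's photo hashes appear in the known set."""
    return sum(1 for h in photo_hashes if h in KNOWN_CSAM_HASHES)

def should_flag_for_review(photo_hashes):
    """An account is flagged only once matches reach the threshold."""
    return count_matches(photo_hashes) >= MATCH_THRESHOLD
```

The point of the threshold is that a single false-positive match never triggers a report; only an accumulation of at least 30 matches does.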
This kind of scanning is already being done on platforms like Google and Facebook, and you might be shocked by the number of reported cases for each platform just last year. I have two boys of my own, and it makes me sick to my stomach to think of a child being abused in this kind of way.
The reason most privacy advocates and Apple critics are crying foul is because, for the first time, this scanning starts on your device, not just when you upload your photos somewhere. And that’s the rub.
The Logic Behind Why This Might be Illegal
The logic goes like this:
I have investment properties in the US that I rent out to generate income. As a landlord, the person whose name is on the deed of the house, I do not have the right, at least in the United States, to enter that property anytime I want. I need to request permission from my renters or else I could be sued for trespassing.
I’m borrowing from Dr. Neal Krawetz who gave this analogy on the Hacker Factor Blog. Apple owns the operating system you’re using on your phone right now, but the issue on the table is whether or not they have a right to suddenly update that operating system to begin searching the contents of your phone.
And contrary to what you might read online right now, this is not a black and white issue. Apple claims that they’re building privacy into the process in a number of ways.
- First, they’re breaking up the scanning process by having your device do half of the work and the cloud do the other half so that there’s a protection between the two.
- Second, they’re matching against two independently run databases, not a single government-affiliated list.
- Third, they’re forcing a manual, human review before any kind of action is taken.
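The second safeguard can be illustrated in a few lines. This is a conceptual sketch only, with made-up hash values; Apple’s actual design enforces this cryptographically (via private set intersection techniques), not with a literal set operation on the device.

```python
# Conceptual illustration of the two-database safeguard, not Apple's
# real implementation. Hash values here are hypothetical placeholders.

db_org_one = {"h1", "h2", "h3"}  # database from one child-safety organization
db_org_two = {"h2", "h3", "h4"}  # database from a second, independent one

# Only hashes present in BOTH independently run databases count as matches,
# so no single organization (or government) can unilaterally add a target.
effective_db = db_org_one & db_org_two
```

The design goal is that a hash appearing in only one organization's list, such as "h1" or "h4" above, never counts as a match.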
For the most part, this seems like a pretty solid solution. I can’t really think of a better way to do it and of all the articles and criticisms I’ve read so far, none have offered a better solution.
For me, there are just a couple of problems with Apple’s communication safety features that are worth pointing out.
Potential Problems with this Scanning Technology
First, even though the scan only happens for those photos that are uploaded to iCloud, it’s a bit unfair because that’s a default setting on your phone. In other words, unless you manually turn it off, your photos are probably syncing to iCloud even if you weren’t aware of it and your photos will be matched against this database.
Second, we’ve got a case here where a search is being made on our personal property without warrant or probable cause. It’s easy to justify this for something as blatant and terrible as child exploitation, but once Apple starts to open that door, it’s going to be very hard for them to say no to other, less honorable requests.
What requests could that possibly be, you ask?
Let me tell you my own story here. While I was living in China’s western region of Xinjiang, which is China’s testing ground for all kinds of surveillance and monitoring technology, one of the things I was constantly on the lookout for was these sidewalk checkpoints.
Local police at these checkpoints would stop people and require that their phones be handed over for inspection. At this point, they would plug into the phone and scan for specific photos, files and videos that they deemed dangerous. If anything was found, you were carted off to the police station immediately. And what files were they looking for exactly? Well, nobody knew, except for the Chinese government, of course. It could have been something genuinely worrisome but it also could have been something as simple as a copy of the Quran.
In the absence of true accountability, whether from the Chinese government or a company like Apple, these kinds of backdoors into our private devices are alarming.
What if China approaches Apple and tells them that in order to do business in the country, they need to start scanning all devices against a database that they provide? Now that they’ve already developed and deployed the technology to do this, they have very little leverage to say no.
There’s a quote from the Electronic Frontier Foundation in Vox that I really liked here:
A thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor.
Electronic Frontier Foundation