My initial reaction to Apple fighting the court order to unlock the terrorist’s iPhone was “Good for Apple. Anything that could reduce my privacy is bad!” And the breathless news reports, with all their talk of backdoors and such, kept me thinking that way.
But then I started reading articles in the more technical press, and I saw that the issue is a lot more nuanced (and way more nuanced than Donald Trump’s absurd call for a boycott; didn’t he learn anything when people called for boycotts of his own enterprises?).
It seems that the FBI isn’t asking Apple to unlock the phone, but to give it the tools with which the FBI can unlock the phone (and only this phone). They want Apple to make a special version of iOS, tailored to that one device, that drops the policy of erasing the phone after 10 failed PIN entries, and that accepts PIN entries immediately rather than enforcing significant delays after failed attempts. This would allow the FBI to unlock the iPhone quickly by brute force, trying all 10,000 possible four-digit PINs at several attempts per second (via a debugging interface).
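To put numbers on it, here’s a back-of-the-envelope sketch in Python. The `try_pin` callback and the 5-attempts-per-second rate are my assumptions for illustration; the FBI’s actual tooling and attempt rate were never made public.

```python
# Back-of-the-envelope brute force over a 4-digit PIN space.
# The try_pin callback and the 5 attempts/second rate are assumptions
# for illustration; the real tooling and rate weren't public.

PIN_SPACE = 10_000        # four digits: 0000 through 9999
ATTEMPTS_PER_SEC = 5      # assumed rate once the retry delays are disabled

def brute_force(try_pin):
    """Try every four-digit PIN until one unlocks the device."""
    for candidate in range(PIN_SPACE):
        pin = f"{candidate:04d}"
        if try_pin(pin):
            return pin
    return None

if __name__ == "__main__":
    secret = "4271"  # pretend device PIN, for the demo only
    found = brute_force(lambda pin: pin == secret)
    print(f"Recovered PIN: {found}")
    minutes = PIN_SPACE / ATTEMPTS_PER_SEC / 60
    print(f"Worst case at {ATTEMPTS_PER_SEC}/sec: about {minutes:.0f} minutes")
```

Even at that modest assumed rate, the worst case is about half an hour. The erase-after-10-failures policy and the escalating delays are the only things standing between a four-digit PIN and that math.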
So once I learned that, I thought it wouldn’t be the end of the world if Apple complied: this custom version of iOS would only work on that one phone, and if a bad guy got ahold of it and modified it to work on another phone, the digital signature would no longer match, so it couldn’t be loaded onto that other phone (an iPhone will only boot an iOS image that carries a valid signature from Apple). So Apple wouldn’t be creating a terrible security hole for governments, corporations, criminals, etc. to exploit.
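To see why tampering breaks the signature, here’s a minimal sketch of device-locked code signing, using Ed25519 from the Python `cryptography` package purely as an illustration. Apple’s actual scheme (personalized signing of firmware tied to a device’s unique chip ID) is more involved; the `ecid` field below is my stand-in for that device identifier.

```python
# Minimal sketch of device-locked firmware signing (illustration only;
# Apple's real boot-chain verification differs in detail).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

signer = Ed25519PrivateKey.generate()   # stands in for Apple's signing key
verifier = signer.public_key()          # baked into the phone's boot chain

def sign_firmware(image: bytes, ecid: bytes) -> bytes:
    """Sign the firmware image bound to one device's unique ID."""
    return signer.sign(image + ecid)

def device_will_boot(image: bytes, ecid: bytes, sig: bytes) -> bool:
    """The boot chain refuses any image whose signature doesn't verify."""
    try:
        verifier.verify(sig, image + ecid)
        return True
    except InvalidSignature:
        return False

firmware = b"custom iOS without the retry limits"
sig = sign_firmware(firmware, b"ECID-0001")

print(device_will_boot(firmware, b"ECID-0001", sig))         # True
print(device_will_boot(firmware, b"ECID-0002", sig))         # False: wrong phone
print(device_will_boot(firmware + b"!", b"ECID-0001", sig))  # False: tampered image
```

The point is that changing either the image or the device it targets invalidates the signature, and only the holder of the signing key can produce a new one.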
And then what I think is the real problem hit me: if Apple shows they can (help) unlock this iPhone, they will be inundated with court orders to unlock iPhones. Every time someone disappears, the police will ask a court to force Apple to unlock their phone to see whether they’d been messaging anyone suspicious. If grandma dies, her relatives will want her phone unlocked in case it holds photos or other information they want. And you know there will be judges who will grant such orders. So if Apple goes along just this once, in this seemingly reasonable case, then Apple (and Samsung and Google and everyone else) will have to do it every day. The floodgates will open and the court orders will never end. And that’s not a business Apple or any of them wants to be in.
So how do we solve this? Maybe there are compelling reasons a phone needs to be unlocked, and this case is as good an example as any. But I don’t think a court order is the answer. As Apple says, we need a public discussion, and we probably need new laws that define exactly when unauthorized unlocking is appropriate (I say “unauthorized” because the unlocking is done without the permission of the person who set the passcode). That could head off the stream of arbitrary court orders. If the lawful reasons for unlocking are broad enough, then Apple can just set up a side business that unlocks iPhones for $10,000 or $50,000. That price would deter common street criminals from using it (nobody pays $10,000 to get into a stolen phone), but if the information on the phone is that important, then it’s totally worth it.