
     As you’ve likely heard by now, the FBI has been able to break into the iPhone used by the shooters in the December 2015 San Bernardino attack on county health workers. But it has done so without Apple’s help after all. That in and of itself is likely an interesting story, as has been the public debate as to whether Apple or the FBI (or none of the above…) holds the higher moral ground.

     Not discussed much in the media (with the exception of The Register (UK)) is precisely what was being asked of Apple and how it relates to the particular iPhone in question. Back in February, Trevor Pott penned a thoughtful, interesting article zeroing in on exactly who was asking for what, why that matters, and, in Mr. Pott’s opinion, why Apple was wrong.

     First, though, Mr. Pott explained that he fully supports individual privacy. And like many of us, he questions whether government agencies can be trusted to follow the law, uphold civil liberties and, importantly, avoid “mission creep.”

     That said, though, Mr. Pott still believes that Apple is wrong. For one, he thinks that Apple mischaracterized the issue by claiming that the FBI was seeking a “backdoor.” As Mr. Pott explains it, in the realm of encryption, a backdoor would be either a key escrow system or a “master key” system that allows easy access for law enforcement into any Apple product encrypted with the back-doored system. The request for ‘custom code’ that would allow the FBI to perform a brute-force attack against the iPhone without triggering the “10 strikes and the phone is wiped” protection is “a completely different animal” according to Mr. Pott.
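To see why the distinction matters, here is a toy sketch of the mechanism in question. This is not Apple’s actual implementation — real iPhones entangle the passcode with a hardware key and enforce escalating delays — but it illustrates why a short passcode is safe only so long as the wipe-after-10-failures counter is in force, and trivially brute-forceable once that protection is bypassed. The passcode and hashing scheme below are invented for illustration.

```python
# Toy model (NOT Apple's real design) of a passcode check guarded by a
# "10 strikes and the phone is wiped" counter, and of the brute-force
# attack that becomes possible once that counter is disabled.

import hashlib

SECRET_PASSCODE = "7291"  # hypothetical 4-digit passcode
STORED_HASH = hashlib.sha256(SECRET_PASSCODE.encode()).hexdigest()

def check(passcode: str) -> bool:
    """Simulate the device verifying a passcode attempt."""
    return hashlib.sha256(passcode.encode()).hexdigest() == STORED_HASH

def brute_force(max_attempts=None):
    """Try every 4-digit passcode in order.

    If max_attempts is set, stop there — modeling the device wiping
    itself after that many failed tries.
    """
    for n in range(10000):
        if max_attempts is not None and n >= max_attempts:
            return None  # the device would have erased itself by now
        guess = f"{n:04d}"
        if check(guess):
            return guess
    return None

# With the 10-strike limit in place, the attack almost certainly fails:
print(brute_force(max_attempts=10))  # None
# With the limit removed — what the FBI was asking for — it succeeds:
print(brute_force())                 # 7291
```

Only 10,000 candidates exist for a 4-digit passcode, so once the retry limit is gone the search completes almost instantly — which is why the "custom code" request, while not a backdoor in the key-escrow sense, still defeats the device's protection.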

As Mr. Pott sees it, the true issue revolves around a design flaw:


There is no “backdoor” involved here. What appears to be involved is a design flaw. Something about the iPhone 5C in question is broken. Either it is possible to load a compromised firmware into the phone despite the fact that the phone is locked, or it is possible to read the data off the flash chips and attack it in a VM until the password is brute-forced.

Either way, it would appear to your correspondent that Apple screwed up when designing this device and it left open a means of attack. The judge is asking Apple to use its expertise to exploit this flaw. It’s as simple as that.


Not discussed in the FBI/Apple debate is a potential silver lining:

As far as I [Mr. Pott] am concerned, the judge is in the right here. Apple is not being ordered to create a flaw and distribute it to all devices. It [Apple] is not being prevented from fixing this flaw in future devices. It is being asked to exploit a flaw that currently exists, and for the privacy-conscious this is actually a good thing.

Mr. Pott’s article leaves us with a great deal of food for thought (and of course, we like how he mentions ‘the little guy’…):

It’s time to see some market forces benefiting the little guy for a change. Competition amongst device makers should be building us unassailable devices. Competition amongst lawmakers should be building compromises that meet society’s many needs without setting us up for a repeat of history’s worst mistakes. I wonder if either group is up to the challenge.