Much has been written about Apple's official stance against giving law enforcement an encryption backdoor into its customers' files. And Apple's firm position against a backdoor has been painted as a marketing decision, since it gives people a really good reason to buy Apple devices instead of Android or something else.

On top of that reality is the argument that a backdoor isn't even in law enforcement's best long-term interests: the backdoor that lets them in would also let in cyberthieves and cyberterrorists.

Even though all of that is true and valid, there is a less commonly articulated, and far more compelling, pragmatic argument for why Apple really can't deliver a backdoor. (This should calm those security people who suspect that if Apple ever chose to secretly provide a government backdoor, it would keep saying publicly exactly what it's been saying.)

There's a wonderfully nerdy piece in LawfareBlog about this, and you really should read it in full. Here, though, are the highlights. Apple created something it calls the Cloud Key Vault (CKV) for password backups, but it married the vault to a hardened device called a hardware security module (HSM). That's what foiled the feds: after ten incorrect passcode guesses, the relevant decryption keys are permanently destroyed.
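To make that mechanism concrete, here is a minimal sketch, in Python, of the guess-limit idea the Lawfare piece describes: a vault that wraps a secret under a user passcode and permanently destroys the wrapped material after ten wrong guesses. This is an illustration only, not Apple's implementation. The class name, the key-derivation parameters and the XOR-style wrapping are assumptions made for brevity; the real CKV does this inside tamper-resistant HSM hardware with proper authenticated encryption.

import hashlib
import hmac
import os
import secrets

MAX_ATTEMPTS = 10  # mirrors the ten-guess limit described above

class GuessLimitedVault:
    """Toy model of an escrow vault that self-destructs after too many bad guesses."""

    def __init__(self, passcode: str, secret: bytes):
        # Derive a wrapping key from the passcode with a random salt.
        self._salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 200_000)
        # XOR-wrap the secret (assumed to be at most 32 bytes) for illustration only.
        pad = hashlib.sha256(key).digest()
        self._wrapped = bytes(a ^ b for a, b in zip(secret, pad))
        self._check = hmac.new(key, b"check", "sha256").digest()
        self._attempts_left = MAX_ATTEMPTS

    def unwrap(self, passcode: str) -> bytes:
        if self._wrapped is None:
            raise RuntimeError("Keys destroyed; recovery is impossible.")
        key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 200_000)
        if not hmac.compare_digest(hmac.new(key, b"check", "sha256").digest(), self._check):
            self._attempts_left -= 1
            if self._attempts_left == 0:
                # Permanently erase the wrapped secret, as the HSM's guess counter does.
                self._wrapped = None
                self._check = None
            raise ValueError(f"Wrong passcode; {self._attempts_left} attempts left.")
        pad = hashlib.sha256(key).digest()
        return bytes(a ^ b for a, b in zip(self._wrapped, pad))

# Example: back up a 32-byte key that only the user's passcode can recover.
vault = GuessLimitedVault("123456", secrets.token_bytes(32))
recovered = vault.unwrap("123456")

The whole point of the design is that last branch: once the counter hits zero, there is nothing left for anyone, Apple included, to hand over.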

"In Apple’s scheme, there is no third-party with access—all keys are stored either in an entirely inaccessible hardware device or in the mind of the user. There are private keys involved in initially programming the HSM, which Apple destroys before they come in contact with user secrets. In other words, Apple doesn’t even trust itself with decryption keys, and has gone out of its way to make it physically impossible even for anyone who works for Apple to access them. That’s why this system (assuming you trust Apple’s initial programming, their HSMs, etc.) could be very secure," the story said. "Rather than providing an exceptional access solution, Apple took the radical step of destroying those keys in order to have an acceptable level of protection. And it took these extreme measures for a mere optional backup system for a subset of Apple users’ personal information."

In short, Apple didn't craft a system that delivered merely adequate, consumer-level security. No, it went the route that all proper security should: the developer needs to devise a system that even the developer, with a gun to the head or an $80 billion offer on the table, couldn't get into.

Almost all consumer-level systems are designed to appear secure, but if someone knows what to do (and the developers of that system are presumably in that group), they're breakable. By contrast, a good home security system should have enough redundancies that not even someone who knew that system intimately could break in without being caught.

By the way, there's a reason consumer systems always leave their designers a way around the safeguards. That reason is that consumers, against their own self-interest, insist on it and expect it. If consumers can't remember their passwords, they don't expect their financial accounts to be permanently bricked. They want the vendor to be able to break in.

The fact that Apple crafted a system that its own people couldn't break into says far more than its public statements do.

About Evan Schuman

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for RetailWeek, Computerworld and eWeek, and his byline has appeared in titles ranging from BusinessWeek, VentureBeat and Fortune to The New York Times, USA Today, Reuters, The Philadelphia Inquirer, The Baltimore Sun, The Detroit News and The Atlanta Journal-Constitution.
