Perceived security threats motivate IT people the same way they do everyone else. People react to how much a threat scares them, which often has little relation to how dangerous that threat actually is.

Consider rank-and-file U.S. citizens and their fears of terrorism. The potential damage from a terrorist attack is horrendous, but many consumers consider terrorists a far bigger threat than burglars, even though the chance of being victimized in a terror attack is minuscule compared to the chance of being burglarized. In short, because the devastation threatened by terrorists is so horrific, it blinds people to the statistical probabilities.

What brought this to mind was an interesting piece last week about attempts to limit how much corporate and private information government investigators can access during law enforcement investigations. Corporate IT executives—along with their CFO, COO and CEO bosses—are far more willing to jump into action securing data from government agents than from, say, garden-variety cyberthieves. That is true even though, statistically, the cyberthief is a much more likely attacker. It's also true that cyberthieves never bother to play by the rules, abide by court orders or submit to any oversight.

And yet, enterprises that have tolerated security holes and lax procedures for years are suddenly jumping to attention when they hear of government investigative efforts. A story this week in The Washington Post talked about mobile information companies that are opting to jettison data rather than save it—true heresy in technology startup circles—all because of fears of government probes demanding data access.

Personally, I see this all as very good news. That's because I want companies to take data security—and all security—as seriously as possible. And if bogeyman fears of federal law enforcement trying to access customer records are what it takes to push them into action, so be it.

Once all of the hysteria dies down and pragmatic voices become dominant, it will become clear that neither of these approaches—deleting petabytes of data so it can't be subpoenaed or setting up massive encryption that the government can't break—is sustainable, practical or even that desirable.

Silicon Valley is fueled by data, and there are simply too many dollars to be made from data mining for companies to throw that data away just to keep it out of law enforcement's hands. Also, data mining analytics are constantly improving, so even data that can't be effectively monetized today will almost certainly be extremely valuable in the near future.

The other effort, the one aimed at limiting the data the government can examine, also is ultimately pointless. In the same way that an exterminator must be able to go where the mice can go (for you trivia buffs, that's a great line stolen from the pilot episode of Mission Impossible), investigators must be able to examine any place terrorists might communicate. Mark six areas as off-limits to investigators and that is precisely where the bad guys will go every time. Hence, selecting and announcing where investigators can't probe isn't realistic.

The idea of making encryption so secure that the government can't crack it is also unrealistic. In the famed U.S. Justice Department effort to force Apple to help the FBI break Apple's own encryption, Justice never saw Apple as the only way to get to the desired content. Apple was, instead, the most cost-effective route.

The NSA has the best code-breakers in the world, and they could easily have cracked Apple's consumer-grade encryption. What Justice wanted was a fast and low-cost way to do this so it could be applied to tens of thousands of cases—at will.

A logical meeting of the minds would be to let government investigators pursue whatever they want to pursue—bad news for global bad guys—but to impose serious hurdles on getting access to that data, which should be good news for corporations wanting to protect their customers. Well, at least to protect their customers enough to keep those customers from suing the corporation. (In the boardroom, altruism only goes so far.)

As I've noted before, court orders demanding encryption access from companies must be a last resort. The government must exhaust any and all less-intrusive means (hello, NSA) before it puts these kinds of demands on data companies. Second, the demands would have to be used only for serious national-security efforts, or you'll have speeding tickets prosecuted by accessing a consumer's geolocation history. ("If you don't speed, Mr. Smith, how did you drive from San Jose to Los Angeles in two hours last Tuesday?")

And any law-enforcement agency making such demands must face a strict ceiling on the number of requests it can make, which will force it to self-regulate a bit in choosing which cases justify the effort.

In time, this will all work itself out. In the meantime, though, if the threat of government snooping gets C-levels to take data-protection seriously, I'm happy.

About Evan Schuman

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for RetailWeek, Computerworld and eWeek, and his byline has appeared in titles ranging from BusinessWeek, VentureBeat and Fortune to The New York Times, USA Today, Reuters, The Philadelphia Inquirer, The Baltimore Sun, The Detroit News and The Atlanta Journal-Constitution.
