Imagine this not-too-unusual scenario: on February 1st, security firms identify a string of sophisticated attacks against prominent firms in the defense industrial base, energy and high tech fields. Within days, subsequent investigation by private firms and the Department of Homeland Security’s US-CERT reveals that the attacks have many elements in common. Most important: they all leverage a previously unknown security hole in a widely used enterprise technology: Microsoft’s SharePoint collaboration software.
Responding to the findings and citing the risk to the data of customers and U.S. citizens, on February 14th DHS orders Microsoft to immediately halt domestic sales of SharePoint licenses until the security hole has been fixed, the fix proven to work, and the patch deployed at all affected SharePoint installations in the U.S. News reports online dub it a “Blue Valentine” for the Redmond, Washington software giant.
Preposterous, isn’t it? But a very similar scenario played out just last month. The only difference: it wasn’t a mission-critical enterprise software platform that was faulty, it was a passenger airplane. Specifically: the FAA grounded all of Boeing’s new 787 “Dreamliners” operated by U.S. carriers until the company could demonstrate that the planes are safe to fly.
The issues facing the Dreamliner aren’t huge – some malfunctioning charging batteries may be prone to smoking and *ahem* fire. There are fuel line issues – at least on one plane – and some issues with a braking system. But the FAA’s move was decisive, and other nations including Japan (the Dreamliner’s biggest market, so far) took notice.
I’m not trying to make some grand analogy between software and aircraft – the two are very different. But I do think that the software industry could learn something from the culture of safety that characterizes the commercial airline industry. And that’s what this post is about.
First, it’s worth noting that things were not always that way. Civil aviation’s “safety culture” is a deliberate creation, not some accident of history. In fact, in its early days, aviation greatly resembled the modern software industry: a space populated by inventors and entrepreneurs, with lots of promise, innovation and experimentation. Think it, build it, try it. Repeat.
As with the modern software industry, government support played a significant role in encouraging this. In the early 1900s, for example, the Smithsonian Institution distributed information on the nascent science of aeronautics to the public as part of its scientific mission. Among the curious Americans who studied the materials: Wilbur and Orville Wright.
There was lots of experimentation both before and after the Wright brothers. We’ve all seen those vintage clips of bulky, poorly engineered planes careening down wooden ramps, then collapsing under their own weight, or briefly taking to the air before plummeting back to earth. Some designs were never destined for success. For decades after the Wright brothers made their first successful flight, the aviation industry was something of a Wild West – unregulated, ungoverned. This was the “barnstorming” era, dominated by small, private airplane owner/operators (many of them World War I vets) who traveled the countryside performing in air shows (or “flying circuses”) and selling rides to the public. It was an era characterized by lots of flair and colorful characters, but also lots of accidents and deaths.
By the 1920s, public sentiment was beginning to change. Reports about airplane crashes and fatalities had the public concerned. More important: aviation industry leaders themselves were becoming concerned that a public perception of airplanes and air flight as dangerous and unreliable was on the rise. The result: at the height of the pro-business, free-wheeling 1920s, a Republican President and small-government conservative, Calvin Coolidge, signed on for a major expansion of federal power with the Air Commerce Act of 1926. That law gave the federal government sweeping powers to ensure the safety of civil aviation.
Under the Act, new rules were created that all but outlawed barnstorming and authorized the Department of Commerce to assess and certify the airworthiness of aircraft, test and license pilots, operate and maintain safe airfields and – importantly – investigate crashes and other safety incidents. The results were clear: the total number of accidents per 100,000 flight hours in the U.S. has averaged around 0.18 for the last decade – a tiny fraction of what it was during the barnstorming days.
Of course, a plan to empower some government bureaucracy to oversee software quality would almost certainly fail. Unlike building aircraft, building software applications is a task that bends to suit a particular need or demand and might never be performed the exact same way twice.
It’s not wrong, however, to think that software vendors might soon be held to account for product failures in the same way that aircraft makers were – and are – called to task for reliability issues with their products. The success of the government’s role in the aviation industry didn’t come from telling Boeing how to make its airplanes. It came from the government telling Boeing and others: if you build this, and it fails, we will investigate, determine the root cause and, if necessary, hold you responsible for fixing the problem and making your customers whole.
No such mandate exists today in the software field, even though plenty of government ink has been spilled writing best practices for software development and deployment. The latest – NIST’s Special Publication 800-53: Security and Privacy Controls for Federal Information Systems and Organizations – is due out in April.
Still, as Veracode has noted, software is very much in the barnstorming or “reckless innovation” stage. We design, build, QA and deploy applications to serve a business need. But we rarely consider slowing that process down to make sure that our code, or that of the third-party vendors who help us along the way, stands up to the highest level of engineering integrity and reliability. As with aviation, the results speak for themselves. Veracode’s State of Software Security report found that security holes like SQL injection and cross-site scripting are common in both internal and third-party code, yet fewer than 20% of enterprises request code-level security audits. And barely a week goes by without news of a critical new software vulnerability that has been discovered or is under attack. Even in the critical infrastructure sector, coding and implementation flaws are common. That, in turn, has created a secondary market for vulnerability and exploit information about ICS and SCADA systems.
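For readers who haven’t seen one up close, the SQL injection flaws mentioned above usually come down to a single bad habit: splicing user input directly into a query string. The sketch below (illustrative only – the table, names and data are invented for this example) shows the vulnerable pattern and the standard parameterized-query fix, using Python’s built-in sqlite3 module:

```python
import sqlite3

# In-memory database with a sample users table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin'), ('bob', 'user')")

def find_user_vulnerable(name):
    # BAD: user input is spliced directly into the SQL string.
    # Supplying  x' OR '1'='1  makes the WHERE clause always true.
    query = "SELECT name, role FROM users WHERE name = '%s'" % name
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # GOOD: a parameterized query treats the input as data, not SQL.
    return conn.execute(
        "SELECT name, role FROM users WHERE name = ?", (name,)
    ).fetchall()

malicious = "x' OR '1'='1"
print(find_user_vulnerable(malicious))  # dumps the entire table
print(find_user_safe(malicious))        # returns no rows
```

The fix costs nothing at runtime, which is part of what makes the flaw’s continued prevalence – two decades after it was first described – so striking.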
The solution, of course, is to foster a culture of security in which the industry sets and maintains high standards, and regulators provide the teeth by raising the cost of not doing security right. Hopefully, that’s an idea that will get off the ground in 2013.