President Obama held his nose when he signed the National Defense Authorization Act of 2013 (NDAA) earlier this month. There was, after all, a lot to dislike in the $633 billion bill, which funds the U.S. Department of Defense. Among other things, it continued support for the Guantanamo Detention Center and green-lit the military detention of U.S. citizens suspected of being “enemy combatants” on U.S. soil. So offensive were these provisions that the President used a withering signing statement to call them out.

As with most complicated legislation, however, there is also something to like in the NDAA - especially on the issue of cyber security. Specifically, the 2013 NDAA brings a sea change in the DoD’s approach to security - one that could begin to dismantle the culture of lax application security that has been the norm in the federal government for much of the last three decades.

The section worth reading here is Section 933, which establishes a “baseline software assurance policy” for the DoD. Specifically, Section 933 authorizes the Under Secretary of Defense for Acquisition, Technology, and Logistics, along with the DoD’s CIO, to “develop and implement a baseline software assurance policy for the entire lifecycle of covered systems.” That includes “appropriate automated vulnerability analysis tools in computer software code” during the “entire lifecycle of a covered system.” In other words: testing during development, deployment and maintenance for DoD software. Software vendors who want a piece of the DoD’s massive IT budget will have to adhere to the same standard, showing the DoD that they, too, have a system for identifying and remediating vulnerabilities in their products.
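The “automated vulnerability analysis tools” that Section 933 calls for are, at their simplest, static analyzers: programs that read source code and flag risky constructs before the software ever ships. As a heavily simplified sketch of the idea - the rule names, patterns, and function below are invented for illustration, not anything the NDAA or the DoD specifies - a pattern-based scanner might look like:

```python
import re
from pathlib import Path

# Illustrative rules: the kinds of patterns real static-analysis
# tools flag, reduced to simple regular expressions.
RULES = {
    "use of eval()": re.compile(r"\beval\s*\("),
    "shell command execution": re.compile(r"\bos\.system\s*\("),
    "hard-coded password": re.compile(r"password\s*=\s*['\"]"),
}

def scan_source(path: str) -> list[tuple[int, str]]:
    """Return (line_number, rule_name) findings for one source file."""
    findings = []
    for lineno, line in enumerate(Path(path).read_text().splitlines(), start=1):
        for rule_name, pattern in RULES.items():
            if pattern.search(line):
                findings.append((lineno, rule_name))
    return findings
```

Production tools go far beyond text matching - building call graphs, tracing tainted data, and checking binaries - but the lifecycle requirement is the same: run the analysis at development, deployment and maintenance, and track findings until they are fixed.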

The change this represents can’t be overstated. For years, respected authorities on computer security, like Alan Paller of The SANS Institute, urged the U.S. government and military to tackle the fundamental issue of code quality. Uncle Sam, Paller said, should use his massive purchasing power to force private sector firms like Microsoft and Oracle to produce more secure products. But little came of it. Instead, the DoD focused its energies and dollars on what it knew best: defending against external enemies. The 2012 edition of the NDAA, for example, made no mention of software assurance, while throwing dollars at problems like advanced threat research and that old D.C. saw: ‘improved information sharing’ between the DoD and the Department of Homeland Security.

In the meantime, “advanced persistent threats” (a.k.a. Chinese hackers) ran roughshod over government networks - often using vulnerabilities in ubiquitous software like Adobe Reader and Internet Explorer to get a foothold on government and defense systems, then moving laterally from low-value to high-value assets. And while some vendors (notably Microsoft) have made great strides toward improving the overall security of their products, many others have not - or have paid lip service to secure coding while continuing to do business as usual.

Section 933 is all about putting an end to the nuisance of APT - not with a magic bullet, but by draining the swamp of insecure and vulnerable software that nurtures attacks.

Will it work? It’s hard to know. The DoD budget has more than doubled since President Clinton was in office, but so has waste: a 2009 GAO report found almost $300 billion in cost overruns in a review of 96 major defense acquisition programs. And the “cyber” components of the bill add up to just a dozen or so pages in a 681-page piece of legislation. Besides, as we’ve discussed on this blog, security vulnerabilities lurk everywhere - from shoddy application code, to third-party code of “unknown origin,” to the third-party applications and systems that make up your software supply chain.

Still, there is room for hope. Among the requirements of the new NDAA is the creation of a report to Congress that will make recommendations on how to improve software assurance. Part of that is the development of ideas about “how the Department might hold contractors liable for software defects and vulnerabilities” in their products. Assuming Congress acts on some or all aspects of that report, the cost of doing ‘business as usual’ in the software world may get much higher.

About Paul Roberts

Paul Roberts is an experienced technology writer and editor who has spent the last decade covering hacking, cyber threats, and information technology security, including senior positions as a writer, editor and industry analyst. His work has appeared on NPR’s Marketplace Tech Report and in The Boston Globe and Fortune Small Business, as well as on ZDNet, Computerworld, InfoWorld, eWeek, CIO and CSO. He was, yes, a guest on The Oprah Show - but that’s a long story. You can follow Paul on Twitter or visit his website, The Security Ledger.

