In the early 1960s, cars were unsafe. And the car industry’s attitude was: cars are just unsafe, and that’s the risk you take.

But then the public started calling attention to the issue (with some help from Ralph Nader), refusing to simply accept that risk, and things started changing. Regulations emerged, car manufacturers started building safety in, and we now have seatbelts, airbags, and dramatically fewer deaths from car accidents. In fact, the fatality rate per vehicle mile of travel dropped by an astounding 81 percent from 1960 to 2012.

The prevalent attitude about software security today mirrors the car-safety attitude of the 1960s: a defeatist belief that software is inherently unsafe, that we can't make it any safer, and that we just have to live with the risk. But as with car safety in the '60s, we should fight this defeatism and work to make a change. Safer cars didn't emerge overnight, and safer applications won't either.

Granted, it's not a perfect analogy, and in many ways making software secure is harder than making cars safe, but simply accepting the status quo is not an option. There are ways to make code more secure, and we can make incremental progress. Success, though, hinges on several parties playing their role: you can do everything possible to make a car safe, but the driver still needs to be smart. Here are some incremental steps different roles should take in the pursuit of safer software:

Enterprises

Want more secure applications? Start with your developers. Get them trained on secure coding – with eLearning, instructor-led training or a combination. You wouldn’t expect a structural engineer to begin work on a bridge without understanding how to reduce the risk of failure; we need to start thinking about software engineering tasks the same way.

Beyond developer training, consider:

  • Not setting the bar too high: An overly ambitious application security policy will thwart developer adoption. Start with the most critical vulnerabilities, then expand slowly over time.
  • Getting developers the right tools: Look for an application security solution that integrates security testing early in the development process and automates it (a minimal sketch of what that can look like follows this list).
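
To make "early and automated" concrete, here is a minimal sketch of a build gate, assuming a Python codebase and the open-source Bandit static analyzer. Any SAST tool with a command-line interface can be wired in the same way:

```python
"""A minimal sketch of an automated security gate, assuming a Python
codebase and the open-source Bandit static analyzer (pip install bandit).
Any SAST tool with a command-line interface can be wired in the same way."""
import subprocess
import sys


def run_security_gate(source_dir: str = "src") -> int:
    # -r scans the tree recursively; -ll limits the report to medium- and
    # high-severity findings, so the gate starts with the most critical issues.
    result = subprocess.run(
        ["bandit", "-r", source_dir, "-ll"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    # Bandit exits non-zero when findings meet the severity threshold,
    # which fails the build and surfaces the issue before it ships.
    return result.returncode


if __name__ == "__main__":
    sys.exit(run_security_gate())
```

Note how the severity threshold mirrors the advice above: start with the most critical vulnerabilities, then raise the bar over time.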

Educational institutions

Speaking of educating developers, let’s move that initiative back even earlier. A recent study found that none of the top 10 computer science or engineering programs require cybersecurity courses for students to get their degree – and that three of those schools offer no cybersecurity courses at all.

Universities need to offer cybersecurity courses, and not as something separate from the computer science curriculum. Secure coding is becoming an increasingly critical skill, and it needs to be part of how young developers learn to code.

Government

This is not really the government's problem to solve, and trying to impose regulations on software development is neither feasible nor desirable, but the government does have a role to play. In fact, it has already started to play one in cybersecurity, with some success.

I see the government’s application security role as information collection and sharing, specifically the proactive and targeted sharing of information about vulnerabilities and breaches.

For instance, the FBI recently noticed that many healthcare organizations were being attacked through a vulnerability in JBoss and sent out an alert recommending that they check for the vulnerability and patch if necessary.
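
As an illustration of what "check for the vulnerability" can mean in practice, here is a rough first-pass probe for the JBoss management endpoints commonly cited in that wave of attacks. The host name is hypothetical, and a real assessment should follow the official remediation guidance rather than this simplified check:

```python
"""An illustrative first-pass check for exposed JBoss management endpoints.
The host below is hypothetical; a real assessment should follow the
vendor's official remediation guidance."""
import requests

# Classic JBoss interfaces that should never be reachable from the outside.
SUSPECT_PATHS = ["/jmx-console/", "/invoker/JMXInvokerServlet", "/web-console/"]


def check_jboss_exposure(base_url: str) -> list:
    exposed = []
    for path in SUSPECT_PATHS:
        try:
            resp = requests.get(base_url + path, timeout=5)
        except requests.RequestException:
            continue  # endpoint unreachable; nothing to flag
        # A 200, or a 401 prompting for (often default) credentials,
        # suggests the management interface is reachable and needs lockdown.
        if resp.status_code in (200, 401):
            exposed.append(path)
    return exposed


if __name__ == "__main__":
    for path in check_jboss_exposure("http://internal-app-server:8080"):
        print(f"Potentially exposed JBoss endpoint: {path}")
```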

The government can also use its buying power to hold its vendors accountable. Why should tax dollars go to software vendors that choose to use third-party components with known vulnerabilities? If the government holds its vendors accountable for more secure software, then small businesses and consumers who don’t have this leverage will benefit once the software is improved.
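
Accountability like this is practical to verify, because known vulnerabilities in third-party components are public data. As a sketch, the query below uses the OSV.dev vulnerability database API to list published advisories for a component; the package and version are just illustrative examples:

```python
"""A sketch of checking a third-party component against public advisories,
using the OSV.dev vulnerability database API. The package and version here
are illustrative; an old Django release is used because its advisories are
well documented."""
import requests


def known_vulns(name: str, version: str, ecosystem: str = "PyPI") -> list:
    resp = requests.post(
        "https://api.osv.dev/v1/query",
        json={"version": version, "package": {"name": name, "ecosystem": ecosystem}},
        timeout=10,
    )
    resp.raise_for_status()
    # OSV returns an empty object when no advisories match.
    return resp.json().get("vulns", [])


if __name__ == "__main__":
    for vuln in known_vulns("django", "1.11.0"):
        print(vuln["id"], "-", vuln.get("summary", "(no summary)"))
```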

Insurance companies

Cyberinsurance is a controversial topic, but I actually have faith that the industry will mature to the point where we can set a bar for minimum controls that need to be in place to protect an organization’s data. We won’t ever reach a point where we can hold organizations responsible for eliminating all risk, but I think we can expect organizations to put measures in place to reduce that risk to the best of their ability. What would these measures look like? How about at least eliminating easy-to-remediate and easy-to-exploit vulnerabilities that even a few teenagers could take advantage of?

The recent major TalkTalk breach resulted in the theft of names, addresses, birth dates and financial information for potentially all of the company's 4 million customers, and cost TalkTalk £60m and 95,000 customers. The damage was caused by a SQL injection vulnerability, and several teenagers are the prime suspects. This is the kind of unacceptable risk that we should be insured against. I'm not suggesting the NSA shouldn't be able to break into your systems, but let's at least set the bar at keeping out a 16-year-old with a tool downloaded from the Internet.
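
For readers who haven't seen one, here is the class of flaw behind that breach in miniature, using Python's built-in sqlite3 module with a toy table. The vulnerable version splices user input directly into the SQL string; the fix, a parameterized query, is one line:

```python
"""SQL injection in miniature, using Python's built-in sqlite3 module.
The table and data are toy examples."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, card TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', '4111-1111-1111-1111')")


def lookup_vulnerable(name: str):
    # Splicing input into the query lets "' OR '1'='1" rewrite the SQL
    # and dump every row in the table.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()


def lookup_safe(name: str):
    # Parameterized query: the input is bound as data, never parsed as SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()


payload = "' OR '1'='1"
print("vulnerable:", lookup_vulnerable(payload))  # prints every row
print("safe:", lookup_safe(payload))              # prints an empty list
```

The fix costs one line of code, which is exactly why leaving this class of bug in production software is so hard to defend.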

Just as public pressure shifted our thinking about car safety, we need to shift our thinking about software security: our old cybersecurity paradigms aren't going to work in today's digital landscape. Get details on what application security looks like in this landscape in our new guide, From Ad Hoc to Advanced Application Security: Your Path to a Mature AppSec Program.

Chris Wysopal, co-founder and CTO of Veracode, is recognized as an expert and a well-known speaker in the information security field. He has given keynotes at computer security events and has testified on Capitol Hill on the subjects of government computer security and how vulnerabilities are discovered in software. His opinions on Internet security are highly sought after and most major print and media outlets have featured stories on Mr. Wysopal and his work. At Veracode, Mr. Wysopal is responsible for the security analysis capabilities of Veracode technology.
