A co-worker passed along this snapshot taken at the Karsten Nohl, Jake Appelbaum, and Dino Dai Zovi talk at HOPE this past weekend. The context, of course, is that the overzealous Debian developer who accidentally crippled OpenSSL back in 2006 said he did so because valgrind reported uninitialized memory use. Click through for the full-size version.
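
For anyone who hasn't followed the story, the pattern at issue looks roughly like the sketch below. This is a minimal illustration, not the actual OpenSSL md_rand.c code: an intentionally uninitialized stack buffer gets mixed into an entropy pool as "free" extra randomness, and valgrind's memcheck flags the undefined bytes once they influence later computation.

```c
/*
 * Minimal sketch, NOT the actual OpenSSL code: an uninitialized stack
 * buffer is deliberately XORed into an entropy pool as cheap extra
 * randomness. Valgrind's memcheck tracks the undefined bytes and
 * reports them once they affect program behavior (the branch in main).
 */
#include <stdio.h>

static unsigned char pool[256];

static void mix_in_stack_garbage(void)
{
    unsigned char buf[32];                /* intentionally left uninitialized */
    for (size_t i = 0; i < sizeof buf; i++)
        pool[i % sizeof pool] ^= buf[i];  /* undefined bytes enter the pool */
}

int main(void)
{
    mix_in_stack_garbage();
    /* Branching on a pool byte is where memcheck reports
     * "Conditional jump or move depends on uninitialised value(s)". */
    puts((pool[0] & 1) ? "odd" : "even");
    return 0;
}
```

The infamous Debian patch silenced that class of warning, but it also removed the line that fed real seed material into the pool, leaving the process ID as essentially the only source of variability in generated keys.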

So automated software review is dangerous now? Perhaps that bullet should read "modifying code you don't understand is dangerous."

About Chris Eng

Chris Eng, vice president of research, is responsible for integrating security expertise into Veracode’s technology. In addition to helping define and prioritize the security feature set of the Veracode service, he consults frequently with customers to discuss and advance their application security initiatives. With over 15 years of experience in application security, Chris brings a wealth of practical expertise to Veracode.

Comments (4)

OWASP | July 21, 2008 9:42 pm

But automation *is* dangerous. People are *relying* on these tools because they've been sold as a cure. But the fact is that they're blind to the majority of application security problems. They can't find most authentication, access control, encryption, or business logic vulnerabilities. And what they do find is riddled with false alarms that take a huge amount of time to diagnose. And even if they do find a real issue, they can't evaluate the business impact of the problem.

* Working on the wrong stuff first is dangerous.
* Wasting critical application security resources is dangerous.
* Operating with a false sense of security is dangerous.

Still, just because a chainsaw is dangerous doesn't mean it's useless. Automated application security tools can be helpful to an experienced security expert. They may be able to provide some data as input to a security review.

CEng | July 21, 2008 10:50 pm

@OWASP (do you really speak for the entire OWASP?):

Most automated tools don't claim to detect authentication, access control, encryption, or business logic vulnerabilities. I don't think any of them claim to be a complete solution.

A human can certainly find more complicated problems, but he isn't going to find all 500 XSS vulnerabilities. He's going to find 5 representative examples before moving on to more interesting attack vectors, and guess what? The developer is only going to fix the 5 examples he was given.
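
To make that concrete, here is a hypothetical example (my own sketch, not drawn from any real application or tool report) of the source-to-sink pattern a scanner flags reliably and at scale: request input echoed into an HTML response with no output encoding. Finding all 500 instances of this is exactly what automation is good at; deciding how much each one matters to the business is not.

```c
/*
 * Hypothetical CGI handler, not from any real application: a classic
 * reflected-XSS sink. Tainted input (QUERY_STRING) flows straight into
 * the HTML response with no output encoding -- the kind of source-to-sink
 * data flow automated analysis can find hundreds of times over.
 */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* e.g. QUERY_STRING = "name=<script>alert(1)</script>" */
    const char *query = getenv("QUERY_STRING");

    printf("Content-Type: text/html\r\n\r\n");
    printf("<html><body>Hello, %s</body></html>\n",
           query ? query : "world");      /* unencoded reflection: XSS */
    return 0;
}
```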

Automated analysis and manual analysis are complementary. Nobody should expect automation to be 100% accurate -- it's a hard problem to solve. But even with imperfections, it's still valuable. Manual testing has different imperfections, not the least of which is that results quality is dependent on the skill of the tester.

The old adage still holds true -- "A fool with a tool is still a fool."

OWASP | July 22, 2008 8:43 pm

Just pointing people to a good reference. Glad to hear you confirm that you don't find issues in these critical security areas. Many tool vendors aren't very realistic about what their tools can't do.

I was probably thrown off by your PCI compliance document, which says, "Additionally, using automated dynamic and binary analysis, Veracode looks for the OWASP top ten, the NIST list as well as many other business best practices."

This seems to be the exception for you all, but a huge number of vendors seem hell-bent on overstating their abilities, claiming to find all violations of the OWASP Top Ten. Claiming to find all 695 issues in CWE might be a record, though.

cwysopal | July 23, 2008 10:39 am

Veracode recommends that high- and very-high-assurance applications (which include the majority of online apps handling PII and financial data) also get a manual review in addition to automated analysis.

You can see this on page 2 of our software rating system datasheet.
