HP released a new tool called Scrawlr yesterday that can be used to identify certain types of SQL Injection vulnerabilities in a website. It was a joint effort with Microsoft and a direct response to the mass SQL Injection attacks of late.
Scrawlr quickly came under fire on the Web Security mailing list for having some pretty major limitations. Billy Hoffman et al have been quick to point out that the tool was designed to address a very specific subset of SQL Injection vulnerability -- the type affected by the mass attacks -- and is not designed to be a general purpose replacement for existing SQL Injection scanners. Let's look at the limitations, as outlined on the HP page, one by one.
Limitation: Will only crawl up to 1500 pages
Depends on what they mean by 1500 pages. For example, if my front page links to the same script three times with different query-string parameters, is that one page or three?
Or, does it mean that it will really only crawl 1500 pages total, so if I have the same link 1500 times on the front page, it won't go any further? Either way, for most smaller websites this is probably fine. If you need more than 1500 you could give it different starting URLs in an attempt to improve coverage. It would be nice to have a clearer definition of what it means to "crawl up to 1500 pages" though.
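The ambiguity boils down to how the crawler deduplicates URLs. Here's a minimal sketch (the `example.com` URLs are hypothetical, just to illustrate the two interpretations): counting distinct full URLs gives one answer, counting distinct paths gives another.

```python
from urllib.parse import urlparse

# Three hypothetical links to the same script with different parameters.
links = [
    "http://example.com/products.asp?id=1",
    "http://example.com/products.asp?id=2",
    "http://example.com/products.asp?id=3",
]

# Interpretation 1: dedupe on the full URL -- three distinct "pages".
full_urls = {link for link in links}

# Interpretation 2: dedupe on the path only -- one "page".
paths = {urlparse(link).path for link in links}

print(len(full_urls), len(paths))  # 3 1
```

Which interpretation Scrawlr uses matters a lot for coverage: under the first, a large catalog site could burn its whole 1500-page budget on one script.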
Limitation: Does not support sites requiring authentication
Well, this will render it useless for the majority of enterprise apps. But there are still a lot of sites out there that don't require authentication, including some of the ones that got hit during the mass attacks, such as the United Nations, UK government, etc.
[Update 06/26: Mike Tracy investigates further and provides a workaround that'll work for the majority of sites that use cookie-based auth]
Limitation: Does not perform Blind SQL injection
They have taken a lot of flak for this, but Billy describes it as a conscious choice:
An early version of the tool checked for blind SQL injection, but the final version of Scrawlr did not. ... The biggest feedback we got from early testing was developers wanted to "see" the vulnerability. Differential analysis is kind of difficult to visualize in a way that is helpful for the average dev, and pulling the table names through blind was too much of a performance issue.
I can sort of understand this rationale. Blind SQL Injection testing is much more susceptible to false positives. As users of any commercial web scanner or source code analyzer will attest, the more time you spend chasing down FPs, the less faith you put in future results. It'd be nice if there were a way to toggle Blind SQL Injection testing on and off, though (it could be off by default so nobody gets confused).
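To see why the differential analysis Billy mentions is prone to false positives, here's a minimal sketch of a boolean-based blind check. The idea (a common technique, not necessarily what Scrawlr's early builds did) is to fetch the page with an always-true condition and an always-false condition appended to a parameter, then compare the responses against the baseline:

```python
def looks_blind_injectable(baseline: str, true_page: str, false_page: str) -> bool:
    """Boolean-based differential check.

    baseline:   response body for the untouched parameter value
    true_page:  response with an always-true condition appended (e.g. AND 1=1)
    false_page: response with an always-false condition appended (e.g. AND 1=2)
    """
    # If the always-true condition leaves the page unchanged while the
    # always-false condition changes it, the parameter may be injectable.
    return true_page == baseline and false_page != baseline

# A page whose content shifts only under the false condition looks injectable:
print(looks_blind_injectable("10 rows", "10 rows", "0 rows"))   # True
# A page that never changes does not:
print(looks_blind_injectable("10 rows", "10 rows", "10 rows"))  # False
```

The catch is that real pages contain timestamps, rotating ads, and session tokens, so naive string comparison flags "differences" that have nothing to do with injection. That noise is exactly where the FPs come from, and why there's nothing obvious to "show" a developer.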
Limitation: Cannot retrieve database contents
Who cares? Find and fix the vulnerability. Pulling down the entire database "because you can" is a total ego move.
Limitation: Will not test forms for SQL Injection (POST Parameters)
This is probably the toughest one to swallow. It's not that difficult to parse forms out of HTML, and form POSTs can represent a major chunk of the attack surface. Granted, the Chinese tool associated with the mass attacks did operate solely on GET requests (i.e. parameters in the query string), so HP can again defend this by saying the tool is really aimed at the sites being targeted by the mass attacks. I think it's a little short-sighted though; chances are that the mass attacks will evolve, and it's better to be proactive than reactive.
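To back up the claim that form parsing isn't hard, here's a rough sketch using nothing but the standard library. It pulls each form's action, method, and named inputs out of a page -- exactly the information a scanner needs to build POST test requests (the `/login.asp` form is a made-up example):

```python
from html.parser import HTMLParser

class FormExtractor(HTMLParser):
    """Collect each form's action/method and the names of its inputs."""

    def __init__(self):
        super().__init__()
        self.forms = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self._current = {
                "action": attrs.get("action", ""),
                "method": attrs.get("method", "get").lower(),
                "inputs": [],
            }
            self.forms.append(self._current)
        elif tag == "input" and self._current is not None:
            name = attrs.get("name")
            if name:  # unnamed inputs (e.g. bare submit buttons) are skipped
                self._current["inputs"].append(name)

    def handle_endtag(self, tag):
        if tag == "form":
            self._current = None

page = """
<form action="/login.asp" method="post">
  <input name="user"><input name="pass">
  <input type="submit" value="Go">
</form>
"""
parser = FormExtractor()
parser.feed(page)
print(parser.forms)
# [{'action': '/login.asp', 'method': 'post', 'inputs': ['user', 'pass']}]
```

A scanner would then inject its payloads into each named input and POST to the form's action, the same way it mangles query-string parameters. That this fits in a few dozen lines is why the omission feels like a product decision rather than a technical one.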
It's tough to bash someone for releasing a free tool. I personally think HP should add an option for enabling Blind SQL Injection testing, and that they should consider supporting POSTs as well as GETs. You're basically getting a (massively) stripped-down WebInspect for free, so take it for what it is. No single tool is a panacea.
The jury is still out on how effective Scrawlr is against the things it does claim support for. Keep watching the Web Security list; the reviews are filtering in.