Creating a new software application is like baking the perfect pie: Every company has its own recipe that includes "secret" in-house code but uses common, third-party ingredients where applicable. But what happens if ingredients in your latest batch are bad? CA Veracode's software composition analysis service recently determined that external components embed an average of 24 known vulnerabilities into every web application. So how can companies keep their software from poisoning critical systems?
If third-party or open source vendors already make the component you're looking for, why waste resources recreating it? This line of reasoning has become industry best practice: According to analysts, 95 percent of all IT organizations will use some type of open source software in mission-critical applications by 2015. And according to Dark Reading, it's not out of the ordinary for more than 90 percent of a given application to consist of third-party and open source components. It makes sense: developers have produced stable, streamlined code that has been used in web applications for years, if not decades, and one of the hallmark benefits of readily available cloud computing solutions is the ability to leverage these components for common compute functions.
But these external components aren't designed for use in a specific industry or to meet any particular compliance standard, meaning they don't undergo the same kind of rigorous testing as in-house apps. The result is "an average of eight 'very high severity' or 'high severity' vulnerabilities per application caused by open source and third-party components," according to CA Veracode's VP of Enterprise Security Strategy Phil Neray. Ultimately, the struggle between speedy development and sustainable defense puts companies in a tough spot where outsourcing is dangerous, but necessary.
Industry groups like OWASP, PCI and FS-ISAC have taken a hard line against these kinds of vulnerabilities: each requires explicit, established policies and controls to govern the use of all components. But expectations don't always match reality: For many enterprises, the sheer volume of applications and components used on a daily basis makes it impossible to determine where a risky component has been used and what level of risk it presents to corporate and consumer data. Worst-case scenario? Think Heartbleed, followed by a loss of consumer confidence and possible sanctions for negligence.
To address this issue, CA Veracode's software composition analysis integrates with existing binary static application security testing (SAST) and dynamic application security testing (DAST) services in the cloud. This service automatically inventories all third-party and open source components used by all applications in a corporate network and then identifies any publicly known vulnerabilities. In addition, the solution provides version information for each component — when combined with SAST, DAST and vendor application security testing (VAST), companies can both discover the extent of their application risk and take steps to mitigate a breach.
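The core idea behind that kind of composition analysis can be sketched in a few lines: inventory the components an application depends on, then flag any whose version predates a published fix. This is a minimal illustration, not Veracode's implementation; the component inventory, advisory records, and version-parsing rules below are hypothetical.

```python
# Minimal sketch of software composition analysis: compare an
# application's declared components against a feed of known
# vulnerabilities and flag versions that predate the fix.
# All data below is illustrative, not a real advisory feed.

from dataclasses import dataclass


@dataclass(frozen=True)
class Vulnerability:
    component: str
    fixed_in: tuple   # first version containing the fix, as a comparable tuple
    advisory: str


def parse_version(text):
    """Turn a version string like '1.0.1f' into a comparable tuple."""
    parts = []
    for chunk in text.split("."):
        digits = "".join(c for c in chunk if c.isdigit())
        letters = chunk[len(digits):]
        parts.append((int(digits or 0), letters))
    return tuple(parts)


def audit(inventory, advisories):
    """Return (component, advisory) pairs where the installed version
    is older than the first fixed version."""
    findings = []
    for name, version in inventory.items():
        for vuln in advisories:
            if vuln.component == name and parse_version(version) < vuln.fixed_in:
                findings.append((name, vuln.advisory))
    return findings


# Hypothetical application inventory and advisory feed.
inventory = {"openssl": "1.0.1f", "libxml2": "2.9.4"}
advisories = [
    Vulnerability("openssl", parse_version("1.0.1g"),
                  "Heartbleed (CVE-2014-0160)"),
]

print(audit(inventory, advisories))
# → [('openssl', 'Heartbleed (CVE-2014-0160)')]
```

Real-world tools handle far messier version schemes and pull advisories from curated databases, but the matching step is conceptually this simple, which is why keeping an accurate component inventory is the hard part.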
Outsourced code is necessary to avoid IT budget overruns and stay competitive, but companies can't turn a blind eye to public vulnerabilities. As one bad apple spoils the batch, one bad component spoils the app.
Photo Source: Wikimedia Commons