With the New Year unfolding, 'tis the season to be reminded that app security has not yet arrived at the optimal state. Consider this piece from Kaspersky's Threatpost pointing out how re-used third-party libraries perpetuate security holes long after they have been discovered.
For 2017, the industry needs a change in approach. AppSec is certainly getting better, but enterprise security teams can't do this alone. Developers must play a far more active role in securing the applications they build. And when IT helps developers create high-quality secure code, all players benefit through far better security.
Developers, whether they are on the company payroll or working for an independent coding firm, need to be held far more accountable for producing secure code. There needs to be a wide range of incentives for ISVs—and in-house coders—for doing aggressive security testing.
To start, sharing security resources so that developers can easily access code that is already known to be problematic would help a great deal, as would providing coders with tools to proactively search for holes before submitting an app.
It's one thing to be using well-known code that is deemed safe and have a hole discovered months after it's been used. That is bad, but excusable. New vulnerabilities are found all the time and that is why it is important to have an inventory of all the software components development teams use. That way updates can be made when a vulnerability is disclosed. However, it's quite another thing—and this is what the Threatpost piece details—to use components with known holes, holes that would have been discovered with a half-hour of web searching.
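Keeping that component inventory machine-readable makes the update check straightforward. Here is a minimal sketch in Python; the component names, versions, and advisory IDs are all hypothetical, and a real shop would pull advisories from a vulnerability feed rather than a hand-maintained list:

```python
# Minimal sketch of an inventory-vs-advisory check.
# All component names, versions, and advisory IDs below are hypothetical.

# Inventory of the components an app actually ships (name -> version).
inventory = {
    "libimage": "2.1.0",
    "netparse": "0.9.4",
    "jsonkit": "1.3.2",
}

# Advisories as they are disclosed: (name, affected version, advisory ID).
advisories = [
    ("netparse", "0.9.4", "ADV-2017-001"),
    ("libimage", "1.8.0", "ADV-2016-104"),  # older version; not in inventory
]

def flag_vulnerable(inventory, advisories):
    """Return the inventory components that match a disclosed advisory."""
    hits = []
    for name, version, adv_id in advisories:
        if inventory.get(name) == version:
            hits.append((name, version, adv_id))
    return hits

for name, version, adv_id in flag_vulnerable(inventory, advisories):
    print("UPDATE NEEDED: {} {} ({})".format(name, version, adv_id))
```

The point is less the code than the habit: if the inventory exists, matching it against each new disclosure takes minutes instead of an emergency audit.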
Developers need to know that they share responsibility for any code they generate. For far too long, developers have adopted the "let 'em find it" school of security. Coders: Your job is to perform full-fledged due diligence on your scripts before you send them to your client, whether that client is your boss (if you're a corporate coder) or an outside customer. And that includes ensuring the components you use are secure.
This creates a system where all players in the creation landscape have a strong incentive to find holes and to never use code that is known to be vulnerable. This is great news for end-user companies, which end up with more secure offerings.
This is also fine news for developers, as coding shops with robust security-checking operations are more likely to get hired. And this is terrible news for cyberthieves, for whom I offer no sympathy.
And if you need help in making sure your apps are secure—and maybe getting some of those fixes from your developers—we could certainly lend a hand.