How to Help Developers Accept and Embrace Security Testing

By Jim Jastrzebski
February 14, 2017

In previous posts in this blog series, I've explained that AppSec teams should have empathy for developers as they go through the stages of grief after an unfavorable security assessment of their code. In this post, we wrap up by discussing how to get developers to move through the final two stages – from bargaining to acceptance.

Bargaining: "We have a firewall that handles this."

The bargaining stage of grief is almost where we want to be. Bargaining is a negotiation where "good enough" security may well be the goal. It's true that there's no such thing as perfectly secure software, but all software should be at least "secure enough" for the situation.

The risk in stopping at the bargaining stage is that developers are still focusing on doing only what is absolutely required, not necessarily what is right. Risk reduction is the reality of application security, but developers must understand that it's essential to reduce risk to a deliberately determined acceptable level.

Acceptable risk is not defined by which code changes are straightforward and convenient to make. Clear standards, policies and expectations of "good enough" security are a must. Otherwise, developers may be tempted to deem elusive, vague or hard-to-remediate issues as acceptable, or sweep them under the carpet with claims of sophisticated compensating controls that may or may not actually provide the necessary protection.

I've worked with many developers who spent hours documenting why certain issues did not require a code change. Although the changes themselves are not always difficult to make, as an AppSec manager you should understand the reasons developers might make these excuses, such as:

- Developers may lack the skillset to make the changes necessary to fix the issue

- Changes may be awkward and inelegant compensations for an underlying design issue

- The administrative process required to make these changes could be so heavyweight that developers would prefer anything other than enduring it

- Some security findings may already be protected against, whether in the code itself (in ways security assessments may be unable to appreciate) or in the runtime environment

An important caution in this area is that very often the developers will be absolutely right. Applications known to run with reduced privileges, for example, will not be allowed to access restricted areas of the file system by the operating system itself. So long as this is consistently enforced, it may be "good enough."
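As a minimal illustration of this kind of OS-level compensating control, consider the sketch below. The path used is a stand-in example (on most Unix systems, /etc/shadow is readable only by root); the point is that the operating system denies access regardless of what the application code attempts.

```python
import os

# Example restricted path; on most Unix systems this file is readable
# only by root, so an unprivileged process cannot open it.
RESTRICTED_PATH = "/etc/shadow"

def try_read(path):
    """Attempt to read a file, reporting whether the OS allowed it."""
    try:
        with open(path) as f:
            f.read()
        return "read succeeded"
    except PermissionError:
        # The operating system enforces the restriction no matter what
        # the application code does -- a compensating control that
        # lives outside the code itself.
        return "permission denied by the OS"

if __name__ == "__main__":
    print(try_read(RESTRICTED_PATH))
```

The catch, as noted above, is that this protection holds only as long as the reduced-privilege configuration is consistently enforced in every environment where the application runs.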

It is the security team's responsibility to know the organization's tolerance for risk, as well as each application's configuration parameters and how those are assured, when weighing a developer's proposed mitigations and deciding whether they are acceptable. The acceptability of compensating controls demands considerable scrutiny and rigor; it is just as common for security teams to push back on developers as it is for them to be over-permissive and accept risks that have not been sufficiently addressed.

Getting to Acceptance

Security is hard, so try to understand that from a developer's perspective. But you should help developers understand that security is not a stage of development that comes after implementation; it is an intrinsic part of implementation itself.

Inadequate application security is too often identified in testing or, worse still, in production. Security needs to be considered in the requirements and design stages of development, too – whether you follow the classical Waterfall, Agile or DevOps development process.

You should also consider that if you make security the exclusive problem of the developer, it's reasonable for the developer to resist this sudden and unexpected additional responsibility. Any organization that wishes to take application security seriously must recognize that everyone shares responsibility for making security a priority.

Developers are too often blamed for failing to retrofit security into a product that did not include security considerations from the outset. Organizations that take security seriously must provide clear guidance, standards and policies for their developers around what risk tolerances constitute "good enough" security.

Because security is hard, developers need to be given the training and processes to facilitate implementation of secure applications. Otherwise, developers will, rightly one might argue, resist responsibilities not shared by the rest of the organization.

What Acceptance Looks Like

When you provide clear direction to developers with empathy, the response you'll get is, "Thanks for this information. I know what to do and why, and I'm ready to get going!"

This is the state of mind of developers who buy into the purpose of security testing, the goals of the security program, and the value of assessment results. They'll have an understanding of why the results matter, what risk they pose, and in what order to address them.

Acceptance comes with time, and it comes more quickly and easily with iterations through the entire process. Regular security testing and remediation cycles, paired with achievably higher standards over time, instill confidence. Developers synthesize new skills into an effortless competence in which security no longer requires conscious, deliberate effort, because secure programming practices have become habit.

This is where we want to end up. Even as the security landscape changes, developers can fluently incorporate new security requirements into their day-to-day work, without the cognitive dissonance they experienced in the earlier stages of the grief process.

Read the first two parts of this blog series: AppSec Managers Should Have Empathy for Developers and A Developer's Stages of Grief After a Failed Security Assessment.

Jim has been an application security practitioner for about 10 years and now manages the Application Security Consulting group at Veracode. He holds a postgraduate degree in computer science from RPI, with a specialization in software engineering. Prior to joining Veracode, Jim developed software for consumer broadband, nuclear power generation SCADA systems, and multimedia content delivery for mobile devices.