After nearly 10 years as a security consultant, I've talked to thousands of developers about remediating security flaws in their code. It's not always an easy conversation, and developers have a wide range of emotional reactions, not all of them good.
The fact is, developers are increasingly responsible for quality assurance and security testing of their code, tasks that didn’t used to be part of their job descriptions. Software development is undergoing a massive shift with the rise of DevOps, and it means big changes for developers.
All change, even positive change, entails loss. Experiencing grief over what is being lost – old habits and routines, established practices and skillsets that have served well until now – is a perfectly natural reaction.
With security remediation, we are not only asking developers to change their code. Developers may experience a changing sense of their own competence, which affects how they see themselves professionally or even personally. This is why empathy is in the best interest of organizations seeking to improve their security posture.
As developers come to terms with the results of a security assessment, they might go through a process similar to the Kübler-Ross stages of grief. The stages of grief (Denial, Anger, Bargaining, Depression and Acceptance) are most often used to describe the emotional reaction to the death of a loved one, but they are a normal psychological response to just about any loss.
As a brief refresher to Psychology 101, the stages of grief are as follows:
1. Denial. The disbelief in reality and preference to believe that things are as they were before.
2. Anger. Resentment, hostility and an intent to assign blame for the change.
3. Bargaining. Seeking compromise to reduce the magnitude of the change.
4. Depression. Loss of motivation, focus, clarity and direction. Hopelessness.
5. Acceptance. Regaining hope and direction, and a motivation to move forward.
Being presented with a security assessment report is not a traumatic life event, but you should remember that developers may react as if they were experiencing a loss – maybe as a curtailment of their independence, or a forced reconsideration of their own skill.
Below, I'll explore how developers might react, and walk through some ideas for responding to those reactions at each stage of the grief process.
In my conversations with developers, I've found that denial is a common response to a security assessment, especially in organizations where the AppSec program is not yet well-established.
Developer reaction: A developer may demand proof that findings are exploitable, and even blatantly exploitable flaws may be dismissed as unlikely to occur, or as an acceptable consequence of some requirement or expectation about how the application is to be used.
AppSec response: Many security defects may not be blatantly or even easily exploitable – and while this may be a reason to address them with less than critical urgency, such findings should not be dismissed out of hand. By the time security issues become provably exploitable, it is likely too late to prevent them from being exploited.
Developer reaction: I once had a frustrated developer tell me that the findings were not actionable because "this code has been in production for 15 years, and we've never been hacked."
AppSec response: Just because a vulnerability hasn't been exploited doesn't mean it won't be. Likewise, just because you've never been in a car crash doesn't mean you can go without a seatbelt!
People cannot be forced out of denial, and pushing too hard can lead to embarrassment and lingering resentment. My experience has taught me that the productive path forward is to provide simple examples that demonstrate the concepts underlying the identified issues.
Everyone gets angry sometimes, but in the process of grief, anger is usually about displacement and assigning responsibility for the problem to someone else. Developers faced with an unfavorable security assessment may be prone to explaining away why identified security issues are not within their sphere of responsibility.
Developer reaction: Developers may feel justified in blaming others for bad code in a defective or insecure component, arguing that the responsibility rests with the author of the defective library, not with the developer whose code depends on it.
Or a developer might argue that rare and unexpected misuse of an application – after all, hackers do not abide by intended use when attacking our systems – cannot be addressed at the coding stage.
In some cases, developers may question the very effort of security testing as busywork, or as some executive's pet project, and thereby disown responsibility for ensuring the security of the software they develop. This is again a normal response to the stress posed by the assessment and it is a temporary state of mind.
AppSec response: It's important to provide developers with compassionate but clear guidance about how the security assessment process connects to organizational goals, so that it is not seen as arbitrary. Explain why dependencies and other authors' code affect the security of their own product, and why precisely how the application responds to unexpected usage and data is vital to securing it.
In the next blog post, I'll discuss the bargaining stage and how to help developers move into the final stage – acceptance – to understand why application security testing is fundamental to doing their jobs well, a goal security shares.
Read the first two parts of this blog series: AppSec Managers Should Have Empathy for Developers and How to Help Developers Accept and Embrace Security Testing