Jul 25, 2019

What Appsec Training Gets Wrong

By Jasmine Webb

Rates of vulnerabilities in applications have been stagnant for the past few years: the same categories come up year after year, at similar rates, across the industry. Why does it seem like no progress is being made here? Are we at peak security, with every software engineer already writing the most secure code they can? I sure hope not.

Despite all the hype of DevSecOps, which surveys say is effective in some aspects, programmers still lag behind in bug prevention. Either developers aren’t being trained to write secure code, or their training isn’t working.

Many AppSec training programs try to teach programmers to think like hackers. Generally, that means knowing how to look at an application and see how one might exploit it — but there’s one more thing about hackers that programmers could learn from.

The Hacker’s Paranoia


In 2016, Mark Zuckerberg was photographed with tape over his webcam.

Those MacBooks have little indicator LEDs that are supposedly impossible to turn off, so why bother with the tape? Since then, researchers have figured out how to disable MacBook LEDs. Maybe Zuck knew something we didn’t, but more likely he knew not to take any advice that sounded like “____ is impossible to hack.”

How hackers actually think

Hackers are a paranoid bunch. At security meetups, there are always a few people who refuse to connect to the wifi. They’re like that because they either know exactly how unsafe that is, or they know that they *don’t* know it’s safe. Security professionals are acutely aware of how flimsy the digital world is.

The opposite of the hacker’s paranoia is the layperson’s optimism. When you don’t know that most security is a disaster, or when you think that certain products are un-hackable, it seems reasonable to take a relaxed attitude towards security.

“I’m just a regular guy so I can re-use passwords. Hackers won’t bother to go after me.”

The Dunning-Kruger effect in tech

Developers who have little to no experience with security are sometimes even more optimistic than laypeople, probably because of the Dunning-Kruger effect. Hackers are much less likely to assume systems are safe: experts know the limits of their knowledge.


From Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments by Justin Kruger and David Dunning

Developers, on the other hand, can be overconfident: we’re not security experts, but we think we are. I once received advice from another developer that “if you use React, you don’t have to worry about cross-site scripting.” That’s not always true, but it’s easy to see how someone might believe it. One React tutorial for beginners says that “React is safe. We are not generating HTML strings so XSS protection is the default.”
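To see why “React is safe by default” is only partly true, consider that escaping text content doesn’t cover every sink. The sketch below doesn’t use React itself — it just imitates text-node escaping to show how an attacker-controlled URL slips through an attribute context untouched (`renderLink` is an invented illustrative function, not any framework’s API):

```javascript
// Illustrative sketch, not React internals: text-node escaping alone
// does not protect an href attribute from a javascript: URL.
function renderLink(userUrl, label) {
  // Escape the label the way a framework's text-node escaping would.
  const escaped = label
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
  // The URL lands in an attribute context the escaping never touches.
  return `<a href="${userUrl}">${escaped}</a>`;
}

const html = renderLink("javascript:alert(document.cookie)", "<b>click</b>");
// The label's HTML is neutralized, but the javascript: payload survives
// intact and would run when the link is clicked.
```

The same gap shows up in real React code through `dangerouslySetInnerHTML` or user-supplied `href` values — auto-escaping covers text children, not every place untrusted data can flow.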

How not to teach AppSec

Application developers should know the common pitfalls of the technologies they work with. Most security training today (“think like a hacker!”) consists of yearly workshops teaching broad concepts. Best-practice advice is often oversimplified and hand-waved. This gives programmers a false sense that they know more than they do and would never fall into common traps.

We already think we won’t write bad code. We think we’re security experts. Contrived and oversimplified examples like “Don’t send SQL directly to the database, just use the ORM and you’ll be safe” reinforce the notion that it’s other people who write vulnerable applications. After all, if I’m using Django and essentially have to use the ORM, I’m definitely never going to have to worry about SQL injection.
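The problem with “just use the ORM” is that every ORM has an escape hatch, and injection comes right back the moment input is concatenated into a raw query. A minimal sketch (the function names are invented for illustration, not a real ORM API):

```javascript
// Illustrative sketch: concatenation vs. parameterization.
function findUserUnsafe(name) {
  // String interpolation: attacker input becomes part of the SQL grammar.
  return `SELECT * FROM users WHERE name = '${name}'`;
}

function findUserSafe(name) {
  // Parameterized form: the driver keeps data separate from query text.
  return { text: "SELECT * FROM users WHERE name = $1", values: [name] };
}

const payload = "x' OR '1'='1";
const unsafe = findUserUnsafe(payload);
// unsafe now reads: SELECT * FROM users WHERE name = 'x' OR '1'='1'
// and matches every row, while the safe form passes the whole payload
// through as a single string value.
const safe = findUserSafe(payload);
```

Real ORMs expose exactly this hatch — Django’s `raw()` and `cursor.execute()`, for instance — which is why “the ORM makes you safe” only holds until someone reaches for it with unsanitized input.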

And yet these same vulnerabilities (XSS, SQLi, broken access controls) pop up year after year with no sign of decreasing.

It isn’t hard to write insecure code in new technologies, but how you might do so accidentally varies from language to language and framework to framework. For example, training shouldn’t be teaching NodeJS developers to worry about low-level buffer overflows before it teaches them to find and handle insecure dependencies.
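Dependency checking is conceptually simple: compare what’s installed against an advisory database. A toy sketch of that idea — all package names, versions, and advisory data below are invented for illustration; in practice a NodeJS team would run something like `npm audit` against a live advisory feed:

```javascript
// Illustrative sketch: flag installed packages whose versions fall
// below an advisory's fixed version. All data here is made up.
const installed = { "example-parser": "1.1.2", "example-logger": "2.4.0" };
const advisories = [
  { name: "example-parser", fixedIn: "1.2.0" }, // versions < 1.2.0 affected
];

// Compare two dotted version strings numerically, segment by segment.
function lessThan(a, b) {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const x = pa[i] || 0;
    const y = pb[i] || 0;
    if (x !== y) return x < y;
  }
  return false;
}

const flagged = advisories
  .filter((adv) => installed[adv.name] && lessThan(installed[adv.name], adv.fixedIn))
  .map((adv) => adv.name);
// flagged contains only "example-parser": 1.1.2 is below the 1.2.0 fix.
```

The point isn’t the tooling — it’s that for a NodeJS developer, an outdated transitive dependency is a far more realistic risk than a buffer overflow, and training should be prioritized accordingly.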

Some training programs include frameworks like Metasploit. These are cool and fun, but not very useful for application developers: you don’t need to be a pentester to write secure code, and scanning for vulnerabilities is less useful than understanding how they happen.

A better approach is to teach language- and platform-specific pitfalls. Application security is a huge field, and it doesn’t make sense to try to cover all of it in a one-day training.

Security awareness is key. What to worry about, how much, and when can only be picked up little by little through exposure.

Effective AppSec training

At the end of the day, companies should be responsible for training developers on secure coding practices for their particular stack.

To instill a healthy security awareness, AppSec training for developers needs to be ongoing, not yearly. It should also be language-specific as often as possible.

At many companies, juniors learn from more senior engineers through code reviews and mentoring. Some companies have security departments that can also make themselves available to talk about good AppSec practices with security office hours, further supplementing self-paced training.

Hackers are careful because they have security awareness developed over years of exposure. Developers can get that exposure too, but not from lectures and slides on broad and abstract categories of vulnerabilities. Knowing what to worry about, how much, and when, is hard-learned over multiple exposures — not in the span of a single two-hour workshop.

If you’re interested in AppSec training for developers, check out Security Labs, which offers hands-on training with practical real-world examples and interactive scenarios to better prepare developers. 

By Jasmine Webb

Jasmine Webb is a developer and application security researcher. She came to Veracode from Hunter2 and now works on Security Labs researching the prominence of common vulnerabilities and creating proof of concept applications. Jasmine is passionate about promoting secure coding practices.