Popular Mechanics recently published an article about the NSA Red Team, which caught my interest since I was part of that organization for a short stint back in early 2000. The article does a decent job of describing the Red Team's charter, which is essentially to attack DOD targets in an attempt to simulate real adversaries, not unlike a consultant running a pen test against a corporation. The rules of engagement are similar to those of most pen tests: don't DoS the target, don't install malware, and generally be non-destructive. Disappointingly, the author sprinkles the usual super-secret uber-hacker spin throughout the article to make the Red Team seem mysterious and exclusive, with untouchable talent. It's a little misleading. For starters, there's the predictable question about success rates:

I’d heard from one of the Department of Defense clients who had previously worked with the NSA red team that OWNSAVAOG and his team had a success rate of close to 100 percent. “We don’t keep statistics on that,” OWNSAVAOG insisted when I pressed him on an internal measuring stick.

This is one of those statements that is difficult for the average reader to interpret. It's intended to make the team sound like a crack squad of hackers, but in reality it's the same statistic that every security consultancy cites during sales calls. The truth is, there's a lot of wiggle room in what counts as "getting in" to the target. For example, some would say that brute forcing an FTP server and downloading some FOUO (For Official Use Only) documents constitutes penetrating the target. Others would disagree. How about personnel? I thought this was an enlightening and accurate statement from the unnamed NSA source:

And like any good geek at a desk talking to a guy with a really cool job, I wondered just where the NSA finds the members of its superhacker squad. “The bulk is military personnel, civilian government employees and a small cadre of contractors,” OWNSAVAOG says. The military guys mainly conduct the ops (the actual breaking and entering stuff), while the civilians and contractors mainly write code to support their endeavors. For those of you looking for a gig in the ultrasecret world of red teaming, this top hacker says the ideal profile is someone with “technical skills, an adversarial mind-set, perseverance and imagination.”

He basically admits that the team consists mostly of people who "run the tools" and only a handful who actually write the tools or do anything cutting-edge. It shouldn't be that surprising; just as in any large consulting organization, you have some people who run scanners/tools and aren't expected to be terribly analytical. While the Red Team almost certainly has some superstars, on the whole it is similar in both skillset and composition to a typical consultancy or enterprise security team.

In terms of attracting and retaining top talent, the Red Team faces the same challenges as the rest of the information security industry, with the built-in disadvantage of the government pay scale. If that weren't bad enough, they also have to compete with themselves (i.e., the rest of the NSA) for already scarce resources. Given these challenges, how could one realistically expect the Red Team to be as advanced as the article portrays?

Finally, let's dispel the "super-secret" notion -- unless things have changed significantly, the majority of Red Team operations are unclassified. Granted, detailed information is guarded, but you can find reports summarizing past operations if you dig around a bit. One would expect that an operation intended to be truly secretive would never make its way into Google search results.

I want to conclude by saying that this post is not intended to cast the Red Team itself in a negative light. I enjoyed my time there and had the opportunity to work with some smart people. The Red Team's goals are worthy and noble; clearly, state-sponsored cyberterrorism is a growing concern, and as a country we should be as prepared as possible. But realize that we have a long way to go.


Chris Eng, vice president of research, is responsible for integrating security expertise into Veracode’s technology. In addition to helping define and prioritize the security feature set of the Veracode service, he consults frequently with customers to discuss and advance their application security initiatives. With over 15 years of experience in application security, Chris brings a wealth of practical expertise to Veracode.
