Back in the 1960s, there was a sea change in the way people talked about the age-old problem of poverty. In writings by the anthropologist Oscar Lewis and others, the notion of a “culture of poverty” emerged - the idea that poverty isn’t merely the condition of lacking resources, but a destructive kind of subculture that can be self-perpetuating. I wonder if something similar could be said about software insecurity.
The thought came to me during a conversation with two noted security researchers: Billy Rios and Terry McCorkle of Spearpoint Security Services. The two men are among the top experts in the security of industrial control and SCADA systems. Their current project, however, is looking at software security issues with medical devices.
Following an approach that has proven effective, Rios and McCorkle spent the past few months buying secondhand medical devices on eBay and then analyzing them in their home laboratories. Rios says his home office is filled with funky, used hospital equipment and that his wife is ready to kick him and his gear to the curb.
In the process, Rios and McCorkle started coming across some familiar names. They quickly realized that “most of the same vendors in the ICS world are also medical device and software vendors.” That includes multinationals like Philips, Siemens and Rockwell. Could the same software problems that the two men had discovered in these vendors’ ICS products also lurk in their medical device products? You bet!
“We literally took the fuzzer we developed for ICS software and launched it against this medical device and 10 minutes into it we found a heap overflow,” Rios told me.
The same development shops that give us balky Human-Machine Interface (HMI) and Engineering Workstation (EWS) products are also churning out software to manage life-saving medical devices in hospitals around the world - software that suffers from many of the same security ills.
“It’s not only bad programming and software practices - all the heap overflows and directory traversals that we see in ICS systems,” McCorkle said. The bigger issue is shoddy design practices. “The industry just hasn’t evolved,” he said.
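Directory traversal, one of the bug classes McCorkle mentions, is worth a quick illustration. It typically arises when user-supplied input is joined onto a base path without validation, letting a request like `../../etc/passwd` walk out of the intended directory. The sketch below is a generic, hypothetical example - not code from any of the products discussed - contrasting the naive pattern with a checked one:

```python
import os.path

def resolve_insecure(doc_root: str, requested: str) -> str:
    # Classic flaw: user input is joined straight onto the root,
    # so "../../etc/passwd" escapes doc_root entirely.
    return os.path.join(doc_root, requested)

def resolve_checked(doc_root: str, requested: str) -> str:
    root = os.path.abspath(doc_root)
    full = os.path.abspath(os.path.join(root, requested))
    # Reject any resolved path that does not stay inside the root.
    if os.path.commonpath([root, full]) != root:
        raise ValueError("directory traversal blocked")
    return full
```

The fix isn’t exotic - normalize, then compare against the root - which is part of why researchers treat this bug class as a sign of an immature development process rather than a hard problem.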
Some of the problems the team discovered are simple security flaws that are patchable, Rios said. But often the problems are bigger - and won’t be easily solved.
“So, in some cases we can say ‘you guys are just bad at security implementation and your coding practices aren’t good.’ They can get their arms around that, but there’s just so much software that it will take time.” In other cases, however, it’s not clear that any amount of quality assurance and code auditing will help. “It’s just from the ground up,” Rios said. “The foundation is so poor that you’re going to have to start over.”
The two men say the problem - like poverty - is complex, but it boils down - at least in part - to an industry and corporate culture in which security just doesn’t matter.
“It’s one thing if a guy spends a year looking for a bug in your product and writes a custom fuzzer and the research is so unique you get to present it at some prestigious conference,” Rios said. “It’s another thing to take a three-line Perl script we wrote to own industrial control systems that just looks for ports and throws garbage at them and find a bug in five minutes.”
“It’s a matter of correctness,” he said. “It’s not built correctly. It’s not robust. It’s a big problem for these guys. They’ve got to get their arms around it.”
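Rios’s actual Perl script isn’t published, but the kind of “dumb” fuzzer he describes - connect to an open port, throw random bytes at it, watch for the service to stop answering - can be sketched in a few lines. This is a hypothetical Python rendering under those assumptions, and something you should only ever point at equipment you own:

```python
import random
import socket

def random_payload(length: int = 512) -> bytes:
    """Return `length` random bytes -- the 'garbage' thrown at a service."""
    return bytes(random.randrange(256) for _ in range(length))

def fuzz_port(host: str, port: int, rounds: int = 100) -> int:
    """Send random payloads at host:port. Return the round on which the
    connection failed (a possible crash), or -1 if every round survived."""
    for i in range(rounds):
        try:
            with socket.create_connection((host, port), timeout=2.0) as conn:
                conn.sendall(random_payload())
        except OSError:
            return i  # service stopped answering -- worth investigating
    return -1
```

That software shipping in medical devices can be knocked over by something this crude - no protocol awareness, no grammar, just noise - is the researchers’ point about correctness.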
“I’m looking on these (medical device) websites and there’s nothing on them about being secure, secure design, security - nothing,” McCorkle said. Just as bad: there’s little effort to educate customers about how to securely deploy the products in their environment. “The vendors aren’t going out and saying ‘oh, you need to have a routine maintenance cycle for these products,’” such as firmware updates or patching, McCorkle said.
Finally, researchers who find critical bugs in these medical devices will be hard-pressed to find anyone at the manufacturer to address the issue - there’s no outreach to the security community, nor even easy-to-find contact information for reporting security problems with hardware and software. “I have these issues but I have no idea who to go to. I guess I could call customer service...?” McCorkle said.
That kind of laissez-faire attitude might have been forgivable in the 1990s, when the tools, methods and techniques available to exploit weak software were much less advanced. In 2013, however, it’s a recipe for disaster, the researchers say.
“You’re talking about five minutes to find a heap overflow, a couple hours to write an exploit for it. You have a Metasploit module ready to go in a day,” Rios said. “If you’re behind in today’s world, you’re going to get clobbered.”
So what’s the solution? The War on Poverty was a multi-front effort to use federal money and power to combat poverty. True, it wasn’t one in which the good guys got to declare victory, but it did greatly reduce poverty in the U.S. through programs like Medicare, Medicaid, Head Start and VISTA. What would a similar effort to combat insecure software in the medical field (as well as in critical infrastructure) look like? Many of these firms need to rethink the way they produce software from the ground up: bringing security front and center, rather than sweeping it under the rug. In many cases, the two researchers say, software will need to be totally rewritten to make it secure. Beyond that, firms that make medical device software need to reach out to the community of security researchers: soliciting their feedback and tapping their expertise during design, development and testing, and giving them a place to go when they find problems.
As with poverty, solutions will only come once people look at the problem straight on and stop pretending that it doesn’t exist - or wishing it away. Hopefully, work like that of Rios and McCorkle in the coming months and years will force some of these firms to do just that.