The world of industrial control systems has been an island unto itself, but no more. The question now is whether the environment can adapt before real damage is done.
Two weeks ago, I had the privilege to attend The S4 Conference, one of the world’s premier gatherings of experts in the security and integrity of industrial control and SCADA (supervisory control and data acquisition) systems. This is the technology that runs everything from assembly lines to natural gas pipelines to nuclear power plants. I had Dodos on the brain the whole time.
Why Dodos? Well, it strikes me that many organizations that operate industrial control system (ICS) and SCADA software are now in the unenviable position that the flightless Dodos found themselves in back in the 16th century: adapted to thrive in a particular kind of environment – an environment that is experiencing a sudden and unexpected change.
In the case of the Dodo, the change in question was the arrival of human beings – specifically Dutch sailors in the 1590s. Dodos were well adapted to life on Mauritius, where they had no natural predators but plenty of nuts, seeds and bulbs, which they could eat with the help of a large, strong, crooked beak.
Slow, fat, flightless and (by some accounts) pretty tasty, Dodos were easy prey for the sailors who arrived on Mauritius – many fresh off long sea voyages on which food, and especially meat, was in short supply. The last sighting of a Dodo was in 1688, and the birds were almost certainly extinct by the turn of the 18th century.
Something like the arrival of the Dutch sailors has been happening in the world of industrial control systems over the last 10 years. Instead of sailors, however, the invaders are malicious hackers as well as independent security researchers wearing the white (or at least grey) hat.
By 2001, computer security experts were warning publicly of methods for compromising industrial control systems. In 2007, researchers at Idaho National Laboratory (INL) demonstrated how a cyber attack exploiting a ubiquitous vulnerability in the electrical grid, dubbed “Aurora,” could be used to cause physical damage to electrical generators. Video of a generator shaking and then beginning to smoke before stopping made it onto CNN. (http://youtu.be/fJyWngDco3g) The Stuxnet worm, discovered in 2010, became the first modern, weaponized ICS attack.
Behind the scenes, researchers in great numbers were crossing the land bridge from Windows systems and TCP/IP networks to industrial control products like programmable logic controllers (PLCs) and human machine interface (HMI) software. Researchers like Dillon Beresford started to make headlines by exploiting gaping holes in products from Siemens and other vendors – trivial buffer overflows, hidden ‘backdoor’ accounts, weak authentication.
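To make “weak authentication” concrete: many industrial protocols that predate the security era, such as Modbus/TCP, carry no authentication at all. A minimal sketch (illustrative only, not tied to any vendor’s product discussed here) that builds a standard “read holding registers” request shows the entire frame – there is simply no field where a credential could go:

```python
import struct

def modbus_read_request(transaction_id, unit_id, start_addr, quantity):
    """Build a Modbus/TCP 'read holding registers' request (function code 3).

    Note the frame layout: MBAP header plus protocol data unit, and
    nothing else. Any client that can reach TCP port 502 on a device
    speaking this protocol can issue reads (and, with other function
    codes, writes) without presenting credentials of any kind.
    """
    protocol_id = 0    # always 0 for Modbus/TCP
    function_code = 3  # read holding registers
    length = 6         # bytes that follow this field: unit id + PDU
    return struct.pack(
        ">HHHBBHH",
        transaction_id, protocol_id, length,
        unit_id, function_code, start_addr, quantity,
    )

frame = modbus_read_request(transaction_id=1, unit_id=1,
                            start_addr=0, quantity=2)
print(frame.hex())  # 000100000006010300000002
```

The 12-byte frame is the whole request; authentication, when it exists at all in such environments, is bolted on at the network layer rather than designed into the protocol.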
Given the increasing sophistication of protections on Windows and the expertise needed to attack it, industrial control systems were like those fat, flightless birds: easy prey. Soon enough, even point-and-click hacking tools like Metasploit started adding modules specifically to test the security of PLCs and other industrial control equipment.
Conferences like S4 sprang up with the goal of exposing hard-core control systems engineers to the best and brightest in the security researcher community. In recent years, researchers like Billy Rios, Terry McCorkle and Eireann Leverett demonstrated how malicious actors could easily identify vulnerable control systems and compromise them, often from the public Internet.
But the industrial control sector has been slow to adapt, and recent years have seen only modest progress. ICS vendors like Siemens have stood up CERT teams to manage inbound reports of vulnerabilities from the research community. Many vendors are also beefing up protections in newer products, but a long tail of older, already deployed ICS products remains vulnerable. Many are engineered in such a way that updating or patching is impossible.
Dale Peterson, the CEO of Digitalbond, an ICS security consultancy that hosts the S4 Conference, told me that reports of vulnerabilities and exploitable holes in ICS and SCADA systems have dominated the agenda at S4 in recent years, but are now so commonplace that he chose not to make them a fixture of the S4 agenda this year. The critical infrastructure community, he worries, is becoming inured to the warnings.
At this year’s conference, researcher Perry Pederson talked about the reluctance of many utilities to address glaring and well-documented vulnerabilities. Among those left almost entirely unaddressed: Aurora, the vulnerability demonstrated by INL in 2007. The industry – including regulators like the Nuclear Regulatory Commission (NRC) – has evolved to prevent system failures, but hasn’t yet developed the skills needed to prevent system compromises with the same efficiency. “When something fails, you know it,” Pederson told attendees. Compromises, however, are designed to hide from the operator – or even deceive her about the internal state of the equipment she’s monitoring.
What’s to be done? Ralph Langner, the world-famous Stuxnet expert, said that critical infrastructure providers shouldn’t fret about script kiddies or other casual hackers – what he refers to as “hooligans.”
They need not go down the rabbit hole chasing every reported vulnerability in SCADA and ICS products, either. Rather, ICS vendors and critical infrastructure providers should focus on the subtle connections between logical and physical networks in their facilities and watch out for what he called “cyber physical” vulnerabilities – those vulnerabilities that, like Aurora, can be used to degrade a facility or cause physical harm to a system or operating environment.
At the end of the day, industrial control systems aren’t the same as enterprise networks, even if they are vulnerable to the same kinds of attacks. Saving them means putting them out of harm’s way, while also allowing them to do what they were made to do.