Discovering vulnerabilities is an essential part of effective security testing; companies pay good money for services that accomplish this goal with rigor and precision. Many enterprises now offer "bug bounties" to encourage white-hat hackers to deliberately penetrate systems and then report the results. One such independent security researcher is Chris Roberts, a cybersecurity consultant who's getting airtime because he has apparently hacked the entertainment, navigation and flight systems of several airplanes. His work raises an important question: When does creating a proof of concept for security threats cross the line and become just another form of attack?
According to CNN, Roberts was recently detained in Syracuse, New York, after the FBI was alerted to tweets he made indicating that he'd hacked his airplane while in flight. During FBI interviews, the security researcher said he'd hacked into plane control systems "15 to 20 times from 2011 to 2014." Roberts says he knows of vulnerabilities in three models of Boeing aircraft and one Airbus model that allowed him access to each plane's "nerve center."
His method? Using a modified Ethernet cable to connect his laptop to an under-seat electronic box linked to a plane's entertainment system, he was able to reach more critical systems such as the flight controls. In one instance, he claims to have issued a climb, or "CLB," command to one of the plane's engines, causing the aircraft to move laterally without any pilot input. The FBI states that thumb drives in Roberts's possession contained "nasty" malware capable of compromising computer networks, but a statement from Boeing says there's no way Roberts's hacking got off the ground, since in-flight entertainment systems and more critical controls are completely isolated. Wherever the truth lies, however, it's clear that at least some airplanes come with worrisome vulnerabilities.
Roberts's findings have raised the ire of the FBI, Boeing and other aviation experts. He says his only goal is to improve aircraft security, and he argues that the FBI has taken five years of his work, compressed it down into several high-risk incidents and taken much of it out of context. But according to Don Bailey in a Dark Reading article, good intentions aren't enough — the way Roberts went about his proof of concept leaves much to be desired.
Why? Because agencies like the Department of Defense and the Federal Aviation Administration have known for years that there are problems with aircraft systems; in fact, United Airlines recently created a bug bounty program that pays out air miles in exchange for finding vulnerabilities. Bailey argues that taking control of an airplane to prove a point, even if the point is that it can be done and that security threats are imminent, undermines the fundamental purpose of cybersecurity research: mitigating risk. By putting passengers and crew members at risk of a crash, tactics like this increase overall risk and erode public confidence; the goal of white-hat hacking should be the exact opposite.
So now Roberts finds himself fighting a war on several fronts, while the vulnerabilities he supposedly discovered remain unresolved. Is there a better way?
Here's the ultimate goal: to identify security threats as quickly as possible, allowing companies to mitigate minor issues rather than remediate large threats. With software increasingly distributed across local, cloud and mobile platforms, it's easy for companies to lose track of code or rely on less-than-secure code from third-party developers or open-source components. The result is a spotty security landscape that makes it easy for threats to slip through and start causing real trouble.
Sure, a proof of concept that breaks your server or ransoms your data proves a point, but it doesn't do you any favors. It's better to catch the problem before it goes live. Doing so doesn't require high-flying security antics, but it does rely on building in security from the ground up — making security a part of the development process rather than an afterthought.
As noted in the Dark Reading piece, the cybersecurity industry is facing the problem of a "rock star" thought model, where grabbing attention is more important than reporting vulnerabilities through white-hat programs or uncovering bugs through rigorous testing. Hopefully missteps such as Roberts's push the industry toward a more collegial model, but in the meantime, companies must do everything they can to avoid this kind of showboating. Ensuring security from soup to nuts is the first step: catching more problems in development lowers the risk of high-profile incidents later.
Photo Source: StockSnap