A maker of home automation systems says its products are insecure by default because that’s how “geeks” want them. Cool – or cop out?
I found myself in the middle of an interesting dispute this week. On the one hand was a security company of good repute, Trustwave, whose researchers had analyzed a slew of smart home appliances and home automation systems ahead of a scheduled talk next week at the Black Hat Briefings.
In the year or so that I’ve been a member of Veracode’s Customer Success team, I’ve heard the same remark from an array of organizations: “We must implement secure coding practices to retain a positive brand image, but we’re up against very strict deadlines and need to get our code out fast!” As we work with security and development teams alike, this statement starts a discussion that typically unravels until we arrive at a question asked again and again…
Back when I testified with the L0pht to the Senate in 1998, we suggested the government use incentives as a way to get businesses to improve their security. The Senate was Republican controlled at the time, and even we political newbies knew that regulation was going to be a non-starter. We also proposed that the government use its purchasing power to require the vendors it buys from to have good security.
It doesn’t matter which threat report you consult; the fact remains that the application layer is the most targeted and most vulnerable point of entry into an enterprise. Smart enterprises are taking security into their own hands by forcing their software suppliers to prove that they are taking appropriate measures to secure the software accessing the enterprise’s critical systems and data.
The US Food and Drug Administration wants to help security researchers push medical device makers to proactively address flaws in their hardware and software. With the internet of things on the horizon, this is a proactive move to nip a looming problem in the bud.
Are application security vulnerabilities as big a threat as the Year 2000 (Y2K) date calculation flaw that drove the world to distraction? You bet! But don’t count on seeing a “Y2K” for application security holes any time soon.
Today we bring you another installment of our ongoing series “Application Security Education Spotlight”. In this edition we speak with Dr. Mike Whitman of Kennesaw State University. Dr. Whitman is the Program Coordinator for the BBA-Information Security and Assurance and teaches information security courses at the graduate and undergraduate levels. He is also the Director of the KSU Center for Information Security Education.
DEF CON founder Jeff Moss’s plea to U.S. government and law enforcement to give his annual hacker con a pass is quaint – but hopeless (and far too late).
Interpreting development specs is challenging enough, but writing code without analyzing the regulatory business ramifications rarely ends well. Typically the process assumes your business leaders and partners understand the regulatory environment and its impact on application functionality, and that they can successfully articulate the requirements to the development team. But often the message is muddled because of a lack of understanding. So put on your red cape (blue tights optional) and read some compliance strategies to take your career to the next level.
On June 13th the U.S. Food and Drug Administration issued a cybersecurity advisory statement addressing the need for increased focus on security in medical devices and hospital networks. The statement is no surprise, as it follows more than a year of mounting pressure and increasing evidence that the health-care sector is among the most vulnerable to hackers. Not only are these networks vulnerable, but the data they typically contain is highly sensitive, as Chris Wysopal outlined in a recent interview with Fox News.