One of the biggest security threats in the enterprise is that mobile app testing is overwhelmingly focused on functionality, not security. Pen testing an app to see what data it (or some third-party app it is integrated with) actually retains is hardly ever done prior to deployment, if ever. Why?
It's simply not in the mindset of line-of-business managers. They want and need the apps to perform certain functions, all of which are directly or indirectly tied to a revenue stream. The people who do focus on security, typically the CISO's team, are rarely brought in because they are seen as a cost. How can this self-destructive pattern be fixed? App dev talent must insist on such testing, aggressively, as a standard part of the gig.
There's only one problem with that solution: App developers are typically just as apathetic about security as their LOB bosses and clients. Fixing that requires addressing it at a much earlier stage. In colleges and high schools across the country, security testing must be taught as an essential part of coding. No app developer would ever write a piece of code without then testing it, right? Security testing must become just as second nature as functionality testing.
The easier path would be to get LOB managers to universally insist on it and be willing to pay for it, but I am afraid to report that that simply won't happen. The sad reality is that those LOB managers—and their C-Level bosses—absolutely will end up paying for the lack of security testing, but that eventual pain and the initial negligence are separated by too much time to be connected. Heck, with corporate turnover what it is today, the people who will endure the punishment of a massive security/privacy hole may not even be the same people who oversaw the app's development.
This sobering education approach was touched on recently in an interesting piece in CSO Online, which argued that academia is where such training and attitude adjustments must happen. The piece pointed to one gamification effort: a secure-coding game that began as a marketing campaign but that enterprises now keep asking to have customized for them. "It's fun and offers different ways to educate developers on secure coding. There is a set of five questions that show snippets of vulnerable code. You find the vulnerability. You can play with friends, and it is completely free."
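The CSO Online piece doesn't reproduce the game's actual questions, but to make the format concrete, here is a hypothetical quiz-style snippet of the same shape: a few lines of vulnerable code, with the exercise being to spot the flaw. The function names and the SQL-injection example are my own illustration, not the game's content.

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # Quiz question: what is wrong with this function?
    # Answer: the username is concatenated straight into the SQL text,
    # so input like "x' OR '1'='1" rewrites the query (SQL injection).
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_fixed(conn, username):
    # The fix: a parameterized query keeps user input out of the SQL.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
    conn.executemany("INSERT INTO users VALUES (?, ?)",
                     [(1, "alice"), (2, "bob")])
    # Malicious input dumps every row from the vulnerable version...
    print(find_user_vulnerable(conn, "x' OR '1'='1"))
    # ...but matches nothing in the fixed one.
    print(find_user_fixed(conn, "x' OR '1'='1"))
```

Both versions pass a naive functionality test with honest input, which is exactly the column's point: only a security-minded reading of the code catches the difference.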
This is a good technique, but we first must conquer the current lack of any perceived need to do this at all. And the group testing idea has two flaws. First, data retention is never going to be spotted by its user-victims. The user types in a password and gains access. How is that user to know that the password is being retained, let alone retained in clear text?
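To see why the user-victim is blind here, consider a minimal sketch. The `login` helper and its file layout are entirely hypothetical; the point is that the user-visible behavior is correct while the retention happens as an invisible side effect that only a pen tester's sweep of the app's storage would catch.

```python
import json
import tempfile
from pathlib import Path

def login(app_dir: Path, username: str, password: str) -> bool:
    """Hypothetical login helper for a mobile-style app.

    From the user's point of view this just works: type a password,
    get access. What the user cannot see is the side effect below.
    """
    # The flaw under test: "remember me" is implemented by dumping the
    # raw credentials into the app's private storage in clear text.
    (app_dir / "session.json").write_text(
        json.dumps({"user": username, "password": password})
    )
    return True  # pretend the server accepted the credentials

def retained_in_clear_text(app_dir: Path, secret: str) -> bool:
    # The pen tester's check: after exercising the app, sweep its
    # storage for the secret that was typed in.
    return any(secret in p.read_text()
               for p in app_dir.rglob("*") if p.is_file())

if __name__ == "__main__":
    app_dir = Path(tempfile.mkdtemp())
    assert login(app_dir, "alice", "hunter2")          # UX looks fine
    print(retained_in_clear_text(app_dir, "hunter2"))  # the leak is on disk
```

A functionality test asserts only that `login` returns `True`; nothing in the user-facing flow ever surfaces `session.json`.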
The second flaw is that looking for security holes by examining the app's code alone is far too limited. Many of the data-retention security/privacy holes I have seen pen testing uncover are what should be considered interaction holes. The app might work fine on its own, but the problems kick in when it interacts with a third-party app (such as the crash-detection software that gave Starbucks so many headaches), with other mobile apps or even with the mobile OS itself.
That's why it is essential to create a security sandbox that can examine the app in a realistic replication of how it will actually be used, alongside the many other bags of code it will have to work with.
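A toy sketch of that sandbox sweep, with all names invented for illustration: after the app has been exercised end to end inside the sandbox, scan every component's storage, the app's own and that of its bundled third-party SDKs, and report which one retained a secret. The interaction hole shows up even though the app's own code is clean.

```python
import tempfile
from pathlib import Path

def sweep_sandbox(root: Path, secrets: list[str]) -> dict[str, list[str]]:
    """Report which component's storage retained which secret.

    `root` is the sandboxed storage image after the app has been
    exercised; each top-level directory is one "bag of code" (the app
    itself, a bundled SDK, an OS cache).
    """
    findings: dict[str, list[str]] = {}
    for component in sorted(p for p in root.iterdir() if p.is_dir()):
        hits = [s for s in secrets
                if any(s in f.read_text(errors="ignore")
                       for f in component.rglob("*") if f.is_file())]
        if hits:
            findings[component.name] = hits
    return findings

if __name__ == "__main__":
    root = Path(tempfile.mkdtemp())
    # Simulated post-run state: the app stored only benign settings,
    # but a hypothetical crash-reporter SDK snapshotted the login screen.
    (root / "app").mkdir()
    (root / "app" / "prefs.xml").write_text("theme=dark")
    sdk = root / "crash_reporter_sdk"
    sdk.mkdir()
    (sdk / "last_crash.txt").write_text("screen=login password=hunter2")
    print(sweep_sandbox(root, ["hunter2"]))
    # {'crash_reporter_sdk': ['hunter2']} -- the app's own code was clean
```

Code review of the app alone would have passed it; only running the full bundle and sweeping everything it wrote exposes which neighbor leaked.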
Back to education. Normally, one of the great things about university training is that it's grounded in reality, with many professors and lecturers bringing extensive real-world experience. But when the goal is to change how companies look at app security, mirroring current industry practice is exactly the problem. Companies need to do outreach to universities. Today, that also means outreach to specialty schools, high schools and even middle schools. We have to get to developers as early as possible.
Security has been a corporate afterthought for far too long. To fix it, we have to go back to the classroom—literally—and rethink how programming is taught.