Deploying the Veracode static code analysis platform at scale across a large enterprise presents a number of unique challenges, such as understanding your application estate, prioritising your applications for scanning, and communicating with your application owners. This blog post provides some guidance based on my experience of delivering several hundred scanned applications in a 14-month time frame.
Understanding Your Application Estate
The first challenge is to understand the nature of your application estate – where the applications are hosted, where the codebases live, who is responsible for building and maintaining them, what development languages are used, how critical they are to the organisation, and so on. Most enterprise organisations maintain an asset inventory of some sort; you should immediately familiarise yourself with it and determine the extent of the information recorded and what export formats are available. In my experience, two problems exist: data accuracy and completeness. In many instances, the contact details of application owners were incorrect or missing entirely. In our application repository, the programming languages and frameworks were not recorded, and in only a few instances was the source code repository location specified. After my initial attempts to use the application repository as the principal data source, I realised I would need to augment it with my own data gathered during an initial application profiling phase.
Application Profiling and Assigning Criticality
My initial attempts at profiling applications used a crude MS Word questionnaire covering items such as technical contacts (capable of building the binaries), application language and frameworks, source code repository, binary image size, application version, and continuous integration environment. This questionnaire was sent to the registered application owners, and the responses were entered manually into a tracking spreadsheet based on an export from the application repository. It soon became apparent that this method was cumbersome and time consuming, so I deployed a web-form version of the profiling questionnaire, which captured the responses to a backing spreadsheet and enabled easy import into the main spreadsheet. Reviewing the responses, it became apparent that not all applications would be suitable for static code analysis, due to factors such as host operating system or language incompatibility. Once those applications were eliminated, it was necessary to prioritise the list to ensure that our licence usage was targeted at the most critical business applications.
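As a sketch of this consolidation step, the snippet below joins web-form responses onto the asset-inventory export by an application ID and flags entries whose language rules out static analysis. The column names, sample data, and excluded languages are illustrative assumptions, not details from the actual spreadsheets:

```python
import csv
import io

# Assumed exclusion list -- substitute the languages and platforms
# that your static analysis licence cannot cover.
UNSUPPORTED_LANGUAGES = {"COBOL", "RPG"}

def merge_profiles(estate_csv: str, responses_csv: str) -> list[dict]:
    """Join questionnaire responses onto the estate export by app_id
    and flag applications unsuitable for static code analysis."""
    responses = {
        row["app_id"]: row
        for row in csv.DictReader(io.StringIO(responses_csv))
    }
    merged = []
    for row in csv.DictReader(io.StringIO(estate_csv)):
        profile = responses.get(row["app_id"], {})
        row["language"] = profile.get("language", "UNKNOWN")
        row["repo"] = profile.get("repo", "")
        row["scan_eligible"] = row["language"] not in UNSUPPORTED_LANGUAGES
        merged.append(row)
    return merged

# Tiny inline samples standing in for the real exports.
estate = "app_id,app_name,owner\nA1,Payments,alice\nA2,Ledger,bob\n"
responses = "app_id,language,repo\nA1,Java,git://payments\nA2,COBOL,\n"
merged = merge_profiles(estate, responses)
```

In practice the same join can be done directly in Excel, but scripting it makes the eligibility rule explicit and repeatable as new responses arrive.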
In order to ensure you are focused on the most critical applications, consider a number of indicators: does the application require a penetration test? Is it externally facing? Does it have any particular regulatory requirements? Has it been the subject of recent incidents? For example, the Monetary Authority of Singapore (MAS) guidelines mandate a code review process that may be fulfilled in part by static code analysis, so I used MAS compliance as an immediate inclusion criterion. Whatever selection criteria you employ, it is important that you are able to justify them, both in terms of Veracode licence usage and of the effort required from the application teams who will perform the upload, scan, and review.
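This kind of weighted selection can be sketched as a simple scoring function. The criterion names and weights below are illustrative assumptions to show the shape of the approach, not values from the programme; tune them to your own organisation and regulatory context:

```python
# Assumed criteria and weights -- a regulatory mandate dominates the
# score so that in-scope applications are always included first.
CRITERIA_WEIGHTS = {
    "mas_in_scope": 100,     # regulatory mandate => immediate inclusion
    "external_facing": 40,
    "pen_test_due": 30,
    "recent_incidents": 20,
}

def priority_score(app: dict) -> int:
    """Sum the weights of every indicator that applies to the application."""
    return sum(weight for key, weight in CRITERIA_WEIGHTS.items() if app.get(key))

# Hypothetical applications drawn from the profiling spreadsheet.
apps = [
    {"name": "Payments", "mas_in_scope": True, "external_facing": True},
    {"name": "Intranet wiki", "recent_incidents": True},
]
ranked = sorted(apps, key=priority_score, reverse=True)
```

Even a crude score like this gives you a defensible, repeatable ordering to present when teams ask why their application was (or was not) selected for scanning.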
Communication with Application Owners and Teams
Armed with your list of applications, you will now need to gain a mandate from senior management within the application delivery part of your organisation – one that supports your programme and encourages the participation of application teams with a “carrot and stick” message: for instance, application teams must comply with the MAS guidelines, and Veracode can help them achieve that compliance. It is important that this message come from the upper management of the application teams, and that it stress the value of the programme rather than arriving as an edict from the upper echelons of the security organisation. Our programme initially failed to gain a foothold because we lacked a clear mandate, which gave recalcitrant application teams an easy opt-out. In many cases, application teams were easily convinced of the value of early flaw detection and engaged with the programme quite willingly; in a number of cases, however, no amount of persuasion could convince them to participate. One of the most frequent objections encountered was the perceived workload of onboarding to Veracode. It is important that you make the process of account creation as efficient as possible, and that you have the relevant support in place in terms of documentation, knowledgebases, support emails, and so on. Many teams were pleasantly surprised at the ease of the process, and it was apparent that this news propagated within the development communities, as we saw reduced friction as the programme progressed.
In order to ensure that our programme was not constrained by team resources, I automated the process of user account and application profile creation on the Veracode platform by leveraging the rich APIs available. The application spreadsheet was used as the data source, and a Microsoft Visual Studio Tools for Office (VSTO) plugin was developed which provided an additional toolbar within Excel (for details, see Managing Flaw Review With a Large Multi-Vendor Application). This plugin allowed for the creation, modification, or deletion of accounts on the platform based on the underlying spreadsheet data. Although I invested significant upfront effort in developing the tooling, I reaped the benefits later in the programme when I was able to completely onboard up to 100 applications in one day. Additionally, I was able to add metadata specific to our organisation (business alignment, investment strategy, software vendor) to the application profiles on the platform, which greatly enriched the reports generated within the platform’s analytics engine.

Within a few weeks of the programme starting, it became apparent that teams were often asking the same questions, so I started capturing these questions and their answers as a set of Frequently Asked Questions available on our internal social media-like platform. Through appropriate tagging and hyperlinking, I quickly developed an organisation-specific knowledgebase, which again lowered the barrier to entry for application teams who no longer had to wait for an answer or struggle with a problem. Around the midpoint of our programme, I identified a few obvious success stories (applications that had performed a number of scans and were showing a clear improvement in security posture), and I asked the teams working on those applications to contribute their experience to our social media platform in order to encourage the participation of other teams.
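The spreadsheet-driven account and profile automation described above can be sketched as follows. The endpoint names come from Veracode's legacy XML APIs (`createuser.do`, `createapp.do`); the row field names, role string, and custom-field parameters are assumptions from a hypothetical tracking spreadsheet, and real calls additionally require Veracode HMAC request signing (for example via Veracode's API signing library), which is omitted here – check the current API reference before relying on any parameter:

```python
# Builds request payloads for Veracode's legacy XML APIs from spreadsheet
# rows. No HTTP is performed here; in production each (url, params) pair
# would be POSTed with HMAC-signed credentials.
API_BASE = "https://analysiscenter.veracode.com/api/5.0"

def build_user_request(row: dict) -> tuple[str, dict]:
    """Map a spreadsheet row to a createuser.do parameter set."""
    return (f"{API_BASE}/createuser.do", {
        "first_name": row["first_name"],
        "last_name": row["last_name"],
        "email_address": row["email"],
        "roles": "Submitter,Reviewer",  # assumed roles -- adjust to your model
    })

def build_app_request(row: dict) -> tuple[str, dict]:
    """Map a spreadsheet row to a createapp.do parameter set, carrying
    organisation-specific metadata onto the application profile."""
    return (f"{API_BASE}/createapp.do", {
        "app_name": row["app_name"],
        "business_criticality": row.get("criticality", "High"),
        # Assumed custom-field parameter names for illustration only.
        "custom_field_name": "Investment Strategy",
        "custom_field_value": row.get("strategy", ""),
    })

url, params = build_app_request({"app_name": "Payments", "criticality": "Very High"})
```

Separating payload construction from submission like this makes the mapping easy to unit-test against the spreadsheet before any account is actually created on the platform.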
This brief blog post has highlighted some of the challenges facing a new Veracode static code analysis deployment, along with some solutions that I came across in the process. I hope the approaches I have described will ensure that you are soon well underway with analysis. In the process, you will find not only strengths in your team but also flaws during review – which is the subject of my next blog post, Managing Flaw Review With a Large Multi-Vendor Application.
Read my earlier blog post: How to Run a Successful Proof of Concept for an Application Security Programme.
For more details on the development of my application security programme, see Ad Hoc to Advanced Application Security: Your Path to a Mature AppSec Program.