Developers, like most builders, are creative critical thinkers who take pride in their work. Let’s focus on the word “builder” for a moment. During the industrial revolution, manufacturing shifted as time-consuming processes were made more efficient through automation. Alongside that, the assembly line and interchangeable parts transformed businesses. The idea was to build as quickly as possible at lower cost. Transpose this to software engineering and we see a similar trend: building software as quickly as possible, using components, and decreasing costs. Implicit in this is the direct correlation between the quality of the components and the quality of the final product. This raises the question: why, then, are developers selecting poor or insecure components to build their applications? I would argue that the intention to build stable and secure software has always existed, but there is a general lack of awareness, and overall confusion, about the best approach. We need only look at the latest headlines to read about Fortune 500 companies that have fallen victim to vulnerabilities despite their best efforts to ship software they believed to be secure. So, how does intent move beyond a mere idea into design and practice, mitigating these concerns in the most comprehensive and reliable way possible?
Before we are able to answer that, it is important that we consider a few facts:
Cloud adoption has empowered developers not only to build their applications, but also to provision the supporting infrastructure. Take, for example, a fully managed CI/CD pipeline on AWS comprised of AWS CodeCommit, AWS CodePipeline, and AWS CodeBuild, with container deployments to AWS Fargate. If you find yourself in a similar scenario, or aspire to migrate to AWS to use these services, which tools do you use, and how do you leverage them correctly to ensure that you are building secure applications? If you are using open source components, how do you ensure that you are using the right versions, or find out where they are being downloaded from? These questions extend to your container images as well. Container images are often opaque: they typically contain multiple layers, and it is not immediately clear which security vulnerabilities each layer may contain. Are you including inspection of these in your automated workflows?
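As a sketch of what such automated inspection can look like, a CodeBuild buildspec could add an image scan step between building and deploying. This example uses the open source Trivy scanner; the image name and install step are illustrative assumptions, not a prescribed setup:

```yaml
version: 0.2

phases:
  install:
    commands:
      # Illustrative: install an open source container image scanner (Trivy)
      - curl -sfL https://raw.githubusercontent.com/aquasecurity/trivy/main/contrib/install.sh | sh -s -- -b /usr/local/bin
  build:
    commands:
      # Hypothetical image name for this sketch
      - docker build -t my-app:latest .
  post_build:
    commands:
      # Fail the pipeline if HIGH or CRITICAL vulnerabilities are found in any layer
      - trivy image --exit-code 1 --severity HIGH,CRITICAL my-app:latest
```

Because the scan runs in post_build with a nonzero exit code on findings, a vulnerable image never reaches the deployment stage of the pipeline.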
One of the more prominent blockers to applying security is the perception that doing so will undoubtedly hurt time to market. Developers are often under time constraints and are focused on building applications and releasing features as expeditiously as possible. This, coupled with the complexity of modern architectures, the use of external components, and the lack of prescriptive guidance on leveraging the right tools at the appropriate stage of the development life cycle, exacerbates frustration, and the expected reaction is avoidance. In other words, we acknowledge the problem and vaguely understand there may be ways to resolve it, but we are not clear on how to accomplish that, so we decide it’s not worth the effort today; after all, there’s always tomorrow.
The truth is that this need not be as daunting as it may seem on the surface. The journey begins with understanding your process and gaining insights into your environment. If you don’t know where your vulnerabilities exist today, how can you effectively solve them? Second, it’s about applying security at every stage of the process. There are several tools that address specific concerns and were built for specific audiences: security teams, AppSec teams, and dev teams. Use them accordingly. For example, there is a place for static analysis (SAST), software composition analysis (SCA), and dynamic analysis (DAST) tools, along with monitoring tools designed to find security defects and complete the feedback loop. It’s critical to understand that you may build a secure application today, but can you quickly iterate and resolve the vulnerabilities that have yet to be discovered before they negatively impact your business or your customers? These are considerations necessary for any business to survive in today’s competitive landscape. Sure, you need to ship features as quickly as possible, but you need to do so without compromising security.
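To make the SCA idea concrete, here is a minimal sketch of what software composition analysis does at its core: matching pinned dependency versions against a database of known advisories. The package names and advisory IDs below are hypothetical, and a real tool would consume a live vulnerability feed rather than an in-memory dictionary:

```python
# Minimal SCA sketch: flag pinned dependencies matching known advisories.
# The advisory data and package names are hypothetical, for illustration only.

ADVISORIES = {
    # package name -> list of (vulnerable_version, advisory_id)
    "leftpad": [("1.0.0", "DEMO-2019-0001")],
    "webframework": [("2.3.1", "DEMO-2019-0002")],
}

def parse_requirements(lines):
    """Parse 'name==version' pins, ignoring comments, blanks, and unpinned entries."""
    pins = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        pins[name.strip().lower()] = version.strip()
    return pins

def find_vulnerable(pins):
    """Return (package, version, advisory_id) for every pin matching an advisory."""
    findings = []
    for name, version in pins.items():
        for vuln_version, advisory in ADVISORIES.get(name, []):
            if version == vuln_version:
                findings.append((name, version, advisory))
    return findings

requirements = """
# requirements.txt (hypothetical)
leftpad==1.0.0
webframework==2.4.0
""".splitlines()

print(find_vulnerable(parse_requirements(requirements)))
# Only leftpad==1.0.0 matches an advisory; webframework==2.4.0 is clean.
```

Running a check like this on every commit, rather than once before release, is precisely what closes the feedback loop the paragraph above describes.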
This is where solutions such as those available today from Veracode are integral for any business. Veracode is a full spectrum application security testing solution that begins with Veracode Greenlight in the developers’ IDE and spans the development life cycle through Veracode Manual Penetration Testing. Along the way, you are covered throughout the entire software development life cycle. From the moment developers begin writing code and pushing commits, Veracode Software Composition Analysis (SCA) identifies open source vulnerabilities and provides crisp remediation guidance. Integrate Veracode Static Analysis (SAST) into your build and test tools and processes to quickly identify security flaws in your code. Lastly, Veracode Dynamic Analysis (DAST) in your release, deployment, and operations process reduces your risk of a breach once your application goes live. These tools integrate easily with AWS CodePipeline and CodeBuild to secure your fully managed CI/CD pipelines running in the AWS cloud.
As the complexity of modern applications continues to increase, introducing security into every stage of your development life cycle becomes a necessity. We live in a highly competitive world with a voracious appetite for innovation. It is critical for businesses to deliver quickly and satisfy customer demand, but equally critical to ensure and preserve customer trust. It is possible to do both without compromising one for the other, and the solutions exist today.
Learn more at AWS re:Inforce this month in Boston – Veracode will be at Booth 813, and speaking on Wednesday the 26th on “Integrating AppSec Into Your DevSecOps on AWS.”