July 25, 2018

Software Quality Is a Competitive Differentiator

By Maria Loughlin

One of the ironies of DevOps is that while the methodology supports faster and more automated software production, it doesn't boost code quality unless quality is a focus for the software team. As more than a few business leaders have discovered, gaining a competitive edge in the digital economy requires a more concentrated and comprehensive approach.

It's no secret that software code powers our world — it’s in jet engines, automobiles, the electric grid, medical systems, commerce, appliances…just about everything. Yet, producing reliable and secure software has become increasingly difficult. Applications are not only growing in size but also becoming more complex and intertwined across platforms, systems and devices. APIs and the Internet of Things (IoT) are inserting code — and distributing processing — across millions of applications and devices, as well as the cloud.

This complicated environment is forcing business executives, IT leaders and software developers to think and work differently. For example, a growing array of systems and devices rely on artificial intelligence (AI) to drive activity. Automated systems increasingly decide on the course of action based on constantly changing inputs. A system — and the software that runs it — must adapt dynamically.

The upshot? Software quality can no longer be a checkbox item — it must be a framework that spans an organization. Ultimately, an enterprise must own the success of its code — and develop habits that produce high-quality software. This includes understanding how and why code quality is important not only for performance but also for security and final business results. A DevOps initiative can succeed only when an enterprise recognizes the scope of today's software frameworks.

Software Quality Redefined

The digital world is creating intriguing challenges related to software quality. These extend beyond the sheer volume of code that’s required to run systems. For instance, UI/UX has moved front and center — particularly as apps have proliferated. Maturing technologies, such as augmented reality and virtual reality, have introduced new challenges. The takeaway? It's no longer acceptable to view UI/UX testing as a traditional, commoditized function — a quality experience is paramount.

There are other challenges, too. As the IoT matures and grows, there's a need for innovation in testing. The variety and number of edge devices are exploding, and all of this introduces enormous QA challenges. Ensuring that software performs adequately and meets user requirements is critical. The need for service level agreements between service providers and consumers has never been more important.

Artificial intelligence changes the testing landscape as well. It can take over some human roles. However, those hoping to replace traditional software testing teams with AI readily admit that largely autonomous applications would still require continuous training to ensure that technological and business goals are met. Simply put, AI will augment rather than replace QA professionals and will create new fields of specialization.

Raising the Quality of Code

All of this is changing the stakes. Yet, many organizations aren't prepared. For example, the rollout of Healthcare.gov was delayed by months as a result of breakdowns in processes. In the end, ongoing performance, load and management issues pushed the cost of building out the IT framework to roughly three times the original estimate. In the private sector, breach after breach has occurred in recent years.

How can organizations step out of the development morass and transform software development into success stories? These factors make or break an initiative:

  • The need for automation. This encompasses everything from quality controls to scanning code for vulnerabilities. Quality tests and software quality metrics are part of a continuous delivery pipeline — and these benchmarks must be clearly defined across the organization. Investments in quality automation — unit tests, functional tests, and performance, load and system tests — generate long-term savings (see the sketch after this list).
  • The need for a modular approach. Organizations that produce smaller and more focused batches of code simplify scanning and testing, and increase overall delivery velocity. It's also easier to identify problems when software is composed of modules and sub-modules. Finally, with these modules in place, an Agile approach becomes far more viable. The enterprise can produce and reconfigure software while maintaining quality.
  • The need to address scope. What needs to be tested and scanned has also changed. As we enter a world where infrastructure is defined in code, we also need to plan for and test the quality of infrastructure creation and configuration scripts. This requires the right internal governance framework and processes as well as the right tools and technologies.
  • The need for continuous feedback. Failure is acceptable, provided teams fail fast, learn and move on. A rapidly evolving product can be shaped by customer feedback, and fast turnaround allows your teams to stamp out defects and hone the software for your audience or customer base. This involves tracking how users interact with a site through blue-green deployments or A/B testing that evaluates features and new code against a subset of the user population.
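
To ground the automation point above, here is a minimal sketch of the kind of quality-gate script a continuous delivery pipeline might run before promoting a build. The tools shown (pytest, pytest-cov, Bandit) and the 80 percent coverage threshold are illustrative assumptions, not prescriptions from this post; substitute whatever test runners, scanners and benchmarks your organization has agreed on.

    # quality_gate.py - hypothetical CI quality gate (illustrative only)
    import subprocess
    import sys

    COVERAGE_THRESHOLD = 80  # percent; an assumed, team-defined benchmark


    def run(cmd):
        """Run a command, echo it, and return its exit code."""
        print("$ " + " ".join(cmd))
        return subprocess.call(cmd)


    def main():
        # 1. Unit and functional tests must pass, and coverage must meet the
        #    agreed benchmark (pytest-cov's --cov-fail-under enforces it).
        if run(["pytest", "--quiet", "--cov=.",
                "--cov-fail-under={}".format(COVERAGE_THRESHOLD)]) != 0:
            print("Quality gate failed: test failures or coverage below "
                  "{}%".format(COVERAGE_THRESHOLD))
            return 1

        # 2. Static analysis for common vulnerabilities (swap in whatever
        #    scanner your pipeline actually uses).
        if run(["bandit", "-r", ".", "-q"]) != 0:
            print("Quality gate failed: security findings")
            return 1

        print("Quality gate passed")
        return 0


    if __name__ == "__main__":
        sys.exit(main())

A gate like this only pays off when the thresholds are defined once, across the organization, and applied to every pipeline; otherwise the benchmark loses its meaning.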

Security Can’t Be an Afterthought

Finally, there's a need to connect security to code quality. Although organizations are embracing DevOps, many aren't addressing the need for secure, high-quality code. Incredibly, 69% of applications fail an initial scan against the OWASP Top 10. A more holistic DevSecOps approach — one that incorporates automation, modular software, scope and continuous feedback — helps organizations achieve a superior position in the marketplace. Simply put, their code becomes a competitive differentiator.
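
As a sketch of how such a pipeline might enforce a security policy, the short script below reads a scan report and fails the build when serious findings are present. The JSON layout, the severity labels and the blocking policy are hypothetical; a real pipeline would consume whatever output its scanner actually produces.

    # check_findings.py - hypothetical policy check on a scan report
    import json
    import sys

    BLOCKING_SEVERITIES = {"critical", "high"}  # assumed policy, not a standard


    def main(report_path):
        # Expected (hypothetical) format: a JSON list of findings, each with
        # at least "id", "severity" and "title" fields.
        with open(report_path) as fh:
            findings = json.load(fh)

        blocking = [f for f in findings
                    if f.get("severity", "").lower() in BLOCKING_SEVERITIES]
        for finding in blocking:
            print("BLOCKING: [{}] {}".format(
                finding["severity"], finding.get("title", finding.get("id"))))

        if blocking:
            print("{} blocking finding(s); failing the pipeline.".format(len(blocking)))
            return 1
        print("No blocking security findings.")
        return 0


    if __name__ == "__main__":
        sys.exit(main(sys.argv[1]))

Run as a pipeline step right after the scan (for example, python check_findings.py report.json); a non-zero exit code stops the build from being promoted.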

Best-practice organizations understand that delays due to code defects, a failed product launch, or savage user reviews can severely impact business goals. Application crashes and security breaches directly impact the bottom line. The takeaway is that the need for strategic risk assessment has never been greater. Rather than adopting a defensive and reactive posture, it's wise to focus on quality throughout the software lifecycle. The move from DevOps to DevSecOps can prove transformative.

By Maria Loughlin

As VP of Engineering, Maria manages the development teams for Veracode’s cloud-based platform and Web Application Security products. Maria joined Veracode in 2012 with 20 years of technical and management experience in companies that include Fidelity Information Services, Memento, Kronos, Open Market and Digital Equipment Corporation. She is known for her high energy, optimism, and pragmatism, and can always be counted upon to call out the elephant in the room! At home Maria appreciates her husband’s hot and spicy cooking and the unfolding drama of parenting tween boys. Maria can be found on Twitter as @marialoughlin.