Table of Contents
- Market Categories and Deployment Types
- Key Criteria Comparison
- GigaOm Radar
- Vendor Insights
- Analyst’s Take
- About Iben Rodriguez
- About Geoff Uyleman
The challenges facing IT decision makers in modern vulnerability management include the integration of DevOps practices and the increasing complexity of IT systems.
With DevOps practices and cloud deployments becoming more widespread, the risk posed by vulnerabilities and insecure configurations in legacy workloads and cloud-hosted web applications continues to increase. In addition, modern IT systems have grown larger and more complex, forcing security teams to grapple with ever-larger volumes of data and an overload of events that makes it difficult to extract actionable intelligence tied to business risk and threat context.
When making decisions about vulnerability management products, IT decision makers should consider solutions that can address security issues at scale and reduce the overall vulnerability lifespan, from initial discovery to the final stages of remediation, patching, or image rotation. A successful vulnerability management program prioritizes vulnerabilities based on local context and outside threats to provide actionable insights to developers in their preferred workflow tools for more efficient resolution.
Modern vulnerability management tools focus on security bugs discovered not only during runtime but also in the build phase of the software development lifecycle (SDLC), when software artifacts are created. Shifting security left means integrating automated vulnerability detection and response into the developer toolkit, as plugins in the IDE or as part of the CI/CD pipeline. Issues are resolved as early as possible to avoid the cost and complexity of fixing them later in the lifecycle. An SDLC security program aims to optimize the process of building applications, including architectural reviews, static and dynamic code analysis, and the use of software composition analysis (SCA) to examine all artifacts for known vulnerabilities, and to rotate images continuously from development through test to production.
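The SCA gate described above can be sketched in a few lines. This is a minimal, hedged illustration, not any vendor's implementation: the advisory data, package names, and manifest format below are hypothetical stand-ins for a real feed such as the NVD or OSV.

```python
# Hypothetical advisory database: package -> versions known to be vulnerable.
# A real SCA tool would pull this from a vulnerability feed.
KNOWN_VULNERABLE = {
    "examplelib": {"1.0.0", "1.0.1"},
    "legacyparser": {"2.3.0"},
}

def parse_manifest(text):
    """Parse 'name==version' lines from a pip-style dependency manifest."""
    deps = {}
    for line in text.splitlines():
        line = line.strip()
        if line and "==" in line:
            name, version = line.split("==", 1)
            deps[name] = version
    return deps

def scan(deps):
    """Return the dependencies that match a known advisory."""
    return {n: v for n, v in deps.items()
            if v in KNOWN_VULNERABLE.get(n, set())}

manifest = "examplelib==1.0.0\nsafepkg==3.2.1\n"
findings = scan(parse_manifest(manifest))
# A CI/CD gate would fail the build whenever findings is non-empty.
```

In a shift-left workflow, the same check runs both as an IDE plugin at edit time and as a pipeline stage at build time, so a vulnerable dependency is caught before the artifact ever reaches production.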
In modern IaaS and PaaS delivery models, vulnerability management tools support the inspection of the code responsible for the deployment, integration, management, security configuration, and overall compliance of the cloud infrastructure, including microservices. Infrastructure as code (IaC) should be included in the scope of your vulnerability management program to ensure that the pipeline components used to run application microservices are deployed securely.
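An IaC policy check of this kind can be sketched as below. The resource shapes and rule names are simplified assumptions for illustration, not any provider's real schema:

```python
# Hedged sketch of an infrastructure-as-code policy check run against
# parsed template resources before deployment.

def check_resource(resource):
    """Return a list of misconfiguration findings for one IaC resource."""
    findings = []
    if resource.get("type") == "firewall_rule":
        # Flag management ports reachable from anywhere.
        if resource.get("source") == "0.0.0.0/0" and resource.get("port") == 22:
            findings.append("SSH open to the internet")
    if resource.get("type") == "object_store":
        # Flag storage buckets readable by the public.
        if resource.get("public_read", False):
            findings.append("bucket allows public read")
    return findings

template = [
    {"type": "firewall_rule", "source": "0.0.0.0/0", "port": 22},
    {"type": "object_store", "public_read": False},
]
report = [(r["type"], f) for r in template for f in check_resource(r)]
```

Run in the pipeline, a non-empty report blocks the insecure infrastructure change before the microservices it supports are ever deployed.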
We have found that the best solutions can ingest vulnerability and local asset information, as well as threat management data, from various sources so as to prioritize recommendations. For patch management, image rotation, or other remediation strategies to be effective, you must prioritize the issues that matter most to your organization. This requires local context, such as whether critical vulnerabilities sit on high-value assets that may contain sensitive data, whether systems are exposed to the internet, and whether vulnerable packages are actively used by applications. Prioritization also requires external threat intelligence sources that can enrich detections with information about recent vulnerability exploits, including any vulnerabilities known to be exploited in ransomware attacks or able to spread across your network (wormability). This information helps surface the vulnerabilities that matter most so you can focus on patching and adopting remediative controls to mitigate the possibility of attacks.
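The prioritization logic above can be made concrete with a toy scoring model. The weights and field names here are illustrative assumptions, not a standard scoring scheme; they simply show how local context and threat intelligence can outrank a raw CVSS number:

```python
# Sketch of risk-based prioritization combining base severity with
# local asset context and external threat intelligence.

def risk_score(vuln):
    score = vuln["cvss"]                      # base technical severity (0-10)
    if vuln.get("asset_critical"):            # high-value asset / sensitive data
        score += 3
    if vuln.get("internet_exposed"):          # reachable from outside
        score += 2
    if vuln.get("package_in_use"):            # vulnerable code actually loaded
        score += 1
    if vuln.get("exploited_in_wild"):         # threat-intel enrichment
        score += 4
    return score

vulns = [
    {"id": "CVE-A", "cvss": 9.8, "asset_critical": False,
     "internet_exposed": False, "package_in_use": False,
     "exploited_in_wild": False},
    {"id": "CVE-B", "cvss": 7.5, "asset_critical": True,
     "internet_exposed": True, "package_in_use": True,
     "exploited_in_wild": True},
]
ranked = sorted(vulns, key=risk_score, reverse=True)
# CVE-B scores 17.5 and outranks CVE-A (9.8) despite a lower CVSS base.
```

The design point is that severity alone is a poor queue order: an actively exploited, internet-facing flaw on a critical asset deserves the first patch window even when its CVSS score is lower.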
Each of the vulnerability management tools we evaluate in this report interacts with different phases of the SDLC. It is important to consider potential gaps in coverage when evaluating vulnerability management tools. Visibility into the respective phases of the application lifecycle can provide valuable insights, but it could also result in redundant findings when tools overlap. Duplicate data needs to be correlated, and decisions on remediation should be prioritized based on risk, taking into account the threats faced day to day.
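The correlation step can be sketched as a simple merge. The tool names and record fields below are hypothetical; the point is only the technique of collapsing overlapping findings onto one record per asset and CVE:

```python
# Sketch of correlating overlapping findings from multiple scanners.
# Findings that share an asset and CVE are merged, keeping the highest
# reported severity so nothing is understated.

def correlate(findings):
    merged = {}
    for f in findings:
        key = (f["asset"], f["cve"])
        if key not in merged or f["severity"] > merged[key]["severity"]:
            merged[key] = f
    return list(merged.values())

raw = [
    {"tool": "scanner_a", "asset": "web-01", "cve": "CVE-X", "severity": 7.5},
    {"tool": "scanner_b", "asset": "web-01", "cve": "CVE-X", "severity": 9.1},
    {"tool": "scanner_a", "asset": "db-01", "cve": "CVE-Y", "severity": 5.0},
]
deduped = correlate(raw)
# Two unique findings remain; CVE-X on web-01 keeps severity 9.1.
```

Deduplicating before prioritization keeps the remediation queue honest: one actionable ticket per real issue rather than one per tool that happened to spot it.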
These newer tools help today's resource-constrained cybersecurity teams operate more effectively. This is a great opportunity to take a fresh look at how the security operations center (SOC) is staffed and how duties and responsibilities are defined. When bringing on new staff or negotiating a contract with a managed service provider, be sure the upcoming threat landscape is covered from a policy compliance and vulnerability management perspective for the entire SDLC, extending from the developers' workstations to the build environment to the Kubernetes clusters.
This Radar Report evaluates the capabilities of notable players in the space against the points laid out in the Key Criteria Report.
How to Read this Report
This GigaOm report is one of a series of documents that helps IT organizations assess competing solutions in the context of well-defined features and criteria. For a fuller understanding, consider reviewing the following reports:
Key Criteria report: A detailed market sector analysis that assesses the impact that key product features and criteria have on top-line solution characteristics—such as scalability, performance, and TCO—that drive purchase decisions.
GigaOm Radar report: A forward-looking analysis that plots the relative value and progression of vendor solutions along multiple axes based on strategy and execution. The Radar report includes a breakdown of each vendor’s offering in the sector.
Vendor Profile: An in-depth vendor analysis that builds on the framework developed in the Key Criteria and Radar reports to assess a company’s engagement within a technology sector. This analysis includes forward-looking guidance around both strategy and product.