The Impact of the NVD's Backlog on OT Environments: Is It a Significant Concern?

The National Vulnerability Database (NVD) is currently grappling with a backlog of over 17,000 unenriched Common Vulnerabilities and Exposures (CVE) reports. This backlog signifies that many newly reported vulnerabilities lack critical information, such as their impact, severity and potential mitigations. For operational technology (OT) environments and industrial control systems (ICS), this situation brings both challenges and opportunities to re-evaluate vulnerability management strategies. 

Understanding the NVD’s Backlog 

At its core, the NVD aims to provide a comprehensive repository of vulnerabilities, offering a centralized source of information for cybersecurity professionals. However, the reality is that the database is far from perfect. The backlog of unenriched CVEs reflects the inherent challenges in maintaining a complete and up-to-date repository. The key issues include: 

  1. Incomplete Data: Detailed information about new vulnerabilities, such as impact assessments and mitigation strategies, often arrives well after the initial report. This gap leaves security teams with an incomplete picture of the potential risks posed by these vulnerabilities until the necessary details become available.

  2. Accuracy Limitations: The NVD has never been 100% accurate or timely. The dynamic nature of cybersecurity means that vulnerabilities and threats are constantly evolving, making perfect accuracy an unrealistic expectation.

  3. Resource Constraints: The process of enriching CVEs with detailed information requires significant human and technical resources. Given the sheer volume of new vulnerabilities discovered daily, the NVD often struggles to keep up, leading to a backlog.

  4. Integration with Other Databases: The NVD is not the only vulnerability database. There are others, such as the Exploit Database and vendor-specific databases. Ensuring consistency and integration across these platforms can be challenging, leading to discrepancies and gaps in information.

 

Implications for OT Environments 

In OT environments, where industrial control systems are crucial for operational continuity, the implications of this backlog are nuanced: 

  1. Prioritization Challenges: The absence of detailed information can hinder the ability to prioritize vulnerabilities effectively. In OT settings, where systems are often highly specialized and critical, understanding the potential impact of a vulnerability is essential for prioritizing remediation or, at a minimum, mitigation efforts.

  2. Batch Processing of Vulnerabilities: Many organizations in OT environments identify, evaluate and address vulnerabilities in batches rather than in real-time. This approach reflects the practical realities of OT operations, where addressing each vulnerability individually can be impractical and resource intensive.

The Case for Lag in Assessment 

The concept of assessing vulnerabilities in batches, rather than in real-time, has its advantages: 

  1. Noise Reduction: Real-time or near real-time assessments can generate excessive noise, leading to alert fatigue and potentially distracting security teams from addressing the most critical vulnerabilities. A lag allows for a more considered approach, reducing the risk of becoming overwhelmed by a flood of new information. 

  2. Risk Management: In practice, a lag in data enrichment can enable better risk management. By allowing time for detailed analysis and contextual understanding, security teams can focus on the vulnerabilities that pose the greatest risk rather than chasing every newly discovered CVE. A simplified sketch of this batched, lagged triage follows this list.
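To make the batching idea concrete, here is a minimal, illustrative Python sketch of one review cycle: findings collected during the window are deduplicated by CVE ID, entries whose enrichment (represented here simply by a CVSS score) has not yet arrived are deferred to the next cycle, and only enriched, high-severity entries are escalated. The Finding structure, the 7.0 threshold and the field names are assumptions made for illustration, not a description of any particular product or of the NVD's data model.

    from dataclasses import dataclass
    from datetime import datetime


    @dataclass
    class Finding:
        """One detection of a CVE on one asset during the review window (illustrative)."""
        cve_id: str
        asset: str
        detected: datetime
        cvss: float | None = None  # None means enrichment has not arrived yet


    def triage_window(findings: list[Finding], min_cvss: float = 7.0):
        """Split one review batch into (escalate_now, defer_to_next_cycle) CVE IDs."""
        by_cve: dict[str, list[Finding]] = {}
        for f in findings:  # deduplicate: many assets may report the same CVE
            by_cve.setdefault(f.cve_id, []).append(f)

        escalate, defer = [], []
        for cve_id, group in by_cve.items():
            scores = [f.cvss for f in group if f.cvss is not None]
            if not scores:                  # unenriched: wait for the next cycle
                defer.append(cve_id)
            elif max(scores) >= min_cvss:   # enriched and severe enough: act now
                escalate.append(cve_id)
            else:                           # enriched but below the threshold
                defer.append(cve_id)
        return escalate, defer

Working in windows like this keeps a flood of partially documented CVEs from drowning out the few that are both understood and severe; the deferred set is not discarded, it simply waits for enrichment or the next cycle.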

Focus on Known Exploits 

In OT environments, often only a small fraction of detected vulnerabilities have known exploits, which underscores that discovering new, unexploited CVEs does not always translate into immediate risk reduction.

  1. Effective Risk Reduction: Addressing vulnerabilities with known exploits should be the primary focus, as these pose a direct threat. Investing resources into understanding and mitigating these vulnerabilities can yield more significant risk reductions compared to addressing unenriched CVEs that may not have immediate or known impacts. 

  2. Augmentation of Data: While the NVD’s data is valuable, relying solely on it without augmentation can be limiting. Developing internal processes for augmenting NVD data, coupled with a focus on known exploitable vulnerabilities, can enhance an organization’s vulnerability management strategy. Below is an example of how Hexagon enriches the NVD data, followed by a simplified, generic sketch of the same idea.

[Figure: Example of how Hexagon enriches NVD data]
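As a generic illustration (not Hexagon’s enrichment pipeline, which the figure above depicts), the sketch below cross-references CVEs published in a given window, pulled from the public NVD 2.0 REST API, against the CISA Known Exploited Vulnerabilities (KEV) catalog, so a review batch can lead with the entries that are already being exploited. The URLs and JSON field names reflect the publicly documented feeds at the time of writing, the example date window is arbitrary, and pagination, API keys and error handling are omitted for brevity.

    import requests

    NVD_API = "https://services.nvd.nist.gov/rest/json/cves/2.0"
    KEV_FEED = ("https://www.cisa.gov/sites/default/files/feeds/"
                "known_exploited_vulnerabilities.json")


    def load_kev_ids() -> set[str]:
        """Return the CVE IDs currently listed in the CISA KEV catalog."""
        kev = requests.get(KEV_FEED, timeout=30).json()
        return {entry["cveID"] for entry in kev.get("vulnerabilities", [])}


    def fetch_recent_cves(start: str, end: str) -> list[dict]:
        """Fetch CVEs published in the given ISO-8601 window from the NVD 2.0 API."""
        params = {"pubStartDate": start, "pubEndDate": end, "resultsPerPage": 2000}
        resp = requests.get(NVD_API, params=params, timeout=60)
        resp.raise_for_status()
        return [item["cve"] for item in resp.json().get("vulnerabilities", [])]


    def triage_batch(start: str, end: str):
        """Split a batch of new CVEs into (known_exploited, everything_else)."""
        kev_ids = load_kev_ids()
        exploited, remainder = [], []
        for cve in fetch_recent_cves(start, end):
            (exploited if cve["id"] in kev_ids else remainder).append(cve)
        return exploited, remainder


    if __name__ == "__main__":
        # One weekly batch: review known-exploited CVEs first, park the rest.
        hot, rest = triage_batch("2024-06-01T00:00:00.000", "2024-06-07T23:59:59.999")
        print(f"{len(hot)} known-exploited CVEs to review now; {len(rest)} deferred")

The same cross-reference can be folded into the batched triage sketched earlier, so that known-exploited findings jump the queue regardless of how far enrichment has progressed.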

 

Conclusion 

The NVD’s backlog of unenriched CVEs poses challenges but is not an insurmountable issue for OT environments. The key lies in balancing the need for accurate and timely information with practical risk management strategies. By focusing on known exploits and leveraging a lag in data enrichment to reduce noise, organizations can better allocate their resources and enhance overall cybersecurity posture. 

About the Author

Edward Liebig is the Global Director of Cyber Ecosystem in Hexagon’s Asset Lifecycle Intelligence division. His career spans over four decades, with more than 30 of those years focused on cybersecurity. He has served as Chief Information Security Officer and cybersecurity captain for several multinational companies, and he has led Professional and Managed Security Services for the US critical infrastructure sector at two global system integrators. With this unique perspective, Edward leads the Cybersecurity Alliances for Hexagon PAS Cyber Integrity, leveraging his diverse experience to forge partnerships with service providers and technologies that combine collective strengths to best address clients’ security needs. Mr. Liebig is an adjunct professor at Washington University in St. Louis, where he teaches in the Master of Cybersecurity Management degree program.
