Measuring Risk: The Quantification Imperative

In part 1 of this series, we discussed the fundamental paradox of risk in asset management, where despite a universal conceptual understanding of risk, organizations consistently fail to manage it effectively. This allows early warning signs to go unheeded until disasters strike. We emphasized how excessive complexity in risk management systems ironically becomes a risk factor itself, consuming resources with bureaucratic processes rather than actual risk mitigation.   

In this blog, the second part of the series, we shift our focus to the measurement methodologies that underpin risk management. 

It is difficult to track and control what cannot be measured. This is particularly relevant in the domain of asset management, where organizations must implement robust methodologies to safeguard their resources against potential threats. 

So, in this installment, we investigate the evolution from traditional qualitative risk matrices to modern quantitative approaches, while critically assessing their respective strengths, limitations and the dangerous allure of methodological determinism in decision-making.  


Qualitative Risk Assessment 

Historically, capital-intensive organizations have relied predominantly on qualitative risk assessment frameworks, most notably color-coded risk matrices that categorize potential threats using ordinal classifications such as "high," "medium," and "low." These matrices, which originated in the petroleum industry before gaining wider adoption, typically present likelihood on one axis (ranging from "rare" to "likely") and consequence severity on the other (scaled from minimal financial impact to losses exceeding US $20 million). By assigning numerical values to each position and employing "traffic light" signals (green denoting acceptable risk, yellow indicating cautionary status and red signaling immediate intervention), this method and its associated tools have been adopted across numerous sectors. 
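As a concrete illustration, the matrix logic above can be sketched in a few lines of Python. The 5-point scales, score thresholds and function name below are hypothetical, not taken from any particular standard:

```python
# A minimal sketch of a 5x5 qualitative risk matrix (hypothetical scales).
# Likelihood and consequence are ordinal ranks 1-5; their product is
# bucketed into the familiar traffic-light zones.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "probable": 4, "likely": 5}
CONSEQUENCE = {"minimal": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

def matrix_rating(likelihood: str, consequence: str) -> str:
    """Return a traffic-light rating from ordinal likelihood/consequence ranks."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "red"      # immediate intervention
    if score >= 6:
        return "yellow"   # cautionary status
    return "green"        # acceptable risk

print(matrix_rating("rare", "severe"))    # ordinal product 5 -> green
print(matrix_rating("likely", "severe"))  # ordinal product 25 -> red
```

Note that the output depends entirely on where the color thresholds are drawn, which is itself a subjective choice and one source of the inconsistencies discussed below.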

Quantitative Risk Assessment 

Over the past 15 years, we have witnessed a paradigm shift toward quantitative risk analysis methodologies. Unlike their qualitative counterparts, these approaches employ precise numerical values to express both probability (rendered as percentages) and impact (calculated in concrete financial terms). Multiplying probability by consequence yields a specific risk value, an actionable metric for comparative analysis. For instance, an event with a 1% probability of occurrence and a US $10 million potential loss represents a quantifiable risk value of US $100,000. This calculation provides decision-makers with concrete data points for resource allocation and mitigation strategies. 
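The expected-value arithmetic described above is simple enough to express directly. This minimal Python sketch (the function name is illustrative) reproduces the worked example:

```python
def expected_loss(probability: float, impact_usd: float) -> float:
    """Quantitative risk value: probability of the event times its financial impact."""
    return probability * impact_usd

# The example from the text: a 1% chance of a US $10 million loss.
print(expected_loss(0.01, 10_000_000))  # -> 100000.0
```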

Is One More Effective Than the Other? 

Despite the ubiquity of risk matrices in organizational practice, emerging research has identified significant limitations in their application. The article "The Risk of Using Risk Matrices" by Thomas et al. describes several intrinsic flaws: 

  • Risk-acceptance inconsistency, wherein outcomes with lower expected losses may be prioritized over those with higher expected losses
  • Range compression, which misrepresents the magnitude of various risks
  • Centering bias, where evaluators avoid extreme values, thereby compressing the scale
  • Category-definition bias, resulting from subjective interpretations of qualitative descriptors such as "likely" or "severe"
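
The first of these flaws, risk-acceptance inconsistency, is easy to reproduce numerically. In the hypothetical sketch below (illustrative 5-point ordinal scales and dollar figures, not drawn from the article), the matrix scores one risk as more urgent even though the other carries five times the expected loss:

```python
# Two hypothetical risks (illustrative figures).
# Risk A: 40% chance of a US $100k loss; Risk B: 2% chance of a US $10M loss.
risk_a = {"likelihood": 4, "consequence": 2, "prob": 0.40, "impact": 100_000}
risk_b = {"likelihood": 1, "consequence": 5, "prob": 0.02, "impact": 10_000_000}

def cell_score(risk: dict) -> int:
    """Ordinal matrix score: likelihood rank times consequence rank."""
    return risk["likelihood"] * risk["consequence"]

def expected_loss(risk: dict) -> float:
    """Quantitative expected loss in dollars."""
    return risk["prob"] * risk["impact"]

# The matrix ranks A (score 8) above B (score 5) ...
print(cell_score(risk_a), cell_score(risk_b))        # -> 8 5
# ... yet B's expected loss is five times A's.
print(expected_loss(risk_a), expected_loss(risk_b))  # -> 40000.0 200000.0
```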

These deficiencies collectively undermine the reliability of risk matrices as decision-making instruments, and unreliable instruments increase rather than reduce risk exposure. In his book "The Failure of Risk Management: Why It's Broken and How to Fix It," risk management expert Doug Hubbard observes, "Except for certain quantitative methods in certain industries, the effectiveness of risk management is almost never measured." 

The ISO 55000 standard offers comprehensive guidance on integrating risk management within asset management frameworks. According to this standard, effective asset management necessitates defining functional and performance requirements, quantifying risk exposure, implementing appropriate mitigation measures and maintaining continuous monitoring protocols to ensure risk remains within acceptable parameters. The standard also emphasizes the importance of "systematically identifying, evaluating and controlling risks and opportunities, ensuring appropriate financial outcomes, and enhancing asset and service resilience, occupational health and safety, and environmental and social impact" (ISO 55000:2024).

Decision Making 

Yet, it is imperative to acknowledge the potential dangers of methodological determinism, regardless of whether one employs risk matrices or sophisticated quantitative approaches. When analytical frameworks become the sole arbiters of decision-making, critical contextual and ethical considerations may be overlooked. 

The 2004 science-fiction film "I, Robot" provides a poignant illustration. The main character, Detective Spooner, recounts an incident in which a robot, employing purely quantitative reasoning, rescued him rather than a young girl because "I had a 45% chance of survival. Sarah only had an 11% chance." Spooner questions whether a percentage alone should determine action. 

Both qualitative and quantitative methodologies can create an illusory sense of objectivity and precision. The research by Thomas et al. cited above demonstrates how risk matrices produce arbitrary results despite their authoritative appearance. Similarly, even sophisticated quantitative methods require human judgment for interpretation and for consideration of factors that resist quantification. As with the robot in "I, Robot," a decision can be quantitatively sound yet coldly objective; the question is whether the numbers alone make it appropriate. 

Effective decision-making necessitates a nuanced understanding of both the capabilities and limitations of analytical methods and the outcomes they produce. While analysis can illuminate certain aspects of a situation, quantify specific factors and identify patterns, optimal decisions must integrate these insights with contextual knowledge, ethical considerations, and human judgment that transcend methodological boundaries.  

Asset Risk Analyzer 

We invite you to explore the Asset Risk Analyzer, a free, do-it-yourself tool that helps you use the information you already have to quantify asset risk exposure. In a few simple steps, the Asset Risk Analyzer produces an expected-value calculation: a monetized, directional estimate of how much asset risk you are exposed to. It compares your asset data against models in its library of similar assets. Beyond identifying your under-performing assets, it can calculate the magnitude of the opportunity. 

Are you ready to elevate your visibility into asset risk, cost and performance opportunities? Experience the capabilities of the Asset Risk Analyzer for yourself, and take the first step toward a more systematic way to safely maximize predictable production at the lowest sustainable cost. 

Try it for free today 

About the Author

Asset management domain expert committed to taking the fun and excitement out of asset management. Three decades of international standards, enterprise advisory, digital solutions, and implementation experience. Marc has helped deliver asset management solutions to water services, electric utility, power generation, process manufacturing, mining, chemical, and fleet organizations on six continents. He has been a contributing member of ISO Technical Committee 251 since 2010, representing the interests of the USA, and has served in leadership roles including Chair of the ANSI Technical Advisory Group and first Convener of ISO 55011, Guidance for development and application of public policy to enable asset management.
