Workshop on Quantitative Cybersecurity (QuCS)
The workshop addresses the core concepts of Quantitative Cybersecurity (QuCS).
QuCS is driven by the need for system-level security metrics: developing robust, quantitative security metrics has been recognized by key organizations as one of the "hard problems" in cybersecurity. Conventional metrics often capture only static aspects of security. This workshop tackles that challenge by exploring the emerging field of Cybersecurity Dynamics (CD).
Computing systems are inherently dynamic, so it is important to focus on the evolution of the global cybersecurity state, which is continuously shaped by cyber attack–defense interactions. Understanding this evolution is crucial: cyber attacks are inevitable, and defenders must know the time-dependent security state to manage risk effectively.
The core research objectives of QuCS are centered on achieving models with descriptive, prescriptive, and predictive capabilities regarding this evolution. This framework naturally leads to the notion of macroscopic cybersecurity, where model parameters abstract the power of microscopic attack/defense mechanisms.
Key themes include: four time-dependent sub-metrics (system vulnerabilities, defense power, attack severity, and situation understanding); time-dependent analysis of phenomena such as Time-To-Compromise (TTC/MTTC) using stochastic process models; and frameworks such as TRAM (Trust, Resilience, and Agility Metrics).
The workshop will cover the following key topics in quantitative and dynamic cybersecurity:
Cybersecurity Dynamics (CD) Concept: Defining CD as the evolution of the global security state S(t) over time, viewing it as a "natural phenomenon" in cyberspace.
The Quantitative Metric Framework: Proposing a framework for system-level security measurement based on four key time-dependent sub-metrics derived from attack-defense interactions: (1) System Vulnerabilities V(t), (2) Defense Power D(t), (3) Attack/Threat Severity A(t), and (4) Situation situation(t). The outcome situation(t) can be represented as a mathematical function of the other three metrics: situation(t) = f(V(t), D(t), A(t)).
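The framework leaves the combining function f unspecified; the sketch below is a minimal illustration using a hypothetical weighted form in which higher vulnerability and attack severity worsen the situation score while higher defense power improves it. The weights, normalization, and sample timeline are assumptions for illustration, not part of the framework.

```python
# Hypothetical sketch of situation(t) = f(V(t), D(t), A(t)).
# The combining function f is NOT specified by the QuCS framework;
# this weighted form and its weights are illustrative assumptions.

def situation(v, d, a, w_v=0.4, w_a=0.4, w_d=0.2):
    """v, a in [0, 1]: normalized vulnerability and attack severity (higher = worse).
    d in [0, 1]: normalized defense power (higher = better).
    Returns a risk-oriented score in [0, 1] (higher = worse situation)."""
    score = w_v * v + w_a * a + w_d * (1.0 - d)
    return min(max(score, 0.0), 1.0)

# Evaluate along a short (invented) timeline of (V, D, A) samples.
timeline = [(0.6, 0.3, 0.7), (0.5, 0.6, 0.4), (0.2, 0.8, 0.1)]
scores = [situation(v, d, a) for v, d, a in timeline]
```

The point of the sketch is only that situation(t) is a derived, time-dependent outcome of the three underlying sub-metrics, so any monotone combining function would serve the same illustrative role.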
Modeling Challenges and Barriers: Discussing the intrinsic technical hurdles in studying CD, including the scalability barrier (exponentially large state space), nonlinearity barrier (highly nonlinear system dependence), dependence barrier (due to common vulnerabilities), and non-equilibrium (transient behavior) barrier.
Emergent Behavior: Analyzing the fundamental implication that emergent behavior is inherent to cybersecurity, where security properties of a system are not possessed or implied by its lower-level components ("1+1 > 2" effect).
Time-To-Compromise (TTC/MTTC): Analysis of the Mean Time-To-Compromise (MTTC) as a crucial metric for estimating the effort and time required for an attacker to compromise a system component.
Composite Stochastic Process Model: Modeling TTC as a random process composed of three subprocesses: (1) an exploit is readily available, (2) an exploit must be found or written, and (3) identification of new vulnerabilities and exploits (running in parallel with the first two).
Uncertainty and Distribution: The rationale for representing TTC (or TTCICS, for Industrial Control Systems) as a probability distribution (e.g., log-normal distribution, based on prior studies) due to the often-unknown attacker skill level (novice, beginner, intermediate, expert).
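Under these assumptions, MTTC can be estimated by Monte Carlo sampling of a log-normal TTC whose parameters vary with attacker skill. The (mu, sigma) pairs below are invented placeholders for illustration, not calibrated values from any study cited by the workshop.

```python
import random
import statistics

# Minimal sketch: TTC as a log-normal random variable whose parameters
# depend on attacker skill level. The (mu, sigma) values are hypothetical
# placeholders (time unit: days), not calibrated figures.
SKILL_PARAMS = {
    "novice":       (4.0, 0.8),
    "beginner":     (3.4, 0.7),
    "intermediate": (2.8, 0.6),
    "expert":       (2.0, 0.5),
}

def sample_ttc(skill, rng):
    mu, sigma = SKILL_PARAMS[skill]
    return rng.lognormvariate(mu, sigma)

def estimate_mttc(skill, n=20000, seed=42):
    """Monte Carlo estimate of the Mean Time-To-Compromise."""
    rng = random.Random(seed)
    return statistics.fmean(sample_ttc(skill, rng) for _ in range(n))

mttc_novice = estimate_mttc("novice")   # higher: unskilled attackers take longer
mttc_expert = estimate_mttc("expert")   # lower: experts compromise faster
```

Representing TTC as a distribution rather than a point value lets the defender reason over the whole range of attacker skill levels when the actual skill is unknown.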
Dynamic Defense Models: Overview of quantitative analysis using stochastic process models (cyber epidemic dynamics) to derive macroscopic security metrics such as the probability that a node is compromised at time t or the expected number of compromised nodes at time t.
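As a rough sketch of such epidemic-style metrics, the mean-field iteration below tracks p(t), an approximation of the probability that a node is compromised at time t on a k-regular network; n * p(t) then approximates the expected number of compromised nodes. The model form and all parameter values are illustrative assumptions, not the workshop's specific model.

```python
# Minimal sketch of cyber epidemic dynamics: a mean-field, discrete-time
# model on a k-regular network. p[t] approximates the probability that a
# node is compromised at time t; n * p[t] is the expected number of
# compromised nodes. All parameters are illustrative assumptions.

def epidemic_trajectory(p0, beta, delta, k, steps):
    """beta: per-edge infection prob., delta: cure prob., k: node degree."""
    p = p0
    traj = [p]
    for _ in range(steps):
        infect = 1.0 - (1.0 - beta * p) ** k   # prob. of >= 1 infecting neighbor
        p = p * (1.0 - delta) + (1.0 - p) * infect
        traj.append(p)
    return traj

n = 1000   # network size (illustrative)
traj = epidemic_trajectory(p0=0.01, beta=0.05, delta=0.2, k=8, steps=200)
expected_compromised = n * traj[-1]
```

With these parameters the infection rate outweighs the cure rate, so the trajectory settles at a nonzero equilibrium fraction of compromised nodes, which is exactly the kind of macroscopic security metric these models are meant to deliver.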
Limits of Predictability: Exploration of active cyber defense dynamics models demonstrating phenomena like bifurcation and chaos, indicating a fundamental limit on cybersecurity measurement and predictability in certain parameter regimes.
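The sensitivity that underlies this predictability limit can be illustrated with a textbook chaotic map. The logistic map below is a generic stand-in, not the actual active cyber defense dynamics model: two states that start almost identically diverge quickly, bounding how far ahead the security state can be predicted in a chaotic parameter regime.

```python
# Illustrative only: the logistic map in its chaotic regime (r = 4) as a
# generic stand-in for dynamics in a chaotic parameter region. Two nearly
# identical initial states diverge, limiting long-horizon prediction.

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x, y = 0.2, 0.2 + 1e-8   # nearly identical initial states
max_gap = 0.0
for _ in range(60):
    x, y = logistic(x), logistic(y)
    max_gap = max(max_gap, abs(x - y))
```

After a few dozen iterations the gap between the two trajectories grows to the scale of the state space itself, so any measurement error in the initial security state eventually dominates the forecast.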
The TRAM Framework: Introduction of the Trust, Resilience, and Agility Metrics (TRAM) framework used to measure the multidimensional quality and trustworthiness of systems, encompassing security, dependability, and human factors.
Cyber Agility Metrics: Quantifying cyber agility, defined as the ability of an entity to be effective when facing a dynamic situation or unexpected circumstances. This involves systematic measurement of attack and defense evolution generations.
Timeliness-Oriented Agility Metrics: Detailed definitions and application of time-based metrics for dynamic strategies, including:
Generation-Time (GT): Time between two consecutive strategy evolutions.
Effective-Generation-Time (EGT): Time taken to evolve a generation that increases effectiveness against the opponent.
Triggering-Time (TT): Time elapsed since an opponent's reference generation that may have triggered a particular response.
Lagging-Behind-Time (LBT): Measures how far one party lags behind its opponent with respect to a reference time.
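Given timestamped strategy generations, GT, EGT, and TT can be computed directly from event logs. The sketch below uses invented event data and simplified definitions for illustration; the exact formulations in the agility-metrics literature may differ in detail.

```python
# Minimal sketch of timeliness-oriented agility metrics, computed from
# hypothetical timestamped strategy generations. Each generation is a
# (time, effectiveness) pair; the event data below is invented.

def generation_times(gens):
    """GT: time between consecutive strategy evolutions."""
    times = [t for t, _ in gens]
    return [b - a for a, b in zip(times, times[1:])]

def effective_generation_times(gens):
    """EGT: GT restricted to evolutions that increase effectiveness."""
    return [b_t - a_t
            for (a_t, a_e), (b_t, b_e) in zip(gens, gens[1:])
            if b_e > a_e]

def triggering_times(own_gens, opponent_times):
    """TT: for each own generation, time since the most recent earlier
    opponent generation that may have triggered it (None if none exists)."""
    tts = []
    for t, _ in own_gens:
        earlier = [ot for ot in opponent_times if ot < t]
        tts.append(t - max(earlier) if earlier else None)
    return tts

defender = [(0.0, 0.50), (3.0, 0.55), (5.0, 0.52), (9.0, 0.70)]
attacker_times = [1.0, 4.0, 8.0]

gt = generation_times(defender)             # [3.0, 2.0, 4.0]
egt = effective_generation_times(defender)  # [3.0, 4.0]
tt = triggering_times(defender, attacker_times)
```

Comparing EGT against GT shows how many evolutions actually improved effectiveness, while small TT values suggest a party is reacting promptly to its opponent's moves.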
SALVATORE DISTEFANO, Full Professor at the University of Messina, Department of Mathematics and Computer Sciences, Physical Sciences and Earth Sciences, sdistefano@unime.it
MAURIZIO GIACOBBE, Assistant Professor at the University of Messina, Department of Mathematics and Computer Sciences, Physical Sciences and Earth Sciences, mgiacobbe@unime.it