
Learning-Based Approaches to Predictive Monitoring with Conformal Statistical Guarantees

Cairoli F.; Bortolussi L.
2023-01-01

Abstract

This tutorial focuses on efficient methods for predictive monitoring (PM), the problem of detecting at runtime future violations of a given requirement from the current state of a system. While performing model checking at runtime would offer a precise solution to the PM problem, it is generally computationally expensive. To address this scalability issue, several lightweight approaches based on machine learning have recently been proposed. These approaches work by learning an approximate yet efficient surrogate (deep learning) model of the expensive model checker. A key challenge remains to ensure reliable predictions, especially in safety-critical applications. We review our recent work on predictive monitoring, one of the first to propose learning-based approximations for the verification of cyber-physical systems (CPS) against temporal logic specifications, and the first in this context to apply conformal prediction (CP) for rigorous uncertainty quantification. These CP-based uncertainty estimators offer statistical guarantees on the generalization error of the learning model, and they can be used to determine unreliable predictions that should be rejected. In this tutorial, we present a general and comprehensive framework summarizing our approach to the predictive monitoring of CPSs, examining in detail several variants determined by three main dimensions: system dynamics (deterministic, non-deterministic, stochastic), state observability, and semantics of requirements' satisfaction (Boolean or quantitative).
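To make the CP-based rejection idea concrete, the following is a minimal Python sketch of split (inductive) conformal prediction applied to a generic surrogate classifier that predicts requirement satisfaction from a system state. The function names, the binary safe/unsafe labelling, the nonconformity score (one minus the predicted probability of the true label), and the singleton-region acceptance rule are illustrative assumptions for this sketch, not the exact construction presented in the tutorial.

```python
import numpy as np

def calibration_scores(probs_cal, labels_cal):
    """Nonconformity score on the calibration set:
    1 minus the surrogate's predicted probability of the true label."""
    return 1.0 - probs_cal[np.arange(len(labels_cal)), labels_cal]

def conformal_region(probs_test, cal_scores, epsilon):
    """For each test state, return the set of labels whose conformal
    p-value exceeds the significance level epsilon."""
    n_cal = len(cal_scores)
    regions = []
    for probs in probs_test:
        region = []
        for label in range(len(probs)):
            score = 1.0 - probs[label]
            # p-value: fraction of calibration scores at least as nonconforming
            p_value = (np.sum(cal_scores >= score) + 1) / (n_cal + 1)
            if p_value > epsilon:
                region.append(label)
        regions.append(region)
    return regions

def monitor_with_rejection(probs_test, cal_scores, epsilon=0.05):
    """Accept the surrogate's prediction only when the conformal region
    is a singleton; otherwise flag the prediction as unreliable."""
    decisions = []
    for region in conformal_region(probs_test, cal_scores, epsilon):
        if len(region) == 1:
            decisions.append(("accept", region[0]))
        else:
            decisions.append(("reject", None))  # empty or ambiguous region
    return decisions
```

Under the standard exchangeability assumption of CP, regions built this way contain the true label with probability at least 1 - epsilon, which is what allows unreliable (non-singleton) predictions to be detected and rejected.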
Files in this item:

RV23_NPM_Tutorial.pdf
  Access: Closed access
  Type: Published version
  License: Publisher copyright
  Size: 919.66 kB
  Format: Adobe PDF

RV23_NPM_Tutorial.pdf
  Access: Under embargo until 01/10/2024
  Type: Final post-refereeing draft (post-print)
  License: Publisher copyright
  Size: 1.07 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/11368/3065898
Citations
  • Scopus 0