Robustness refers to the ability of a system to behave reliably in the presence of perturbations in its parameters or irregularities in its operating environment. This is particularly important in the context of embedded systems and software, which interact with a physical environment through sensors and actuators and communicate over wired or wireless networks. Such systems are routinely subject to deviations arising from sensor or actuation noise, quantization and sampling of data, uncertainty in the physical environment, and delays or packet drops over unreliable network channels. When such systems are deployed in safety-critical applications, robustness in the presence of uncertainty is not just desirable but crucial.
Our aim is to foster dialogue and exchange of ideas and techniques across several disciplines with an interest in robustness, such as formal verification, programming languages, fault tolerance, control theory, and hybrid systems. The first DARS workshop will feature a number of invited speakers who will share their diverse perspectives on the robustness problem. The workshop will be informal, with plenty of time for Q&A and brainstorming sessions.
Early registration: ***18 March, 2016***
Workshop: 11 April, 2016
April 11, 2016
- 8:45-9:00: Welcome
- 9:00-9:45: Roderick Bloem
- 9:45-10:30: Georgios Fainekos
- 10:30-11:00: ***Coffee Break***
- 11:00-11:45: Antoine Girard
- 11:45-12:30: Matthias Rungger
- 12:30-14:00: ***Lunch Break***
- 14:00-14:45: Jan Otop
- 14:45-15:30: Pavithra Prabhakar
- 15:30-16:00: ***Coffee Break***
- 16:00-16:45: Ulrich Schmid
- 16:45-17:30: Discussion and closing remarks
Formal property verification in a conformance testing framework
In Model-Based Design (MBD) of Cyber-Physical Systems (CPS), such as switched mixed-signal circuits or software-controlled physical systems, it is common to develop a sequence of system models of different fidelity and complexity, each appropriate for a particular design or verification task. In such a sequence, one model is often derived from another by a process of simplification or implementation. For example, a Simulink model might be implemented on an embedded processor via automatic code generation. Three questions naturally present themselves: How do we quantify closeness between the two models? How can we measure such closeness? If the original model satisfies some formal property, can we automatically infer what properties are then satisfied by the derived model? This talk addresses all three questions: We quantify the closeness between original and derived model via a distance measure between their outputs. We then propose two computational methods for approximating this closeness measure. Finally, we derive syntactic rewriting rules which, when applied to a Metric Temporal Logic (MTL) specification satisfied by the original model, produce a formula satisfied by the derived model. We demonstrate our framework through several experiments.
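The first two questions can be made concrete with a small numeric sketch. The metrics below are illustrative stand-ins, not the speaker's actual conformance measure: we compare a hypothetical "original" output trace against a "derived" one (delayed and quantized, as code generation might produce), first with a plain sup-norm distance and then with a simplified time-shift-tolerant distance in the spirit of (τ, ε)-closeness.

```python
import numpy as np

def sup_distance(y1, y2):
    """Pointwise sup-norm distance between two sampled output traces."""
    return float(np.max(np.abs(y1 - y2)))

def conformance_distance(y1, y2, ts, tau):
    """(tau, eps)-style distance: for each sample of y1, find the closest
    value of y2 within a time window of radius tau, then take the worst
    case over all samples (a simplified sketch)."""
    eps = 0.0
    for i, t in enumerate(ts):
        window = np.abs(ts - t) <= tau
        eps = max(eps, float(np.min(np.abs(y2[window] - y1[i]))))
    return eps

# Hypothetical traces: an "original" model output and a "derived"
# (e.g. code-generated) output with a small delay and quantization.
ts = np.linspace(0.0, 2.0, 201)
y_orig = np.sin(2 * np.pi * ts)
y_impl = np.round(np.sin(2 * np.pi * (ts - 0.02)) * 32) / 32

d_sup = sup_distance(y_orig, y_impl)
d_conf = conformance_distance(y_orig, y_impl, ts, tau=0.05)
```

Allowing a time shift can only shrink the distance, so `d_conf <= d_sup` always holds; the gap between the two quantifies how much of the output deviation is mere retiming rather than genuine value error.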
Robust Stability Verification and Timing Contract Synthesis for Linear Impulsive Systems using Reachability Analysis
This talk deals with stability analysis for a class of linear impulsive systems subject to a timing contract specifying bounds on the time between two consecutive impulses. We consider the problem of stability verification, which consists in proving stability for a particular timing contract, and the problem of timing contract synthesis, which consists in synthesizing a set of timing contracts that guarantee the stability of the linear impulsive system.
Our approach is based on a reformulation using parameterized difference inclusions. We derive theoretical necessary and sufficient conditions for stability based on the propagation of a set by the system dynamics. For linear impulsive systems, this allows us to design a stability verification algorithm using reachability analysis.
We then propose an approach to timing contract synthesis, which exploits the monotonicity of stability with respect to timing contract parameters to design an algorithm based on adaptive sampling of the parameter space. Several examples are provided, which allow us to compare our algorithm with several existing techniques, and show the effectiveness of our approach.
Automata-based approach to measuring robustness
We regard a system as robust if it continues to work correctly despite perturbations. The perturbation model is crucial here; we therefore refer to robustness of a system with respect to certain specific perturbations. Also, it is unlikely that a system is completely immune to perturbations. We therefore measure the level of (specific) perturbations tolerated by the system, i.e., we consider robustness as a quantitative property.
In this talk, we present an automata-based approach to robustness, where perturbations are modeled by weighted automata. The values returned by weighted automata correspond to the level of perturbation, which gives rise to a quantitative approach to robustness. We consider two types of systems, closed and reactive systems, with two different frameworks capturing robustness: model measuring for closed systems and Lipschitz continuity for reactive systems. Both closed and reactive systems are considered in two variants, discrete and continuous. While we briefly describe these frameworks, we focus primarily on the design choices leading to these frameworks.
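To give a flavor of perturbation levels in the weighted-automaton style (an illustrative sketch, not the talk's model-measuring framework), the dynamic program below assigns to a word the minimal number of symbol substitutions after which a specification DFA is violated, i.e., the level of perturbation the word tolerates. The "no two consecutive b's" specification is a hypothetical example.

```python
# A weighted-automaton-style computation: cost-annotated transitions
# (0 for the intended symbol, 1 for a perturbed one) and a min over runs.

def min_flips_to_violate(word, delta, start, accepting, alphabet):
    """DP over (position, DFA state): cost[q] = fewest substitutions
    so far on a run that ends in state q."""
    INF = float("inf")
    cost = {q: INF for q in delta}
    cost[start] = 0
    for a in word:
        new_cost = {q: INF for q in delta}
        for q, c in cost.items():
            if c == INF:
                continue
            for b in alphabet:                # read b in place of a
                step = 0 if b == a else 1     # weight of the perturbation
                q2 = delta[q][b]
                new_cost[q2] = min(new_cost[q2], c + step)
        cost = new_cost
    # Minimal cost of ending in a rejecting (violating) state.
    return min((c for q, c in cost.items() if q not in accepting), default=INF)

# Hypothetical spec: "never two consecutive b's" over alphabet {a, b}.
delta = {
    0: {"a": 0, "b": 1},   # 0: last symbol was not b
    1: {"a": 0, "b": 2},   # 1: last symbol was b
    2: {"a": 2, "b": 2},   # 2: violation (sink)
}
print(min_flips_to_violate("ababa", delta, 0, {0, 1}, "ab"))
```

A larger returned value means more perturbation is needed to break the specification, so the word (and, lifted over all behaviors, the system) is more robust with respect to this perturbation model.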
A Notion of Robustness for Cyber-Physical Systems
Robustness as a system property describes the degree to which a system is able
to function correctly in the presence of disturbances, i.e., unforeseen or
erroneous inputs. In this talk, we present a notion of robustness termed
input-output dynamical stability for cyber-physical systems (CPS) which merges
existing notions of robustness for continuous systems and discrete systems.
The notion captures two intuitive aims of robustness: bounded disturbances have
bounded effects and the consequences of a sporadic disturbance disappear over
time. We present a design methodology for robust CPS which is based on an
abstraction and refinement process. We suggest several novel notions of
simulation relations to ensure the soundness of the approach. In addition, we
show how such simulation relations can be constructed compositionally. The
different concepts and results are illustrated throughout with small examples.
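The two intuitive aims can be seen on a toy scalar system (our own minimal sketch, not a CPS from the talk): for the contractive discrete-time system x⁺ = 0.5·x + d, bounded disturbances produce a bounded effect, and the effect of a single sporadic disturbance dies out.

```python
# Toy illustration of the two aims of input-output dynamical stability
# for the contractive system x+ = a*x + d with a = 0.5.

def simulate(x0, disturbances, a=0.5):
    xs = [x0]
    for d in disturbances:
        xs.append(a * xs[-1] + d)
    return xs

# Aim 1: bounded disturbance -> bounded effect.
# The geometric sum bounds |x| by sup|d| / (1 - a) = 0.3 / 0.5 = 0.6.
xs = simulate(0.0, [0.3] * 50)
assert max(abs(x) for x in xs) <= 0.6 + 1e-9

# Aim 2: sporadic disturbance -> effect disappears over time.
# A single unit impulse decays geometrically toward zero.
xs = simulate(0.0, [1.0] + [0.0] * 30)
assert abs(xs[-1]) < 1e-6
```

The contraction factor plays the role that the abstraction and refinement process must preserve: a sound simulation relation may not turn a contractive concrete system into a non-contractive abstract one.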
Reconciling Fault-Tolerance and Robustness?
In this talk, I will provide an overview of classic fault-tolerance
techniques, with a distinct focus on fault-tolerant distributed
algorithms and self-stabilization. Emphasis will also be put on
assumption coverage analysis and how it complements correctness
proofs. I will relate classic fault-tolerance properties, which are
essentially all-or-nothing, to robustness properties, which are
viewed as continuity properties. Instances of gracefully degrading
distributed algorithms will be used to demonstrate that
fault-tolerance and robustness can be reconciled, although such
solutions must still be considered the exception today.
Algorithmic Verification of Stability of Hybrid Systems
Hybrid systems refer to systems exhibiting mixed discrete-continuous behaviors and arise as a natural byproduct of the interaction of a network of embedded processors with physical systems.
In this talk, we focus on the verification of stability of hybrid
systems. Stability is a fundamental property in control system
design and captures the notion that small perturbations to the
initial state or input of the system result in only small
variations in the behavior of the system. We present
foundations and automated methods for approximation-based
analysis of stability, including novel predicate abstraction and
hybridization techniques. In contrast to the well-known methods
for stability analysis based on Lyapunov functions, which are
deductive, we present algorithmic techniques.
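For contrast with the algorithmic techniques of the talk, the classical deductive Lyapunov approach for a discrete-time linear system x⁺ = Ax can be sketched in a few lines (the system matrix below is a hypothetical stable example): solve the discrete Lyapunov equation AᵀPA − P = −Q and check that P is positive definite, which certifies V(x) = xᵀPx as a Lyapunov function.

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

# Deductive baseline: for x+ = A x, find P > 0 with A^T P A - P = -Q.
A = np.array([[0.5, 0.1], [0.0, 0.8]])   # hypothetical stable dynamics
Q = np.eye(2)
P = solve_discrete_lyapunov(A.T, Q)      # solves A^T P A - P = -Q

# P must be symmetric positive definite ...
assert np.all(np.linalg.eigvalsh(P) > 0)
# ... and the energy V(x) = x^T P x must strictly decrease along steps.
assert np.all(np.linalg.eigvalsh(A.T @ P @ A - P) < 0)
```

The deductive step here is finding the certificate P, which for nonlinear and hybrid systems has no such closed-form recipe; the algorithmic techniques of the talk avoid searching for a Lyapunov function altogether.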