Synthesis of Maximally Permissive Covert Attackers Against Unknown Supervisors by Using Observations


Abstract

In this paper, we consider the problem of synthesizing maximally permissive covert damage-reachable attackers in the setup where the model of the supervisor is unknown to the adversary, but the adversary has recorded a (prefix-closed) finite set of observations of the runs of the closed-loop system. The synthesized attacker needs to ensure both damage-reachability and covertness against all supervisors that are consistent with the given set of observations. Note that there is a gap between the de facto maximal permissiveness, which assumes the model of the supervisor is known, and the maximal permissiveness that can be attained, from the adversary's point of view, with only limited knowledge of the supervisor. We consider the setup where the attacker can carry out sensor replacement/deletion attacks and actuator enablement/disablement attacks. The solution methodology proposed in this work is to reduce the synthesis of maximally permissive covert damage-reachable attackers, given the model of the plant and the finite set of observations, to the synthesis of maximally permissive safe supervisors for a suitably transformed plant, which establishes the decidability of the observation-assisted covert attacker synthesis problem. The effectiveness of our approach is illustrated on a water tank example adapted from the literature.
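To make the consistency requirement concrete, the following is a minimal illustrative sketch, not taken from the paper, of what it means for a candidate supervisor to be consistent with a prefix-closed set of observations: every recorded run must be reproducible in the closed loop formed by the plant and that supervisor. All concrete names here (PLANT, CONTROLLABLE, OBSERVATIONS, the candidate policies) are hypothetical toy data introduced purely for illustration.

# Illustrative sketch only: consistency of a candidate supervisor with
# a prefix-closed set of recorded observations. Toy data, not the paper's model.

# Toy plant modelled as a deterministic finite automaton over events a, b, c.
PLANT = {
    ("q0", "a"): "q1",
    ("q0", "b"): "q2",
    ("q1", "c"): "q2",
}
INIT = "q0"
CONTROLLABLE = {"a", "c"}                  # events the supervisor may disable
OBSERVATIONS = {(), ("a",), ("a", "c")}    # prefix-closed recorded runs

def closed_loop_runs(policy, state=INIT, run=(), limit=3):
    """Enumerate runs of the plant under a supervisor `policy`, where
    policy(run) returns the set of controllable events it enables after run."""
    yield run
    if len(run) >= limit:
        return
    for (src, event), dst in PLANT.items():
        if src != state:
            continue
        if event in CONTROLLABLE and event not in policy(run):
            continue                        # event disabled by the supervisor
        yield from closed_loop_runs(policy, dst, run + (event,), limit)

def consistent(policy):
    """A supervisor is consistent with the data iff every recorded
    observation is a run of the corresponding closed-loop system."""
    return OBSERVATIONS <= set(closed_loop_runs(policy))

# A fully permissive candidate is consistent with the recorded runs,
# while one that disables c after observing a is not.
print(consistent(lambda run: {"a", "c"}))                               # True
print(consistent(lambda run: {"a"} if run == ("a",) else {"a", "c"}))   # False

A covert damage-reachable attacker in the sense of the abstract must then succeed against every supervisor for which such a consistency check passes, not merely against a single known supervisor model.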
