Partially observable Markov decision processes (POMDPs) are widely used to model the probabilistic behavior of complex systems. Compared with MDPs, a POMDP models a system more accurately, but solving a POMDP generally takes time exponential in the size of its state space. This makes formal verification and synthesis much more challenging for POMDPs, especially when multiple system components are involved. As a promising technique for reducing verification complexity, abstraction seeks an abstract system with a smaller state space that still preserves enough properties for the verification at hand. While abstraction-based verification has been explored extensively for MDPs, this paper presents the first results on POMDP abstraction and its refinement. The main idea follows the counterexample-guided abstraction refinement (CEGAR) framework: starting with a coarse initial abstraction of the POMDP, we iteratively use counterexamples returned by formal verification to refine the abstraction until the abstract system suffices to infer the verification result for the original POMDP. Our main contributions are twofold: 1) we propose a novel abstract system model for POMDPs and a new simulation relation that captures partial observability, and prove that it preserves a fragment of Probabilistic Computation Tree Logic (PCTL); 2) to find an abstract system that can prove or disprove the satisfaction relation on the concrete POMDP, we develop a novel refinement algorithm. Together, these results yield a sound and complete CEGAR framework for POMDPs.
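The CEGAR loop described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the callables `abstract_pomdp`, `model_check`, `is_realizable`, and `refine` are hypothetical placeholders standing in for the paper's building blocks (constructing the abstract system, model checking the PCTL fragment, checking whether a counterexample is realizable on the concrete POMDP, and refining the abstraction).

```python
from typing import Any, Callable, Tuple


def cegar_verify(
    pomdp: Any,
    pctl_property: Any,
    initial_partition: Any,
    abstract_pomdp: Callable[[Any, Any], Any],
    model_check: Callable[[Any, Any], Tuple[bool, Any]],
    is_realizable: Callable[[Any, Any], bool],
    refine: Callable[[Any, Any], Any],
) -> bool:
    """Counterexample-guided abstraction refinement loop for a POMDP.

    Returns True if the property is proved via the abstraction and
    False if a realizable counterexample refutes it on the concrete POMDP.
    """
    partition = initial_partition
    while True:
        # Build the abstract system induced by the current state partition.
        abstraction = abstract_pomdp(pomdp, partition)
        holds, counterexample = model_check(abstraction, pctl_property)
        if holds:
            # The simulation relation preserves the PCTL fragment,
            # so satisfaction on the abstraction carries over.
            return True
        if is_realizable(counterexample, pomdp):
            # The counterexample corresponds to real behavior of the
            # concrete POMDP, so the property is refuted.
            return False
        # Spurious counterexample: refine the abstraction to rule it out.
        partition = refine(partition, counterexample)
```

In this sketch, termination and the prove/disprove guarantee correspond to the soundness and completeness claims of the framework; the loop only exits when the abstraction either verifies the property or yields a counterexample that is realizable on the original POMDP.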
We study strategy synthesis for partially observable Markov decision processes (POMDPs). The particular problem is to determine strategies that provably adhere to (probabilistic) temporal logic constraints. This problem is computationally intractable
We study finite-state controllers (FSCs) for partially observable Markov decision processes (POMDPs) that are provably correct with respect to given specifications. The key insight is that computing (randomised) FSCs on POMDPs is equivalent to - and
Numerical tools for constraint solving are a cornerstone of control verification problems. This is evident from the plethora of research that uses tools like linear and convex programming for the design of control systems. Nevertheless, the capability
We present a probabilistic version of PCF, a well-known simply typed universal functional language. The type hierarchy is based on a single ground type of natural numbers. Even if the language is globally call-by-name, we allow a call-by-value evalua
Markov Decision Processes (MDPs) are a popular class of models suitable for solving control decision problems in probabilistic reactive systems. We consider parametric MDPs (pMDPs) that include parameters in some of the transition probabilities to ac