Stochastic control of optimized certainty equivalents


Abstract

Optimized certainty equivalents (OCEs) form a family of risk measures widely used by both practitioners and academics. This is mostly due to their tractability and the fact that they encompass important examples, including entropic risk measures and the average value at risk. In this work we consider stochastic optimal control problems where the objective criterion is given by an OCE risk measure; put differently, we study a risk minimization problem for controlled diffusions. A major difficulty arises because OCEs are often time inconsistent. Nevertheless, via an enlargement of the state space we obtain a substitute of sorts for time consistency in fair generality. This allows us to derive a dynamic programming principle and thus recover central results of (risk-neutral) stochastic control theory. In particular, we show that the value of our risk minimization problem can be characterized via the viscosity solution of a Hamilton--Jacobi--Bellman--Isaacs equation. We further establish the uniqueness of the latter under suitable technical conditions.
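For reference, a common formulation of the OCE (following Ben-Tal and Teboulle; the sign convention below is an assumption and may differ from the one used in the paper) evaluates a loss $X$ as
\[
\rho_u(X) \;=\; \inf_{\eta \in \mathbb{R}} \Bigl\{ \eta + \mathbb{E}\bigl[u(X - \eta)\bigr] \Bigr\},
\]
where $u$ is a convex, nondecreasing loss function with $u(0) = 0$. Choosing $u(x) = \tfrac{1}{1-\alpha}\, x^+$ recovers the average value at risk at level $\alpha$ (the Rockafellar--Uryasev representation), while $u(x) = \tfrac{1}{\gamma}\bigl(e^{\gamma x} - 1\bigr)$ yields the entropic risk measure $\tfrac{1}{\gamma}\log \mathbb{E}\bigl[e^{\gamma X}\bigr]$.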
