This article treats optimal sparse control problems with multiple constraints defined at intermediate points of the time domain. For such problems we first establish a new Pontryagin maximum principle that provides first-order necessary conditions for optimality. We then introduce and employ a new numerical algorithm that extracts, in a computationally tractable fashion, optimal state-action trajectories from the necessary conditions furnished by our maximum principle. Several detailed illustrative examples are included.
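To fix ideas, a schematic formulation of this problem class, with the sparsity-promoting $\ell^1$ penalty and the placeholder data $\ell$, $f$, $g_i$, and intermediate instants $t_i$ chosen here purely for illustration (the abstract does not fix a specific form), reads
\[
\begin{aligned}
&\min_{u(\cdot)} \ \int_0^T \ell\bigl(x(t),u(t)\bigr)\,dt \;+\; \lambda \int_0^T \lVert u(t)\rVert_1\,dt\\
&\ \text{s.t.}\ \ \dot{x}(t) = f\bigl(x(t),u(t)\bigr) \ \text{for a.e. } t\in[0,T], \qquad x(0)=x_0,\\
&\phantom{\ \text{s.t.}\ \ } g_i\bigl(x(t_i)\bigr) \le 0, \qquad i=1,\dots,N, \qquad 0<t_1<\dots<t_N\le T,
\end{aligned}
\]
where the conditions $g_i(x(t_i))\le 0$ are the intermediate-point constraints and the weighted $\ell^1$ term promotes sparsity of the control.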
This paper introduces and studies the optimal control problem with equilibrium constraints (OCPEC). The OCPEC is an optimal control problem with a mixed state and control equilibrium constraint formulated as a complementarity constraint, and it can be seen as a dynamic mathematical program with equilibrium constraints. It provides a powerful modeling paradigm for many practical problems, such as bilevel optimal control problems and dynamic principal-agent problems. In this paper, we propose weak, Clarke, Mordukhovich, and strong stationarity concepts for the OCPEC. Moreover, we give sufficient conditions ensuring that local minimizers of the OCPEC are Fritz John type weakly stationary, Mordukhovich stationary, and strongly stationary, respectively. Unlike Pontryagin's maximum principle for the classical optimal control problem with equality and inequality constraints, a counterexample shows that for general OCPECs there may exist two sets of multipliers for the complementarity constraints. A condition under which these two sets of multipliers coincide is given.
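A schematic OCPEC, written here only to illustrate the structure described above (the symbols $\ell$, $f$, $G$, and $H$ are generic placeholders, not the paper's notation), takes the form
\[
\begin{aligned}
&\min_{u(\cdot)} \ \int_0^T \ell\bigl(x(t),u(t)\bigr)\,dt\\
&\ \text{s.t.}\ \ \dot{x}(t) = f\bigl(x(t),u(t)\bigr), \qquad x(0)=x_0,\\
&\phantom{\ \text{s.t.}\ \ } 0 \le G\bigl(x(t),u(t)\bigr) \ \perp\ H\bigl(x(t),u(t)\bigr) \ge 0 \quad \text{for a.e. } t\in[0,T],
\end{aligned}
\]
where the complementarity condition $0\le G \perp H \ge 0$ means $G\ge 0$, $H\ge 0$, and $G^{\top}H = 0$ pointwise; it is this constraint that carries the two candidate sets of multipliers mentioned above.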
In this paper we study an optimal control problem with nonsmooth mixed state and control constraints. In most existing results, necessary optimality conditions for optimal control problems with mixed state and control constraints are derived under the Mangasarian-Fromovitz condition and under the assumption that the constraint functions are smooth. Here we derive necessary optimality conditions for problems with nonsmooth mixed state and control constraints under constraint qualifications based on pseudo-Lipschitz continuity and calmness of certain set-valued maps. The necessary conditions are stratified, in the sense that they are asserted on precisely the domain upon which the hypotheses (and the optimality) are assumed to hold. Moreover, necessary optimality conditions with an Euler inclusion taking an explicit multiplier form are derived for certain cases.
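For orientation, a mixed state and control constraint couples the state and the control at the same time instant; a generic instance, with the (possibly nonsmooth) function $g$ introduced here only for illustration, is
\[
g\bigl(t,x(t),u(t)\bigr) \le 0 \ \text{ for a.e. } t\in[0,T],
\qquad\text{equivalently}\qquad
u(t) \in U\bigl(t,x(t)\bigr) := \bigl\{u : g(t,x(t),u)\le 0\bigr\},
\]
and constraint qualifications of the kind mentioned above concern regularity properties (pseudo-Lipschitz continuity, calmness) of set-valued maps such as $U$.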
In this paper, we investigate sparse optimal control of continuous-time stochastic systems. We adopt the dynamic programming approach and analyze the optimal control via the value function. Due to the non-smoothness of the $L^0$ cost functional, the value function is in general not differentiable on its domain. We therefore characterize the value function as a viscosity solution of the associated Hamilton-Jacobi-Bellman (HJB) equation. Based on this result, we derive a necessary and sufficient condition for $L^0$ optimality, which immediately yields the optimal feedback map. In particular, for control-affine systems, we examine the relationship with the $L^1$ optimal control problem and establish an equivalence theorem.
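As a point of reference, the $L^0$ cost of a control measures the time during which the control is active; a schematic version of such a functional for a stochastic system, with the dynamics and horizon chosen here only for illustration, is
\[
J(x_0,u) = \mathbb{E}\!\left[\int_0^T \mathbf{1}_{\{u(t)\ne 0\}}\,dt\right]
= \mathbb{E}\Bigl[\operatorname{Leb}\bigl(\{t\in[0,T]: u(t)\ne 0\}\bigr)\Bigr],
\qquad dX_t = f(X_t,u_t)\,dt + \sigma(X_t)\,dW_t,
\]
so that minimizing $J$ favors controls that vanish on as large a portion of $[0,T]$ as possible; the discontinuity of the indicator $\mathbf{1}_{\{u\ne 0\}}$ is the source of the non-smoothness noted above.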
In this article, we derive first-order necessary optimality conditions for a constrained optimal control problem formulated in the Wasserstein space of probability measures. To this end, we introduce a new notion of localised metric subdifferential for compactly supported probability measures, and investigate the intrinsic linearised Cauchy problems associated with non-local continuity equations. In particular, we show that when the velocity perturbations belong to the tangent cone to the convexification of the set of admissible velocities, the solutions of these linearised problems are tangent to the solution set of the corresponding continuity inclusion. We then make use of these novel concepts to provide a synthetic and geometric proof of the celebrated Pontryagin Maximum Principle for an optimal control problem with inequality final-point constraints. In addition, we propose sufficient conditions ensuring the normality of the maximum principle.
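For readers less familiar with this setting, the dynamics in the Wasserstein space are driven by non-local continuity equations; a generic instance, with the velocity field and cost written here only as placeholders, is
\[
\partial_t \mu_t + \nabla\cdot\bigl(v(t,\mu_t,\cdot)\,\mu_t\bigr) = 0, \qquad \mu_0 = \mu^0 \in \mathcal{P}_c(\mathbb{R}^d),
\]
and the control problem referred to above minimizes a final cost $\varphi(\mu_T)$ over admissible velocity selections, subject to inequality final-point constraints of the form $\Psi_i(\mu_T)\le 0$, $i=1,\dots,m$.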
This paper considers dynamic networks in which vertices and edges represent manifest signals and causal dependencies among the signals, respectively. We address the problem of determining whether the dynamics of a network can be identified when only a subset of the vertices is measured and excited. A necessary condition for network identifiability is presented, where the analysis is based on identifying the dependency of a set of rational functions from excited vertices to measured ones. This condition is further characterised by means of an edge-removal procedure on the associated bipartite graph. Moreover, building on the necessity analysis, we provide a necessary and sufficient condition for identifiability in circular networks.
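A common model in this literature, stated here only to make the setting concrete (the abstract does not commit to a specific parametrization), collects the vertex signals in $w$ and the excitation signals in $r$:
\[
w(t) = G(q)\,w(t) + r(t) + v(t), \qquad T(q) := \bigl(I - G(q)\bigr)^{-1},
\]
where $q$ is the shift operator, $G(q)$ collects the edge transfer functions, $r$ is nonzero only at the excited vertices, and $v$ is a disturbance; with measured set $\mathcal{C}$ and excited set $\mathcal{B}$, the data determine at most the submatrix $T_{\mathcal{C},\mathcal{B}}(q)$ of rational functions from excited to measured vertices, and identifiability asks whether $G(q)$ can be recovered from it.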