The key element of the approach to the theory of necessary conditions in optimal control discussed in the paper is the reduction of the original constrained problem to unconstrained minimization, with subsequent application of a suitable mechanism of local analysis to characterize minima of the (necessarily nonsmooth) functionals that appear after the reduction. Using unconstrained minimization at the crucial step of obtaining necessary conditions facilitates the study of new phenomena and yields more transparent and technically simpler proofs of known results. In the paper we offer a new proof of the maximum principle for a nonsmooth optimal control problem (in the standard Pontryagin form) with state constraints, and then prove a new second-order condition for a strong minimum in the same problem but with data differentiable with respect to the state and control variables. The role of variational analysis is twofold. Conceptually, the main considerations behind the reduction are connected with metric regularity and Ekeland's principle. Technically, the calculation of subdifferentials of the components of the functionals that appear after the reduction is an essential part of the proofs.
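For concreteness, the problems referred to above can be illustrated by a generic optimal control problem in the standard Pontryagin form with state constraints (the notation below is chosen here for illustration and is not taken from the paper):

```latex
\begin{align*}
\text{minimize}\quad & \ell\bigl(x(0),x(T)\bigr)\\
\text{subject to}\quad & \dot x(t) = f\bigl(t,x(t),u(t)\bigr),\quad u(t)\in U \ \text{a.e. on } [0,T],\\
& g\bigl(t,x(t)\bigr)\le 0 \quad \text{for all } t\in[0,T],\\
& \bigl(x(0),x(T)\bigr)\in S.
\end{align*}
```

A reduction of the kind described above typically replaces such constraints with penalty-type terms, which is what makes the resulting unconstrained functionals nonsmooth and calls for the subdifferential calculus mentioned in the abstract.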
We present a first step towards a multigrid method for solving the min-cost flow problem. Specifically, we present a strategy that takes advantage of existing black-box fast iterative linear solvers, i.e. algebraic multigrid methods. We show on standard benchmarks that, while less competitive than combinatorial techniques on small problems run on a single core, our approach scales well with problem size, complexity, and number of processors, allowing large-scale problems to be tackled on modern parallel architectures. Our approach combines interior-point and multigrid methods to solve the nonlinear KKT equations via Newton's method. However, the Jacobian matrix arising in the Newton iteration is indefinite and its condition number cannot be expected to be bounded. In fact, the eigenvalues of the Jacobian can both vanish and blow up near the solution, leading to a significant slow-down of iterative solvers, or even to a complete loss of convergence. In order to allow for the application of multigrid methods, which were originally designed for elliptic problems, we furthermore show that the Jacobian can be interpreted as the stiffness matrix of a mixed formulation of the weighted graph Laplacian of the network, whose metric depends on the slack variables and the multipliers of the inequality constraints. Together with our regularization, this allows a black-box algebraic multigrid method to be applied to the Schur complement of the system.
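In a standard primal-dual interior-point method for the linear-programming formulation of min-cost flow, eliminating the primal and slack updates from the Newton system leaves a matrix of the form A Theta A^T, where A is the node-arc incidence matrix and Theta is a positive diagonal matrix built from ratios of iterates; this is a weighted graph Laplacian whose arc weights can vanish or blow up near the solution, as described above. The following Python snippet is a purely illustrative sketch (not the authors' code, and omitting their regularization): it assembles such a Schur complement for a toy network and hands it to an off-the-shelf algebraic multigrid solver, with pyamg, the toy graph, and the iterate values all chosen here as stand-ins.

```python
# Illustrative sketch: Schur complement A * Theta * A^T of an interior-point
# Newton system for a tiny min-cost flow instance, solved with black-box AMG.
import numpy as np
import scipy.sparse as sp
import pyamg

# Node-arc incidence matrix of a small connected digraph (4 nodes, 5 arcs).
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
n, m = 4, len(edges)
rows, cols, vals = [], [], []
for e, (i, j) in enumerate(edges):
    rows += [i, j]
    cols += [e, e]
    vals += [1.0, -1.0]
A = sp.csr_matrix((vals, (rows, cols)), shape=(n, m))

# Made-up interior-point iterate: arc weights theta_e = x_e / s_e.  Near the
# solution these weights degenerate, which is the ill-conditioning above.
x = np.array([1.0, 2.0, 0.5, 1.5, 1.0])
s = np.array([0.1, 1.0, 2.0, 0.2, 0.5])
Theta = sp.diags(x / s)

L = (A @ Theta @ A.T).tocsr()   # weighted graph Laplacian (singular)
L_red = L[1:, 1:].tocsr()       # ground node 0 instead of a proper regularization

b = np.array([1.0, -0.5, -0.5])                # arbitrary right-hand side
ml = pyamg.smoothed_aggregation_solver(L_red)  # black-box AMG hierarchy
dy = ml.solve(b, tol=1e-10)
print(dy)
```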
We consider a network of prosumers involved in peer-to-peer energy exchanges, with differentiated price preferences on the trades with their neighbors, and we analyze two market designs: (i) a centralized market, used as a benchmark, where a global market operator optimizes the flows (trades) between the nodes, local demand, and flexibility activation to maximize the overall social welfare of the system; (ii) a distributed peer-to-peer market design where prosumers in local energy communities selfishly optimize their trades, demand, and flexibility activation. We first characterize the solution of the peer-to-peer market as a Variational Equilibrium and prove that the set of Variational Equilibria coincides with the set of social-welfare-optimal solutions of market design (i). We give several results that help in understanding the structure of the trades at an equilibrium or at the optimum. We characterize the impact of preferences on network line congestion and renewable energy waste under both designs. We provide a reduced example for which we give the set of all possible generalized equilibria, which enables us to approximate the price of anarchy. We provide a more realistic example based on the IEEE 14-bus network, for which we simulate the trades under different preference prices. Our analysis shows in particular that the preferences have a large impact on the structure of the trades, but that one (variational) equilibrium is optimal.
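As a rough illustration of the centralized benchmark (i), a simplified social welfare problem of this type can be sketched as follows (notation chosen here; the paper's formulation additionally includes renewable generation, flexibility activation, and network constraints):

```latex
\max_{D,\,G,\,q}\ \ \sum_{n}\Bigl( U_n(D_n) - C_n(G_n) - \sum_{m\in\Omega_n}\tilde c_{nm}\, q_{nm} \Bigr)
\quad\text{s.t.}\quad
D_n = G_n + \sum_{m\in\Omega_n} q_{nm},\qquad
q_{nm} + q_{mn} = 0,
```

where q_{nm} is the quantity prosumer n buys from neighbor m, \tilde c_{nm} encodes n's price preference for trading with m, and U_n and C_n are utility and generation cost functions. In the peer-to-peer design (ii), each prosumer optimizes only its own variables, and the trade reciprocity constraints couple the individual problems, which is what leads to the generalized/variational equilibrium analysis.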
Variational inequalities are modelling tools used to capture a variety of decision-making problems arising in mathematical optimization, operations research, and game theory. The scenario approach is a set of techniques developed to tackle stochastic optimization problems, make decisions based on historical data, and quantify their risk. The overarching goal of this manuscript is to bridge these two areas of research and thus broaden the class of problems amenable to study through the lens of the scenario approach. First and foremost, we provide out-of-sample feasibility guarantees for the solution of variational and quasi-variational inequality problems. Second, we apply these results to two classes of uncertain games. In the first class, the uncertainty enters the constraint sets, while in the second class the uncertainty enters the cost functions. Finally, we illustrate the quality and relevance of our bounds through numerical simulations on a demand-response model.
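For reference, a variational inequality asks for a point of a fixed set at which the cost operator admits no feasible descent direction, while in a quasi-variational inequality the set itself depends on the decision:

```latex
\text{VI}(X,F):\ \ \text{find } x^\star\in X \ \text{such that}\ F(x^\star)^{\top}(x-x^\star)\ \ge\ 0 \ \ \forall\, x\in X,
\qquad
\text{QVI}:\ \ \text{find } x^\star\in X(x^\star) \ \text{such that}\ F(x^\star)^{\top}(x-x^\star)\ \ge\ 0 \ \ \forall\, x\in X(x^\star).
```

Roughly, the out-of-sample guarantees quantify the probability that a solution computed from historical samples of the uncertainty remains feasible when a new, unseen sample is drawn.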
In order to develop statistical methods for shapes with a tree structure, we construct a shape space framework for tree-like shapes and study metrics on the shape space. This shape space has singularities, corresponding to topological transitions in the represented trees. We study two closely related metrics on the shape space, TED and QED. QED is a quotient Euclidean distance arising naturally from the shape space formulation, while TED is the classical tree edit distance. Using Gromov's metric geometry, we gain new insight into the geometries defined by TED and QED. We show that the new metric QED has nice geometric properties which facilitate statistical analysis, such as the existence and local uniqueness of geodesics and averages. TED, on the other hand, does not share the geometric advantages of QED, but has nice algorithmic properties. We provide a theoretical framework and experimental results on synthetic trees as well as airway trees from pulmonary CT scans. In this way, we illustrate that our framework has both the theoretical and qualitative properties necessary to build a theory of statistical tree-shape analysis.
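For context, the standard quotient (pseudo)metric construction that underlies a quotient distance such as QED takes a metric d on a pre-shape space X and an equivalence relation \sim (here identifying representations of the same tree across topological transitions) and sets

```latex
d_{X/\sim}\bigl([x],[y]\bigr) \;=\; \inf\Bigl\{\,\sum_{i=1}^{k} d(p_i,q_i)\ :\ p_1\in[x],\ q_k\in[y],\ q_i\sim p_{i+1},\ i=1,\dots,k-1\Bigr\},
```

whereas TED charges a fixed cost for each elementary edit (node insertion, deletion, or relabeling) along a sequence transforming one tree into the other. The chain construction above is the general definition; the specific tree-shape representation and Euclidean distance used in the paper are what give QED its geometric properties.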
In this paper, we discuss and review how combined multi-view imagery, from satellite to street level, can benefit scene analysis. Numerous works exist that merge information from remote sensing and images acquired from the ground for tasks like land cover mapping, object detection, or scene understanding. What makes the combination of overhead and street-level images challenging is the strongly varying viewpoint, scale, illumination, sensor modality, and time of acquisition. Direct (dense) matching of images on a per-pixel basis is thus often impossible, and one has to resort to the alternative strategies discussed in this paper. We review recent works that combine images taken from the ground and from overhead views for purposes like scene registration, reconstruction, or classification. Three methods that represent the wide range of potential approaches and applications (change detection, image orientation, and tree cataloging) are described in detail. We show that cross-fertilization between remote sensing, computer vision, and machine learning is very valuable for making the most of the geographic data available from Earth Observation sensors and ground imagery. Despite the challenges, we believe that integrating these complementary data sources will lead to major breakthroughs in Big GeoData.