
Least-Squares Approximation by Elements from Matrix Orbits Achieved by Gradient Flows on Compact Lie Groups

Published by Thomas Schulte-Herbrüggen
Publication date: 2008
Paper language: English





Let $S(A)$ denote the orbit of a complex or real matrix $A$ under a certain equivalence relation such as unitary similarity, unitary equivalence, unitary congruence, etc. Efficient gradient-flow algorithms are constructed to determine the best approximation of a given matrix $A_0$ by a sum of matrices in $S(A_1), \ldots, S(A_N)$, in the sense of attaining the Euclidean least-squares distance $$\min\{\|X_1 + \cdots + X_N - A_0\| : X_j \in S(A_j),\ j = 1, \ldots, N\}.$$ Connections of the results to different pure and applied areas are discussed.
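As a concrete illustration of the kind of gradient flow involved, the minimal sketch below treats only the unitary-similarity case, $S(A_j) = \{U A_j U^\dagger : U \in U(n)\}$, and performs a discretized Riemannian gradient descent on $U(n)^N$ with a matrix-exponential retraction. The step size eta, the random initialization, and the fixed iteration count are illustrative assumptions, not the authors' specific scheme.

```python
import numpy as np
from scipy.linalg import expm

def orbit_sum_gradient_flow(A0, As, eta=0.05, steps=2000, seed=0):
    """Discretized gradient flow for  min || sum_j U_j A_j U_j^H - A0 ||_F
    over unitary matrices U_j (unitary-similarity orbits only)."""
    rng = np.random.default_rng(seed)
    n = A0.shape[0]
    # random unitary starting points via QR of complex Gaussian matrices
    Us = []
    for _ in As:
        Z = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        Q, _ = np.linalg.qr(Z)
        Us.append(Q)
    for _ in range(steps):
        Xs = [U @ A @ U.conj().T for U, A in zip(Us, As)]
        R = sum(Xs) - A0                              # residual
        for j, (U, X) in enumerate(zip(Us, Xs)):
            M = X @ R.conj().T - R.conj().T @ X       # commutator [X_j, R^H]
            S = 0.5 * (M - M.conj().T)                # skew-Hermitian projection = descent direction
            Us[j] = expm(eta * S) @ U                 # geodesic step on the unitary group
    residual = np.linalg.norm(sum(U @ A @ U.conj().T for U, A in zip(Us, As)) - A0)
    return Us, residual
```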


Read also

The aim of this work is the application of meshfree methods to solving systems of stiff ordinary differential equations. These methods are based on the moving least squares (MLS), generalized moving least squares (GMLS), and modified moving least squares (MMLS) approximations. GMLS considerably reduces the cost of the numerical method: the operator is applied directly to the polynomial basis rather than to the complicated MLS shape functions. In addition, the MMLS approximation avoids a singular moment matrix, which allows the basis functions to be of order greater than two with the same support-domain size as linear basis functions. We also estimate the error propagation in the numerical solution of systems of stiff ordinary differential equations. Several examples show that the GMLS and MMLS methods are more reliable (accurate) than the classic MLS method. Finally, the proposed methods are validated by solving the ZIKV model, which is a system of ODEs.
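As a point of reference for the methods compared above, here is a minimal one-dimensional moving least-squares evaluation: at each query point a low-degree polynomial is fitted by weighted least squares with a Gaussian weight centred there. The kernel width h, the Gaussian weight, and the function name are illustrative assumptions; the GMLS and MMLS variants modify how the operator and the moment matrix are treated.

```python
import numpy as np

def mls_eval(x, nodes, values, h=0.3, degree=2):
    """Moving least squares in 1-D: fit a degree-`degree` polynomial to
    (nodes, values) by weighted least squares with a Gaussian weight of
    width h centred at x, and return the fit evaluated at x."""
    w = np.exp(-((nodes - x) / h) ** 2)                     # locality weights
    V = np.vander(nodes - x, degree + 1, increasing=True)   # polynomial basis centred at x
    W = np.diag(w)
    c = np.linalg.solve(V.T @ W @ V, V.T @ W @ values)      # weighted normal equations
    return c[0]                                             # p(x) is the constant coefficient

# example: approximate sin at x = 0.5 from 40 scattered samples
nodes = np.sort(np.random.default_rng(1).uniform(0.0, 2.0, 40))
print(mls_eval(0.5, nodes, np.sin(nodes)))
```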
Barak Sober, David Levin (2016)
In order to avoid the curse of dimensionality frequently encountered in Big Data analysis, the field of linear and nonlinear dimension reduction techniques has developed rapidly in recent years. These techniques (sometimes referred to as manifold learning) assume that the scattered input data lie on a lower dimensional manifold, so that the high dimensionality problem can be overcome by learning the lower dimensional behavior. However, in real life applications, data are often very noisy. In this work, we propose a method to approximate $\mathcal{M}$, a $d$-dimensional $C^{m+1}$ smooth submanifold of $\mathbb{R}^n$ ($d \ll n$), based upon noisy scattered data points (i.e., a data cloud). We assume that the data points are located near the lower dimensional manifold and suggest a non-linear moving least-squares projection onto an approximating $d$-dimensional manifold. Under some mild assumptions, the resulting approximant is shown to be infinitely smooth and of high approximation order (i.e., $O(h^{m+1})$, where $h$ is the fill distance and $m$ is the degree of the local polynomial approximation). The method presented here assumes no analytic knowledge of the approximated manifold, and the approximation algorithm is linear in the large dimension $n$. Furthermore, the approximating manifold can serve as a framework to perform operations directly on the high dimensional data in a computationally efficient manner. This way, the preparatory step of dimension reduction, which induces distortions to the data, can be avoided altogether.
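A much-simplified sketch of the two-step projection idea, for a curve ($d = 1$) sampled noisily in the plane ($n = 2$): a weighted local PCA supplies a local coordinate frame, and a weighted polynomial fit over the tangential coordinate maps the query point onto the approximant. The Gaussian kernel width h and the fixed polynomial degree are assumptions made for illustration; the paper's construction handles general $d$ and $n$ and carries the approximation-order guarantees quoted above.

```python
import numpy as np

def mmls_project(q, cloud, h=0.4, degree=2):
    """Project a query point q onto a moving least-squares approximant of a
    1-D manifold sampled noisily by `cloud` (an (N, 2) array)."""
    d = cloud - q
    w = np.exp(-np.sum(d ** 2, axis=1) / h ** 2)        # locality weights
    mu = (w[:, None] * cloud).sum(0) / w.sum()          # weighted local centre
    C = (w[:, None] * (cloud - mu)).T @ (cloud - mu)    # weighted covariance
    _, vecs = np.linalg.eigh(C)
    t, n = vecs[:, 1], vecs[:, 0]                       # tangent (top direction) and normal
    s = (cloud - mu) @ t                                # tangential coordinates of the samples
    y = (cloud - mu) @ n                                # normal offsets of the samples
    V = np.vander(s, degree + 1, increasing=True)
    W = np.diag(w)
    c = np.linalg.solve(V.T @ W @ V, V.T @ W @ y)       # weighted polynomial fit y(s)
    s_q = (q - mu) @ t                                  # tangent coordinate of q
    y_q = np.polyval(c[::-1], s_q)                      # fitted offset at s_q
    return mu + s_q * t + y_q * n                       # point on the local approximant
```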
We study a constrained optimal control problem for an ensemble of control systems. Each sub-system (or plant) evolves on a matrix Lie group and must satisfy given state and control action constraints pointwise in time. In addition, a certain multiplexing requirement is imposed: the controller must be shared between the plants in the sense that at any time instant the control signal may be sent to only one plant. We provide first-order necessary conditions for optimality in the form of a suitable Pontryagin maximum principle for this problem. Detailed numerical experiments are presented for a system of two satellites performing energy-optimal maneuvers under the preceding family of constraints.
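The multiplexing constraint itself is easy to express in code. The toy sketch below integrates two left-invariant systems on SO(3) under a prescribed switching schedule, so that at every time step only one plant is actuated; choosing the schedule and the controls optimally, which is what the Pontryagin conditions address, is not attempted here, and all names and parameters are illustrative.

```python
import numpy as np
from scipy.linalg import expm

def hat(w):
    """so(3) hat map: a vector in R^3 to a 3x3 skew-symmetric matrix."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def simulate_multiplexed(u1, u2, schedule, dt=0.01):
    """Integrate dR_i = R_i hat(u_i) dt for two plants on SO(3).  The
    multiplexing constraint: at step k only plant schedule[k] (1 or 2)
    receives its control; the other evolves with zero input."""
    R1, R2 = np.eye(3), np.eye(3)
    for k, active in enumerate(schedule):
        w1 = u1(k * dt) if active == 1 else np.zeros(3)
        w2 = u2(k * dt) if active == 2 else np.zeros(3)
        R1 = R1 @ expm(hat(w1) * dt)    # exact step for a piecewise-constant input
        R2 = R2 @ expm(hat(w2) * dt)
    return R1, R2

# example: plant 1 is actuated for the first half of the horizon, plant 2 after
schedule = [1] * 50 + [2] * 50
R1, R2 = simulate_multiplexed(lambda t: np.array([1.0, 0.0, 0.0]),
                              lambda t: np.array([0.0, 0.0, 1.0]), schedule)
```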
The alternating least squares algorithm for CP and Tucker decomposition is dominated in cost by the tensor contractions necessary to set up the quadratic optimization subproblems. We introduce a novel family of algorithms that uses perturbative corrections to the subproblems rather than recomputing the tensor contractions. This approximation is accurate when the factor matrices are changing little across iterations, which occurs when alternating least squares approaches convergence. We provide a theoretical analysis to bound the approximation error. Our numerical experiments demonstrate that the proposed pairwise perturbation algorithms are easy to control and converge to minima that are as good as alternating least squares. The experimental results show improvements of up to 3.1X with respect to state-of-the-art alternating least squares approaches for various model tensor problems and real datasets.
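For context, a bare-bones CP-ALS loop for a 3-way tensor is sketched below; the Khatri-Rao contractions in each factor update (the MTTKRP step) are the set-up cost that dominates, and that the pairwise-perturbation corrections approximate rather than recompute. This is the baseline algorithm under standard conventions, not the proposed method, and the fixed iteration count and use of pinv are illustrative choices.

```python
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (J x R) and C (K x R), giving (J*K) x R."""
    return np.einsum('jr,kr->jkr', B, C).reshape(-1, B.shape[1])

def cp_als(T, rank, iters=50, seed=0):
    """Plain CP-ALS for a 3-way tensor T: each factor update is an MTTKRP
    (matricized tensor times Khatri-Rao product) followed by a small solve."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(iters):
        A = T.reshape(I, J * K) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T.transpose(1, 0, 2).reshape(J, I * K) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T.transpose(2, 0, 1).reshape(K, I * J) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C
```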
Alnur Ali, Edgar Dobriban (2020)
We study the implicit regularization of mini-batch stochastic gradient descent, when applied to the fundamental problem of least squares regression. We leverage a continuous-time stochastic differential equation having the same moments as stochastic gradient descent, which we call stochastic gradient flow. We give a bound on the excess risk of stochastic gradient flow at time $t$, over ridge regression with tuning parameter $\lambda = 1/t$. The bound may be computed from explicit constants (e.g., the mini-batch size, step size, number of iterations), revealing precisely how these quantities drive the excess risk. Numerical examples show the bound can be small, indicating a tight relationship between the two estimators. We give a similar result relating the coefficients of stochastic gradient flow and ridge. These results hold under no conditions on the data matrix $X$, and across the entire optimization path (not just at convergence).
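A small numerical sketch of the pairing described above: integrate the deterministic gradient flow for least squares up to time $t$ with Euler steps and compare the result with ridge regression at $\lambda = 1/t$. The $1/(2n)$ loss scaling and the plain (non-stochastic) flow are simplifying assumptions made for illustration; the paper's bound concerns the stochastic gradient flow with its explicit mini-batch and step-size constants.

```python
import numpy as np

def gradient_flow_vs_ridge(X, y, t, n_steps=10000):
    """Euler-discretized gradient flow for the loss (1/2n)||y - X beta||^2,
    run until time t, next to ridge regression with lambda = 1/t."""
    n, p = X.shape
    beta = np.zeros(p)
    dt = t / n_steps
    for _ in range(n_steps):
        beta -= dt * X.T @ (X @ beta - y) / n            # d beta/dt = -gradient
    lam = 1.0 / t
    ridge = np.linalg.solve(X.T @ X / n + lam * np.eye(p), X.T @ y / n)
    return beta, ridge

# example on synthetic data: print the two coefficient vectors side by side
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X @ np.array([1.0, -2.0, 0.0, 0.5, 3.0]) + rng.standard_normal(200)
print(gradient_flow_vs_ridge(X, y, t=5.0))
```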