We investigate the properties of the simultaneous projection method as applied to countably infinitely many closed and linear subspaces of a real Hilbert space. We establish the optimal error bound for linear convergence of this method, which we express in terms of the cosine of the Friedrichs angle computed in an infinite product space. In addition, we provide estimates and alternative expressions for the above-mentioned number. Furthermore, we relate this number to the dichotomy theorem and to super-polynomially fast convergence. We also discuss the polynomial convergence of the simultaneous projection method that occurs for particular choices of starting point.
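The method described above can be illustrated in a finite, low-dimensional setting. The following sketch (not taken from the paper; the two subspaces, the starting point, and the iteration count are illustrative assumptions) averages the orthogonal projections onto two lines through the origin of $\mathbb{R}^2$, so the iterates converge linearly to the intersection $\{0\}$, with a rate governed by the angle between the lines:

```python
import numpy as np

def proj_line(x, u):
    """Orthogonal projection of x onto the subspace span{u}."""
    u = u / np.linalg.norm(u)
    return np.dot(x, u) * u

# Two 1-D subspaces (lines through the origin) of R^2.
# Their intersection is {0}, so the iterates should tend to 0.
u1 = np.array([1.0, 0.0])
u2 = np.array([1.0, 1.0])   # at 45 degrees to u1

x = np.array([3.0, 2.0])    # arbitrary starting point
for _ in range(200):
    # Simultaneous (averaged) projection step.
    x = 0.5 * (proj_line(x, u1) + proj_line(x, u2))

print(np.linalg.norm(x))    # decays linearly toward 0
```

Here the per-step contraction factor is determined by the cosine of the angle between the two lines, mirroring the role of the Friedrichs angle in the infinite-dimensional result.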
The paper deals with eigenvalue problems possessing infinitely many positive and negative eigenvalues. Inequalities for the smallest positive and the largest negative eigenvalues, which have the same properties as the fundamental frequency, are derived.
Convex hulls of monomials have been widely studied in the literature, and monomial convexifications are implemented in global optimization software for relaxing polynomials. However, there has been no study of the error in the global optimum from such relaxations.
This paper investigates optimal error bounds and convergence rates for general Mann iterations for computing fixed points of nonexpansive maps in normed spaces. We look for iterations that achieve the smallest fixed-point residual after $n$ steps.
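A Mann (Krasnoselskii–Mann) iteration of the kind discussed above can be sketched as follows; the map $T$ (a plane rotation, which is nonexpansive with fixed point $0$), the constant relaxation parameter, and the step count are illustrative assumptions, not choices from the paper:

```python
import numpy as np

def T(x):
    """A nonexpansive map: rotation of R^2 by 90 degrees; its unique fixed point is 0."""
    return np.array([-x[1], x[0]])

x = np.array([1.0, 0.0])
for n in range(1000):
    alpha = 0.5                      # relaxation parameter in (0, 1)
    x = (1 - alpha) * x + alpha * T(x)   # Mann step

residual = np.linalg.norm(x - T(x))  # fixed-point residual after n steps
```

The quantity `residual` is the fixed-point residual whose decay after $n$ steps the paper seeks to optimize over the choice of relaxation parameters.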
We prove that a reduced and irreducible algebraic surface in $\mathbb{CP}^{3}$ containing infinitely many twistor lines cannot have odd degree. Then, exploiting the theory of quaternionic slice regularity and the normalization map of a surface, we give constructive existence results for even degrees.
Stochastic Gradient Descent (SGD) plays a central role in modern machine learning. While there is extensive work on providing error upper bounds for SGD, not much is known about SGD error lower bounds. In this paper, we study the convergence of constant step-size SGD.
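Constant step-size SGD of the kind studied above can be sketched on a toy least-squares objective; the objective, step size, and data are illustrative assumptions. With a fixed step size, the iterate does not converge to the optimum but settles into a noise floor around it, which is the phenomenon that motivates error lower bounds:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=10_000)  # samples a_i; optimum w* = mean

w, eta = 0.0, 0.1            # constant step size eta
for a in data:
    w -= eta * (w - a)       # stochastic gradient of 0.5 * (w - a)**2

w_star = data.mean()
print(abs(w - w_star))       # hovers at a noise floor proportional to eta
```

Shrinking `eta` lowers the noise floor but slows the initial transient, which is exactly the trade-off that upper and lower bounds for constant step-size SGD quantify.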