
Semilocal Convergence Analysis for Two-Step Newton Method under Generalized Lipschitz Conditions in Banach Spaces

Added by Yonghui Ling
Publication date: 2018
Language: English





In the present paper, we consider the semilocal convergence of the two-step Newton method for solving nonlinear operator equations in Banach spaces. Under the assumption that the first derivative of the operator satisfies a generalized Lipschitz condition, a new semilocal convergence analysis for the two-step Newton method is presented. Q-cubic convergence is obtained under an additional condition. The analysis also yields three important special cases of the convergence results, under assumptions of Kantorovich, Smale and Nesterov-Nemirovskii type. As an application, the convergence results are used to approximate the minimal positive solution of a nonsymmetric algebraic Riccati equation arising in transport theory.
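
For readers who want to experiment numerically, the sketch below shows one common form of the two-step Newton iteration in a finite-dimensional setting, where the derivative at the current iterate is reused in both substeps. The abstract does not spell out the exact scheme analyzed in the paper, so treat this as an illustrative assumption rather than the authors' method; Python with NumPy is assumed.

```python
import numpy as np

def two_step_newton(F, dF, x0, tol=1e-12, max_iter=50):
    """One common two-step Newton scheme: the derivative at x_n is
    reused for both substeps, giving third-order (cubic) convergence
    for sufficiently smooth, well-conditioned problems."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        J = dF(x)
        y = x - np.linalg.solve(J, F(x))       # predictor: plain Newton step
        x_new = y - np.linalg.solve(J, F(y))   # corrector with the same Jacobian
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# toy system: x^2 + y^2 = 1, x = y  (solution (1/sqrt(2), 1/sqrt(2)))
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
dF = lambda v: np.array([[2.0*v[0], 2.0*v[1]], [1.0, -1.0]])
print(two_step_newton(F, dF, [1.0, 0.5]))
```
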



Related research

We analyze several Galerkin approximations of a Gaussian random field $\mathcal{Z}\colon\mathcal{D}\times\Omega\to\mathbb{R}$ indexed by a Euclidean domain $\mathcal{D}\subset\mathbb{R}^d$ whose covariance structure is determined by a negative fractional power $L^{-2\beta}$ of a second-order elliptic differential operator $L := -\nabla\cdot(A\nabla) + \kappa^2$. Under minimal assumptions on the domain $\mathcal{D}$, the coefficients $A\colon\mathcal{D}\to\mathbb{R}^{d\times d}$, $\kappa\colon\mathcal{D}\to\mathbb{R}$, and the fractional exponent $\beta>0$, we prove convergence in $L_q(\Omega; H^\sigma(\mathcal{D}))$ and in $L_q(\Omega; C^\delta(\overline{\mathcal{D}}))$ at (essentially) optimal rates for (i) spectral Galerkin methods and (ii) finite element approximations. Specifically, our analysis is solely based on $H^{1+\alpha}(\mathcal{D})$-regularity of the differential operator $L$, where $0<\alpha\leq 1$. For this setting, we furthermore provide rigorous estimates for the error in the covariance function of these approximations in $L_{\infty}(\mathcal{D}\times\mathcal{D})$ and in the mixed Sobolev space $H^{\sigma,\sigma}(\mathcal{D}\times\mathcal{D})$, showing convergence which is more than twice as fast compared to the corresponding $L_q(\Omega; H^\sigma(\mathcal{D}))$-rate. For the well-known example of such Gaussian random fields, the original Whittle-Matérn class, where $L=-\Delta + \kappa^2$ and $\kappa \equiv \operatorname{const.}$, we perform several numerical experiments which validate our theoretical results.
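
As an illustration of the spectral Galerkin construction described above, the following sketch samples a truncated expansion of a field with covariance $L^{-2\beta}$ in the simplest one-dimensional case $L=-\mathrm{d}^2/\mathrm{d}x^2+\kappa^2$ with constant $\kappa$ and Dirichlet boundary conditions; the variable-coefficient operators and general domains treated in the paper are not covered, and the parameter values are arbitrary.

```python
import numpy as np

# Truncated spectral (Karhunen-Loeve) sample of a field with covariance
# L^{-2*beta}, for L = -d^2/dx^2 + kappa^2 on D = (0, 1) with Dirichlet
# boundary conditions; kappa, beta and the truncation N are arbitrary.
kappa, beta, N = 1.0, 0.75, 200
x = np.linspace(0.0, 1.0, 501)
j = np.arange(1, N + 1)
lam = (np.pi * j) ** 2 + kappa ** 2                  # eigenvalues of L
e = np.sqrt(2.0) * np.sin(np.pi * np.outer(x, j))    # eigenfunctions e_j(x)
xi = np.random.standard_normal(N)                    # i.i.d. N(0, 1) weights
Z = e @ (lam ** (-beta) * xi)                        # spectral Galerkin sample of Z
```
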
We present a local convergence analysis of inexact Newton-like methods for solving nonlinear equations under majorant conditions. This analysis provides an estimate of the convergence radius and a clear relationship between the majorant function, which relaxes the Lipschitz continuity of the derivative, and the nonlinear operator under consideration. It also allows us to obtain some important special cases.
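
A minimal finite-dimensional sketch of the inexact Newton idea referred to above: the Newton system is solved only approximately, up to a relative residual (the forcing term), here with a simple gradient-type inner iteration. The majorant-condition machinery of the paper is not reproduced, and the inner solver, stopping rule and toy problem are assumptions.

```python
import numpy as np

def inexact_newton(F, dF, x0, eta=0.1, tol=1e-10, max_outer=100, max_inner=10000):
    """Inexact Newton: the linear system dF(x) s = -F(x) is solved only up to
    the relative residual eta (the forcing term), here by a simple gradient
    iteration on the normal equations."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            return x
        J = dF(x)
        s = np.zeros_like(x)
        omega = 1.0 / np.linalg.norm(J, 2) ** 2       # safe step size
        for _ in range(max_inner):
            r = J @ s + Fx                            # current linear residual
            if np.linalg.norm(r) <= eta * np.linalg.norm(Fx):
                break                                 # forcing condition met
            s -= omega * (J.T @ r)
        x = x + s
    return x

# toy system: x + y = 3, x^2 + y^2 = 9
F = lambda v: np.array([v[0] + v[1] - 3.0, v[0]**2 + v[1]**2 - 9.0])
dF = lambda v: np.array([[1.0, 1.0], [2.0*v[0], 2.0*v[1]]])
print(inexact_newton(F, dF, [1.0, 5.0]))   # approximates a root of the system
```
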
Using deep neural networks to solve PDEs has attracted a lot of attention recently. However, the theoretical understanding of why the deep learning method works falls far behind its empirical success. In this paper, we provide a rigorous numerical analysis of the deep Ritz method (DRM) \cite{wan11} for second order elliptic equations with Neumann boundary conditions. We establish the first nonasymptotic convergence rate in the $H^1$ norm for DRM using deep networks with $\mathrm{ReLU}^2$ activation functions. In addition to providing a theoretical justification of DRM, our study also sheds light on how to set the hyper-parameters of depth and width to achieve the desired convergence rate in terms of the number of training samples. Technically, we derive bounds on the approximation error of deep $\mathrm{ReLU}^2$ networks in the $H^1$ norm and on the Rademacher complexity of the non-Lipschitz composition of the gradient norm and a $\mathrm{ReLU}^2$ network, both of which are of independent interest.
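
The following sketch illustrates the deep Ritz idea with $\mathrm{ReLU}^2$ activations on a toy Neumann problem $-\Delta u + u = f$ on the unit square (the zero-order term is an illustrative choice that keeps the pure Neumann problem well posed). PyTorch is assumed, and the network size, sample size and learning rate are arbitrary choices, not the settings analyzed in the paper.

```python
import torch

class ReLU2(torch.nn.Module):
    """ReLU^2 activation, as in the DRM analysis described above."""
    def forward(self, x):
        return torch.relu(x) ** 2

net = torch.nn.Sequential(
    torch.nn.Linear(2, 32), ReLU2(),
    torch.nn.Linear(32, 32), ReLU2(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def f(x):
    return torch.ones(x.shape[0], 1)          # hypothetical right-hand side

for step in range(2000):
    x = torch.rand(1024, 2, requires_grad=True)        # Monte Carlo points in (0,1)^2
    u = net(x)
    grad_u, = torch.autograd.grad(u.sum(), x, create_graph=True)
    # Ritz energy of -Delta u + u = f; Neumann conditions are natural,
    # so no boundary penalty term is needed
    energy = (0.5 * (grad_u ** 2).sum(1, keepdim=True)
              + 0.5 * u ** 2 - f(x) * u).mean()
    opt.zero_grad()
    energy.backward()
    opt.step()
```
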
We propose and analyse a minimal-residual method in discrete dual norms for approximating the solution of the advection-reaction equation in a weak Banach-space setting. The weak formulation allows for the direct approximation of solutions in the Lebesgue $L^p$-space, $1<p<\infty$. The greater generality of this weak setting is natural when dealing with rough data and highly irregular solutions, and when enhanced qualitative features of the approximations are needed. We first present a rigorous analysis of the well-posedness of the underlying continuous weak formulation, under natural assumptions on the advection-reaction coefficients. The main contribution is the study of several discrete subspace pairs that guarantee the discrete stability of the method and quasi-optimality in $L^p$, together with numerical illustrations of these findings, including the elimination of Gibbs phenomena, computation of optimal test spaces, and application to 2-D advection.
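
To make the minimal-residual idea concrete, the sketch below solves a one-dimensional upwind-discretized advection-reaction problem by minimizing the residual in a discrete dual norm induced by a symmetric positive definite Gram matrix; this is only the Hilbert-space ($p=2$) analogue, not the Banach $L^p$ setting analyzed in the paper, and the particular discretization is an assumption.

```python
import numpy as np

n = 100
h = 1.0 / n
I = np.eye(n)
# upwind discretization of u' + u = f on (0, 1) with inflow datum u(0) = 0
B = (I - np.eye(n, k=-1)) / h + I
f = np.ones(n)

# Gram matrix of a discrete H^1-like test norm; its inverse realizes the
# discrete dual norm in which the residual is measured
G = (2.0 * I - np.eye(n, k=1) - np.eye(n, k=-1)) / h + h * I

# minimal-residual solution: minimize ||B u - f|| in the G^{-1}-norm, i.e.
# solve the normal equations B^T G^{-1} B u = B^T G^{-1} f
Ginv_B = np.linalg.solve(G, B)
Ginv_f = np.linalg.solve(G, f)
u = np.linalg.solve(B.T @ Ginv_B, B.T @ Ginv_f)
```
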
This paper studies the problem of approximating a function $f$ in a Banach space $X$ from measurements $l_j(f)$, $j=1,\dots,m$, where the $l_j$ are linear functionals from $X^*$. Most results study this problem for classical Banach spaces $X$ such as the $L_p$ spaces, $1\le p\le \infty$, and for $K$ the unit ball of a smoothness space in $X$. Our interest in this paper is in the model classes $K=K(\epsilon,V)$, with $\epsilon>0$ and $V$ a finite dimensional subspace of $X$, which consist of all $f\in X$ such that $\mathrm{dist}(f,V)_X\le \epsilon$. These model classes, called {\it approximation sets}, arise naturally in application domains such as parametric partial differential equations, uncertainty quantification, and signal processing. A general theory for the recovery of approximation sets in a Banach space is given. This theory includes tight a priori bounds on optimal performance, and algorithms for finding near optimal approximations. We show how the recovery problem for approximation sets is connected with well-studied concepts in Banach space theory such as liftings and the angle between spaces. Examples are given that show how this theory can be used to recover several recent results on sampling and data assimilation.
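
A small Hilbert-space ($\ell^2$) illustration of recovering an element of an approximation set from linear measurements: a best fit in the subspace $V$ is corrected in the span of the measurement representers so that the data are reproduced exactly (a PBDW-type construction). The general Banach-space theory, liftings and angle estimates of the paper are not reflected here, and all dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, m = 200, 5, 20       # ambient dimension, dim(V), number of measurements

V = np.linalg.qr(rng.standard_normal((n, k)))[0]   # basis of the model subspace V
W = np.linalg.qr(rng.standard_normal((n, m)))[0]   # representers of the functionals l_j

f = V @ rng.standard_normal(k) + 0.05 * rng.standard_normal(n)  # dist(f, V) small
meas = W.T @ f                                     # the data l_j(f)

# best fit in V to the data, then a correction in span(W) so that the
# reconstruction reproduces the measurements exactly
c = np.linalg.lstsq(W.T @ V, meas, rcond=None)[0]
v_star = V @ c
u = v_star + W @ (meas - W.T @ v_star)

print(np.linalg.norm(W.T @ u - meas))              # ~0: data are matched
print(np.linalg.norm(u - f) / np.linalg.norm(f))   # relative recovery error
```
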