We extend the work of Mastroianni and Szabados on the barycentric interpolant introduced by J.-P. Berrut in 1988 for equally spaced nodes. We fully prove their first conjecture and prove a weaker version of their second. More importantly, we give a sharp description of the asymptotic error incurred by these interpolants when the derivative of the interpolated function is absolutely continuous, a class of functions broad enough to cover most functions found in practice. We also contribute to the broader problem they raised regarding the order of approximation of these interpolants, by showing that the approximation error is of order $1/n$ for functions whose derivative has bounded variation.
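As background to the abstract above, Berrut's 1988 interpolant evaluates the rational barycentric formula with the alternating weights $w_k=(-1)^k$ at equally spaced nodes. A minimal Python sketch (function names are ours, not from the paper):

```python
def berrut_interpolant(nodes, values):
    """Berrut's first rational barycentric interpolant with weights (-1)^k."""
    weights = [(-1) ** k for k in range(len(nodes))]

    def p(x):
        num = den = 0.0
        for xk, fk, wk in zip(nodes, values, weights):
            if x == xk:          # exactly at a node: return the data value
                return fk
            c = wk / (x - xk)    # barycentric term w_k / (x - x_k)
            num += c * fk
            den += c
        return num / den         # rational interpolant num/den

    return p
```

The interpolation property holds for any choice of nonzero weights; the alternating choice keeps the denominator free of real poles on the node interval.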
We prove a conjecture of Ambrus, Ball and Erdélyi that equally spaced points maximize the minimum of discrete potentials on the unit circle whenever the potential is of the form $\sum_{k=1}^n f(d(z,z_k))$, where $f:[0,\pi]\to [0,\infty]$ is non-increasing and strictly convex and $d(z,w)$ denotes the geodesic distance between $z$ and $w$ on the circle.
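The potential in question is easy to evaluate numerically. Below is an illustrative sketch (our own helper names; the kernel $f(t)=(\pi-t)^2$ is just one admissible example, being non-increasing and strictly convex on $[0,\pi]$):

```python
import math

def geodesic_distance(a, b):
    """Geodesic (arc-length) distance between angles a, b on the unit circle, in [0, pi]."""
    diff = abs(a - b) % (2.0 * math.pi)
    return min(diff, 2.0 * math.pi - diff)

def discrete_potential(theta, node_angles, f):
    """sum_{k=1}^{n} f(d(z, z_k)) for z = e^{i theta} and nodes z_k = e^{i theta_k}."""
    return sum(f(geodesic_distance(theta, tk)) for tk in node_angles)
```

The conjecture concerns maximizing, over node configurations, the minimum of this potential over $\theta$.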
The Gaver-Stehfest algorithm is widely used for the numerical inversion of Laplace transforms. In this paper we provide the first rigorous study of its rate of convergence. We prove that Gaver-Stehfest approximations converge exponentially fast if the target function is analytic in a neighbourhood of a point, and that they converge at a rate $o(n^{-k})$ if the target function is $(2k+3)$-times differentiable at a point.
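For readers unfamiliar with the algorithm, a minimal sketch of the standard Stehfest form follows (not the paper's analysis, just the classical approximation $f(t)\approx \frac{\ln 2}{t}\sum_{k=1}^{N} V_k\, F(k \ln 2/t)$ with the usual Stehfest weights $V_k$):

```python
import math

def stehfest_coefficients(N):
    """Stehfest weights V_1..V_N; N must be even."""
    assert N % 2 == 0
    n2 = N // 2
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, n2) + 1):
            s += (j ** n2 * math.factorial(2 * j)
                  / (math.factorial(n2 - j) * math.factorial(j)
                     * math.factorial(j - 1) * math.factorial(k - j)
                     * math.factorial(2 * j - k)))
        V.append((-1) ** (k + n2) * s)
    return V

def gaver_stehfest(F, t, N=12):
    """Approximate f(t) from its Laplace transform F(s) on the real axis."""
    a = math.log(2.0) / t
    V = stehfest_coefficients(N)
    return a * sum(V[k - 1] * F(k * a) for k in range(1, N + 1))
```

Because the weights grow rapidly with $N$, the method is sensitive to rounding error; in double precision $N$ around 12-16 is typical.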
Numerical causal derivative estimators from noisy data are essential for real-time applications, especially in control and fluid simulation, so as to address new paradigms in solid modeling and video compression. Adopting the analytical point of view due to Lanczos \cite{C. Lanczos} for this causal case, we revisit the $n$th-order derivative estimators originally introduced within an algebraic framework by Mboup, Fliess and Join in \cite{num,num0}. For a given noise level $\delta$ and a well-chosen integration window length, we show that the derivative estimation error can be $\mathcal{O}(\delta^{\frac{q+1}{n+1+q}})$, where $q$ is the truncation order of the Jacobi polynomial series expansion used. The bound thus obtained guides the choice of the estimator parameters. We demonstrate the efficiency of our method on several examples.
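To illustrate the Lanczos "differentiation by integration" idea the abstract refers to, here is the classical central (non-causal) discrete Lanczos differentiator; the paper's causal estimators are different, and this sketch is only background:

```python
def lanczos_derivative(f, x, h=0.01, m=5):
    """Classical Lanczos low-noise differentiator (discrete central form):
    f'(x) ~ 3 / (m(m+1)(2m+1) h) * sum_{k=-m}^{m} k * f(x + k h).
    Averaging over 2m+1 samples damps additive noise."""
    s = sum(k * f(x + k * h) for k in range(-m, m + 1))
    return 3.0 * s / (m * (m + 1) * (2 * m + 1) * h)
```

The estimator is exact for polynomials of degree at most two, and its least-squares averaging trades a small bias for robustness to noise.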
This article studies the problem of approximating functions belonging to a Hilbert space $H_d$ with an isotropic or anisotropic Gaussian reproducing kernel, $$ K_d(\mathbf{x},\mathbf{t}) = \exp\left(-\sum_{\ell=1}^d \gamma_\ell^2 (x_\ell - t_\ell)^2\right) \quad \text{for all } \mathbf{x},\mathbf{t}\in\mathbb{R}^d. $$ The isotropic case corresponds to using the same shape parameters for all coordinates, namely $\gamma_\ell=\gamma>0$ for all $\ell$, whereas the anisotropic case corresponds to varying shape parameters $\gamma_\ell$. We are especially interested in moderate to large $d$.
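The kernel itself is a one-liner to evaluate; a minimal sketch of the anisotropic form above (function name is ours):

```python
import math

def gaussian_kernel(x, t, gammas):
    """K_d(x, t) = exp(-sum_l gamma_l^2 (x_l - t_l)^2).
    Anisotropic when the shape parameters gamma_l differ across coordinates;
    isotropic when they are all equal."""
    return math.exp(-sum((g * (a - b)) ** 2 for g, a, b in zip(gammas, x, t)))
```

Larger $\gamma_\ell$ makes the kernel more sharply peaked in coordinate $\ell$, which is what makes the anisotropic case interesting for large $d$.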
We consider the geometry relaxation of an isolated point defect embedded in a homogeneous crystalline solid, within an atomistic description. We prove a sharp convergence rate for a periodic supercell approximation with respect to uniform convergence of the discrete strains.