
In this paper we solve support vector machines in reproducing kernel Banach spaces with reproducing kernels defined on nonsymmetric domains, instead of using the traditional methods in reproducing kernel Hilbert spaces. Using the orthogonality of semi-inner-products, we obtain explicit representations of the dual (normalized-duality-mapping) elements of support vector machine solutions. In addition, we introduce the reproduction property in a generalized native space by Fourier transform techniques, so that it becomes a reproducing kernel Banach space that can even be embedded into Sobolev spaces, and its reproducing kernel is set up by the related positive definite function. The representations of the optimal solutions of support vector machines (regularized empirical risks) in these reproducing kernel Banach spaces are formulated explicitly in terms of positive definite functions, and their finitely many coefficients can be computed by fixed point iteration. We also give some typical examples of reproducing kernel Banach spaces induced by Matérn functions (Sobolev splines), so that their support vector machine solutions are as readily computable as those of the classical algorithms. Moreover, each of their reproducing bases includes information from multiple training data points. The concept of reproducing kernel Banach spaces thus offers a new numerical tool for solving support vector machines.
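The abstract states that the finitely many coefficients of the support vector machine solution can be computed by fixed point iteration. As an illustrative sketch only, here is a generic fixed-point iteration; the update map `g` and starting point `x0` below are placeholders, not the paper's actual coefficient update:

```python
def fixed_point(g, x0, tol=1e-10, max_iter=1000):
    """Iterate x_{k+1} = g(x_k) until successive iterates agree to tol.

    Convergence is guaranteed when g is a contraction (Banach
    fixed-point theorem); g and x0 are illustrative placeholders.
    """
    x = x0
    for _ in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

For a contraction such as `g(x) = 0.5 * x + 1.0`, the iteration converges to the unique fixed point `x = 2`.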
In this paper we introduce a generalized Sobolev space by defining a semi-inner product formulated in terms of a vector distributional operator $\mathbf{P}$ consisting of finitely or countably many distributional operators $P_n$, which are defined on the dual space of the Schwartz space. The types of operators we consider include not only differential operators, but also more general distributional operators such as pseudo-differential operators. We deduce that a certain appropriate full-space Green function $G$ with respect to $L:=\mathbf{P}^{\ast T}\mathbf{P}$ now becomes a conditionally positive definite function. In order to support this claim we ensure that the distributional adjoint operator $\mathbf{P}^{\ast}$ of $\mathbf{P}$ is well-defined in the distributional sense. Under sufficient conditions, the native space (reproducing-kernel Hilbert space) associated with the Green function $G$ can be isometrically embedded into or even be isometrically equivalent to a generalized Sobolev space. As an application, we take linear combinations of translates of the Green function with possibly added polynomial terms and construct a multivariate minimum-norm interpolant $s_{f,X}$ to data values sampled from an unknown generalized Sobolev function $f$ at data sites located in some set $X \subset \mathbb{R}^d$. We provide several examples, such as Matérn kernels or Gaussian kernels, that illustrate how many reproducing-kernel Hilbert spaces of well-known reproducing kernels are isometrically equivalent to a generalized Sobolev space. These examples further illustrate how we can rescale the Sobolev spaces by the vector distributional operator $\mathbf{P}$. Introducing the notion of scale as part of the definition of a generalized Sobolev space may help us to choose the best kernel function for kernel-based approximation methods.
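As a concrete illustration of the Matérn (Sobolev spline) kernels mentioned above, here is a minimal sketch of the Matérn function with smoothness parameter nu = 3/2; the length-scale parameter `rho` is an illustrative stand-in, not the paper's operator-induced scaling:

```python
import math

def matern_3_2(r, rho=1.0):
    """Matérn (Sobolev spline) kernel with smoothness nu = 3/2.

    r is the distance |x - t|; rho is an illustrative length-scale
    parameter (the scaling in the paper comes from the operator P).
    """
    a = math.sqrt(3.0) * r / rho
    return (1.0 + a) * math.exp(-a)
```

The kernel equals 1 at zero distance and decays monotonically, which is what makes it a convenient building block for native-space constructions.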
We introduce a vector differential operator $\mathbf{P}$ and a vector boundary operator $\mathbf{B}$ to derive a reproducing kernel along with its associated Hilbert space, which is shown to be embedded in a classical Sobolev space. This reproducing kernel is a Green kernel of the differential operator $L:=\mathbf{P}^{\ast T}\mathbf{P}$ with homogeneous or nonhomogeneous boundary conditions given by $\mathbf{B}$, where we ensure that the distributional adjoint operator $\mathbf{P}^{\ast}$ of $\mathbf{P}$ is well-defined in the distributional sense. We represent the inner product of the reproducing-kernel Hilbert space in terms of the operators $\mathbf{P}$ and $\mathbf{B}$. In addition, we find relationships for the eigenfunctions and eigenvalues of the reproducing kernel and the operators with homogeneous or nonhomogeneous boundary conditions. These eigenfunctions and eigenvalues are used to compute a series expansion of the reproducing kernel and an orthonormal basis of the reproducing-kernel Hilbert space. Our theoretical results provide perhaps a more intuitive way of understanding what kind of functions are well approximated by the reproducing kernel-based interpolant to a given multivariate data sample.
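A classical one-dimensional instance of this setup, included here purely as an illustrative sketch (it is not taken from the paper): the Green kernel of $L = -d^2/dx^2$ on $[0,1]$ with homogeneous Dirichlet boundary conditions is $\min(x,t) - xt$, with eigenfunctions $\sqrt{2}\sin(n\pi x)$ and eigenvalues $1/(n\pi)^2$, so the eigen-series expansion can be checked numerically:

```python
import math

def green_kernel(x, t):
    # Green kernel of L = -d^2/dx^2 on [0,1] with homogeneous
    # Dirichlet boundary conditions (a classical textbook example).
    return min(x, t) - x * t

def series_kernel(x, t, n_terms=2000):
    # Truncated eigen-expansion: eigenfunctions sqrt(2) sin(n pi x),
    # eigenvalues 1/(n pi)^2, summed over the first n_terms modes.
    return sum(2.0 * math.sin(n * math.pi * x) * math.sin(n * math.pi * t)
               / (n * math.pi) ** 2
               for n in range(1, n_terms + 1))
```

The truncation error decays like $1/N$, so a few thousand terms already match the closed form to several digits.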
In this paper we present the theoretical framework needed to justify the use of a kernel-based collocation method (meshfree approximation method) to estimate the solution of high-dimensional stochastic partial differential equations (SPDEs). Using an implicit time stepping scheme, we transform stochastic parabolic equations into stochastic elliptic equations. We focus on the numerical solution of the elliptic equations at each time step. The estimator of the solution of the elliptic equations is given as a linear combination of reproducing kernels derived from the differential and boundary operators of the SPDE, centered at collocation points to be chosen by the user. The random expansion coefficients are computed by solving a random system of linear equations. Numerical experiments demonstrate the feasibility of the method.
This article studies the problem of approximating functions belonging to a Hilbert space $H_d$ with an isotropic or anisotropic Gaussian reproducing kernel, $$ K_d(\mathbf{x},\mathbf{t}) = \exp\left(-\sum_{\ell=1}^d \gamma_\ell^2 (x_\ell-t_\ell)^2\right) \quad \mbox{for all } \mathbf{x},\mathbf{t}\in\mathbb{R}^d. $$ The isotropic case corresponds to using the same shape parameter for all coordinates, namely $\gamma_\ell=\gamma>0$ for all $\ell$, whereas the anisotropic case corresponds to varying shape parameters $\gamma_\ell$. We are especially interested in moderate to large $d$.
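A direct sketch of this kernel in code, covering both the isotropic and anisotropic cases (the function and parameter names are my own, not from the article):

```python
import math

def gaussian_kernel(x, t, gammas):
    """Anisotropic Gaussian kernel: exp(-sum_l gamma_l^2 (x_l - t_l)^2).

    x, t, gammas are length-d sequences; passing the same shape
    parameter for every coordinate, gammas = [gamma] * d, recovers
    the isotropic case.
    """
    return math.exp(-sum(g * g * (xl - tl) ** 2
                         for g, xl, tl in zip(gammas, x, t)))
```

Large shape parameters make the kernel very localized along that coordinate, while small ones flatten it, which is why the choice of the $\gamma_\ell$ matters increasingly as $d$ grows.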