
On multilinear distorted multiplier estimate and its applications

Added by Kailong Yang
Publication date: 2021
Language: English
Authors: Kailong Yang





In this article, we investigate the multilinear distorted multiplier estimate (a Coifman-Meyer type theorem) associated with the Schrödinger operator $H=-\Delta + V$ in the framework of the corresponding distorted Fourier transform. Our result is the distorted analog of the multilinear Coifman-Meyer multiplier theorem in \cite{CM1}, and it extends the bilinear estimates of Germain, Hani and Walsh in \cite{PZS} to the multilinear case in all dimensions. As applications, we give for the first time an estimate of Leibniz's rule for integer-order derivatives of the multilinear distorted multiplier, and we obtain small-data scattering for a class of generalized mass-critical NLS with a good potential in low dimensions $d=1,2$.
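
For orientation, the classical $k$-linear multiplier operator covered by Coifman-Meyer type theorems has the schematic form below (normalizations vary; the precise symbol class, and the distorted analogue in which the Fourier transform is replaced by the distorted Fourier transform associated with $H=-\Delta+V$, are as defined in the paper):
\[
T_m(f_1,\dots,f_k)(x)=\int_{\mathbb{R}^{kd}} m(\xi_1,\dots,\xi_k)\,\widehat{f_1}(\xi_1)\cdots\widehat{f_k}(\xi_k)\,e^{2\pi i x\cdot(\xi_1+\cdots+\xi_k)}\,d\xi_1\cdots d\xi_k,
\]
with bounds of the type $\|T_m(f_1,\dots,f_k)\|_{L^p}\lesssim\prod_{j=1}^{k}\|f_j\|_{L^{p_j}}$ under the Hölder relation $\frac1p=\frac1{p_1}+\cdots+\frac1{p_k}$, for symbols $m$ satisfying standard derivative estimates.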




Related research

We give an estimate of the general divided differences $[x_0,\dots,x_m;f]$, where some of the $x_i$'s are allowed to coalesce (in which case $f$ is assumed to be sufficiently smooth). This estimate is then applied to significantly strengthen the celebrated Whitney and Marchaud inequalities in relation to Hermite interpolation. For example, one of the numerous corollaries of this estimate is the fact that, given a function $f\in C^{(r)}(I)$ and a set $Z=\{z_j\}_{j=0}^{\mu}$ such that $z_{j+1}-z_j \geq \lambda |I|$ for all $0\le j \le \mu-1$, where $I:=[z_0, z_\mu]$, $|I|$ is the length of $I$ and $\lambda$ is some positive number, the Hermite polynomial $\mathcal{L}(\cdot;f;Z)$ of degree $\le r\mu+\mu+r$ satisfying $\mathcal{L}^{(j)}(z_\nu; f;Z) = f^{(j)}(z_\nu)$ for all $0\le \nu \le \mu$ and $0\le j\le r$, approximates $f$ so that, for all $x\in I$, \[ \big|f(x)- \mathcal{L}(x;f;Z) \big| \le C \left( \operatorname{dist}(x, Z) \right)^{r+1} \int_{\operatorname{dist}(x, Z)}^{2|I|}\frac{\omega_{m-r}(f^{(r)},t,I)}{t^2}\,dt , \] where $m :=(r+1)(\mu+1)$, $C=C(m, \lambda)$ and $\operatorname{dist}(x, Z) := \min_{0\le j \le \mu} |x-z_j|$.
Tuoc Phan (2017)
This paper establishes global weighted Calderón-Zygmund type regularity estimates for weak solutions of a class of generalized Stokes systems in divergence form. The focus of the paper is on the case in which the coefficients of the divergence-form Stokes operator consist of symmetric and skew-symmetric parts, both of which are discontinuous. Moreover, the skew-symmetric part is not required to be bounded and could therefore be singular. Sufficient conditions on the coefficients are provided to ensure the global weighted $W^{1,p}$-regularity estimates for weak solutions of the systems. As a direct application, we show that our $W^{1,p}$-regularity results give some criteria in critical spaces for the global regularity of weak Leray-Hopf solutions of the Navier-Stokes system of equations.
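
For context, a divergence-form generalized Stokes system of the kind described above can be written schematically as follows (the precise structural and integrability assumptions on the coefficient matrix $\mathbf{A}=\mathbf{A}^{\mathrm{sym}}+\mathbf{A}^{\mathrm{skew}}$ and on the data $\mathbf{F}$ are those of the paper):
\[
-\operatorname{div}\big(\mathbf{A}(x)\nabla u\big)+\nabla p=\operatorname{div}\mathbf{F},\qquad \operatorname{div} u=0,
\]
with the global weighted $W^{1,p}$ estimate taking the schematic form $\|\nabla u\|_{L^p_w}\le C\,\|\mathbf{F}\|_{L^p_w}$ for admissible weights $w$.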
Suyu Li, Meijun Zhu (2007)
We establish an analogue of the Hardy inequality with sharp constant involving an exponential weight function. A special case of this inequality (for $n=2$) leads to a direct proof of the Onofri inequality on $S^2$.
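
For reference, the Onofri inequality on $S^2$ mentioned above reads, in its standard normalization with $d\mu$ the normalized surface measure,
\[
\log\Big(\int_{S^2} e^{u}\,d\mu\Big)\le \int_{S^2} u\,d\mu+\frac{1}{4}\int_{S^2}|\nabla u|^2\,d\mu,\qquad u\in H^1(S^2).
\]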
Ramin Okhrati, Aldo Lipani (2020)
Shapley values are powerful analytical tools in game theory for measuring the importance of a player in a game. Thanks to axiomatic and desirable properties such as efficiency, they have become popular for feature-importance analysis in data science and machine learning. However, the time complexity of computing Shapley values from the original formula is exponential in the number of features, which quickly becomes infeasible. Castro et al. [1] developed a sampling algorithm to estimate Shapley values. In this work, we propose a new sampling method based on a multilinear extension technique as applied in game theory. The aim is to provide a more efficient (sampling) method for estimating Shapley values. Our method is applicable to any machine learning model, in particular to multi-class classification and regression problems. We apply the method to estimate Shapley values for multilayer perceptrons (MLPs), and through experiments on two datasets we demonstrate that our method provides more accurate estimates of the Shapley values by reducing the variance of the sampling statistics.
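
As an illustration of the general idea (a minimal sketch, not the authors' implementation), Owen's multilinear extension expresses the Shapley value of feature i as the integral over q in [0, 1] of the expected marginal contribution of i to a random coalition that includes every other feature independently with probability q; sampling coalitions over a grid of q values yields an estimator. The function name, parameters, and the coalition value function value_fn below are hypothetical.

import numpy as np

def shapley_multilinear_sampling(value_fn, n_features, n_q=25, n_draws=50, rng=None):
    # Hypothetical sketch: estimate Shapley values via Owen's multilinear extension,
    #   phi_i = integral over q in [0, 1] of E[ v(S_q plus feature i) - v(S_q) ],
    # where S_q contains each feature j != i independently with probability q.
    # value_fn(mask) is an assumed user-supplied coalition value function taking a
    # boolean inclusion mask of length n_features (for instance, a model prediction
    # with the masked-out features replaced by baseline values).
    rng = np.random.default_rng() if rng is None else rng
    qs = np.linspace(0.0, 1.0, n_q)          # uniform grid of inclusion probabilities
    phi = np.zeros(n_features)
    for i in range(n_features):
        mean_gain_per_q = []
        for q in qs:
            gains = []
            for _ in range(n_draws):
                mask = rng.random(n_features) < q   # random coalition S_q
                mask[i] = False
                v_without = value_fn(mask)
                mask[i] = True
                v_with = value_fn(mask)
                gains.append(v_with - v_without)
            mean_gain_per_q.append(np.mean(gains))
        # The q-grid is uniform on [0, 1], so averaging over q approximates the integral.
        phi[i] = float(np.mean(mean_gain_per_q))
    return phi

For a trained regressor, one possible (assumed) choice is to let value_fn(mask) return the model's prediction on an input whose excluded features are replaced by their training-set means.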
Qiyang Han (2021)
The theory for multiplier empirical processes has been one of the central topics in the development of the classical theory of empirical processes, due to its wide applicability to various statistical problems. In this paper, we develop theory and tools for studying multiplier $U$-processes, a natural higher-order generalization of the multiplier empirical processes. To this end, we develop a multiplier inequality that quantifies the moduli of continuity of the multiplier $U$-process in terms of that of the (decoupled) symmetrized $U$-process. The new inequality finds a variety of applications including (i) multiplier and bootstrap central limit theorems for $U$-processes, (ii) general theory for bootstrap $M$-estimators based on $U$-statistics, and (iii) theory for $M$-estimation under general complex sampling designs, again based on $U$-statistics.
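
For orientation, the multiplier empirical process indexed by a function class $\mathcal{F}$ is $f\mapsto \frac{1}{\sqrt{n}}\sum_{i=1}^{n}\xi_i f(X_i)$, with multipliers $\xi_1,\dots,\xi_n$ independent of the sample $X_1,\dots,X_n$. A multiplier $U$-process of order two replaces $f(X_i)$ by a kernel evaluated on pairs, schematically (up to normalization and the exact placement of the multipliers, which follow the paper's definition)
\[
f\mapsto \sum_{1\le i\ne j\le n}\xi_i\, f(X_i,X_j).
\]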