
Sharp Lipschitz constants for the distance ratio metric

Posted by Gendi Wang
Publication date: 2012
Research language: English





We study expansion/contraction properties of some common classes of mappings of the Euclidean space $\mathbb{R}^n$, $n\ge 2$, with respect to the distance ratio metric. The first main case is the behavior of Mobius transformations of the unit ball in $\mathbb{R}^n$ onto itself. In the second main case we study polynomials mapping the unit disk onto a subdomain of the complex plane. In both cases sharp Lipschitz constants are obtained.
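For context, the distance ratio metric here is presumably the standard one: for a proper subdomain $G \subsetneq \mathbb{R}^n$ and points $x,y \in G$, $$j_G(x,y) = \log\left(1 + \frac{|x-y|}{\min\{d(x,\partial G),\, d(y,\partial G)\}}\right),$$ where $d(x,\partial G)$ is the Euclidean distance from $x$ to the boundary of $G$. A sharp Lipschitz constant for a mapping $f\colon G \to G'$ is then the smallest constant $C$ such that $j_{G'}(f(x),f(y)) \le C\, j_G(x,y)$ for all $x,y \in G$.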


Read also

We study the Lipschitz continuity of Mobius transformations of a punctured disk onto another punctured disk with respect to the distance ratio metric.
Y. Li, S. Ponnusamy, 2014
Suppose that $E$ and $E'$ denote real Banach spaces with dimension at least $2$ and that $D\subset E$ and $D'\subset E'$ are domains. In this paper, we establish, in terms of the $j_D$ metric, a necessary and sufficient condition for the homeomorphism $f\colon E \to E'$ to be FQC. Moreover, we give, in terms of the $j_D$ metric, a sufficient condition for the homeomorphism $f\colon D\to D'$ to be FQC. On the other hand, we show that this condition is not necessary.
The Bohr radius for a class $\mathcal{G}$ consisting of analytic functions $f(z)=\sum_{n=0}^{\infty}a_nz^n$ in the unit disc $\mathbb{D}=\{z\in\mathbb{C}:|z|<1\}$ is the largest $r^*$ such that every function $f$ in the class $\mathcal{G}$ satisfies the inequality \begin{equation*} d\left(\sum_{n=0}^{\infty}|a_nz^n|,\, |f(0)|\right) = \sum_{n=1}^{\infty}|a_nz^n| \leq d(f(0), \partial f(\mathbb{D})) \end{equation*} for all $|z|=r \leq r^*$, where $d$ is the Euclidean distance. In this paper, our aim is to determine the Bohr radius for the classes of analytic functions $f$ satisfying the differential subordination relations $zf'(z)/f(z) \prec h(z)$ and $f(z)+\beta z f'(z)+\gamma z^2 f''(z)\prec h(z)$, where $h$ is the Janowski function. Analogous results are obtained for the classes of $\alpha$-convex functions and typically real functions, respectively. All obtained results are sharp.
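For reference, the Janowski function appearing in these subordinations is typically $h(z) = \dfrac{1+Az}{1+Bz}$ with $-1 \le B < A \le 1$; for example, the choice $A=1$, $B=-1$ maps $\mathbb{D}$ onto the right half-plane $\operatorname{Re} w > 0$ and recovers the classical starlike case.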
Deep metric learning, which learns discriminative features for image clustering and retrieval tasks, has attracted extensive attention in recent years. A number of deep metric learning methods, which ensure that similar examples are mapped close to each other and dissimilar examples are mapped farther apart, have been proposed to construct effective structures for loss functions and have shown promising results. In this paper, unlike approaches that focus on learning loss structures, we propose a robust SNR distance metric based on Signal-to-Noise Ratio (SNR) for measuring the similarity of image pairs for deep metric learning. By exploring our SNR distance metric from the viewpoints of geometry and statistical theory, we show that it can preserve the semantic similarity between image pairs, which justifies its suitability for deep metric learning. Compared with the Euclidean distance metric, our SNR distance metric can further jointly reduce the intra-class distances and enlarge the inter-class distances for learned features. Leveraging our SNR distance metric, we propose Deep SNR-based Metric Learning (DSML) to generate discriminative feature embeddings. In extensive experiments on three widely adopted benchmarks, CARS196, CUB200-2011 and CIFAR10, our DSML has shown its superiority over other state-of-the-art methods. Additionally, we extend our SNR distance metric to deep hashing learning, and conduct experiments on two benchmarks, CIFAR10 and NUS-WIDE, to demonstrate the effectiveness and generality of our SNR distance metric.
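Roughly, the SNR distance between an anchor feature $h_i$ and a compared feature $h_j$ treats $h_j - h_i$ as noise added to the signal $h_i$, so it is presumably of the form $d_{\mathrm{SNR}}(h_i, h_j) = \operatorname{var}(h_j - h_i)/\operatorname{var}(h_i)$ (the exact normalization may differ in the paper); a larger value indicates a noisier, less similar pair.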
We analyze stability of conservative solutions of the Cauchy problem on the line for the (integrated) Hunter-Saxton (HS) equation. Generically, the solutions of the HS equation develop singularities with steep gradients while preserving continuity of the solution itself. In order to obtain uniqueness, one is required to augment the equation itself by a measure that represents the associated energy, and the breakdown of the solution is associated with a complicated interplay where the measure becomes singular. The main result in this paper is the construction of a Lipschitz metric that compares two solutions of the HS equation with the respective initial data. The Lipschitz metric is based on the use of the Wasserstein metric.
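For reference, the Hunter-Saxton equation discussed here is commonly written in differentiated form as $(u_t + u u_x)_x = \tfrac{1}{2}u_x^2$; integrating in $x$ (with a convention for the constant of integration) yields the integrated form, and the energy measure referred to above is the one whose absolutely continuous part has density $u_x^2$.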