
On the stability of persistent entropy and new summary functions for TDA

Posted by: Manuel Soriano-Trigueros
Publication date: 2018
Research field: Informatics Engineering
Paper language: English





Persistent homology and persistent entropy have recently become useful tools for pattern recognition. In this paper, we find requirements under which persistent entropy is stable under small perturbations of the input data and scale invariant. In addition, we describe two new stable summary functions combining persistent entropy and the Betti curve. Finally, we use these summary functions in a material classification task to show their usefulness in machine learning and pattern recognition.
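For concreteness, persistent entropy is typically defined as the Shannon entropy of the normalized bar lengths of a persistence barcode. The following minimal Python sketch computes it under that standard definition; the function name and the toy barcode are illustrative, not taken from the paper.

import math

def persistent_entropy(barcode):
    """Shannon entropy of the normalized lifetimes of a barcode.

    barcode: list of (birth, death) pairs with death > birth.
    Returns -sum_i p_i * log(p_i), where p_i = l_i / L,
    l_i = death_i - birth_i and L = sum_i l_i.
    """
    lifetimes = [d - b for (b, d) in barcode]
    total = sum(lifetimes)
    return -sum((l / total) * math.log(l / total) for l in lifetimes)

# Toy example: three intervals, one much more persistent than the others.
print(persistent_entropy([(0.0, 1.0), (0.1, 0.3), (0.2, 0.4)]))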


Read also

Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters belong to $(0,1)$. Furthermore, the estimates are shown to be sharp up to absolute constants.
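As background, and using the standard definitions rather than anything quoted from the paper: for a random vector $X$ on $\mathbb{R}^d$ with density $f$, the order-$\alpha$ Rényi entropy and the associated entropy power are

$$h_\alpha(X) = \frac{1}{1-\alpha} \log \int_{\mathbb{R}^d} f(x)^\alpha \, dx, \qquad N_\alpha(X) = \exp\!\left(\tfrac{2}{d}\, h_\alpha(X)\right),$$

with $h_\alpha \to h$ (the Shannon differential entropy) as $\alpha \to 1$. A Rényi entropy power inequality in this regime is a lower bound on $N_\alpha(X+Y)$ in terms of $N_\alpha(X) + N_\alpha(Y)$, up to an absolute constant.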
In 2020, Budaghyan, Helleseth and Kaleyski [IEEE TIT 66(11): 7081-7087, 2020] considered an infinite family of quadrinomials over $\mathbb{F}_{2^{n}}$ of the form $x^3+a(x^{2^s+1})^{2^k}+bx^{3\cdot 2^m}+c(x^{2^{s+m}+2^m})^{2^k}$, where $n=2m$ with $m$ odd. They proved that such quadrinomials provide new almost perfect nonlinear (APN) functions when $\gcd(3,m)=1$, $k=0$, and $(s,a,b,c)=(m-2,\omega,\omega^2,1)$ or $((m-2)^{-1} \bmod n,\omega,\omega^2,1)$, where $\omega\in\mathbb{F}_4\setminus\mathbb{F}_2$. By taking $a=\omega$ and $b=c=\omega^2$, we observe that such quadrinomials can be rewritten as $a\,\mathrm{Tr}^{n}_{m}(bx^3)+a^q\,\mathrm{Tr}^{n}_{m}(cx^{2^s+1})$, where $q=2^m$ and $\mathrm{Tr}^n_{m}(x)=x+x^{2^m}$ for $n=2m$. Inspired by the quadrinomials and our observation, in this paper we study a class of functions of the form $f(x)=a\,\mathrm{Tr}^{n}_{m}(F(x))+a^q\,\mathrm{Tr}^{n}_{m}(G(x))$ and determine the APN-ness of this new kind of function, where $a \in \mathbb{F}_{2^n}$ is such that $a+a^q \neq 0$, and both $F$ and $G$ are quadratic functions over $\mathbb{F}_{2^n}$. We first obtain a characterization of the conditions under which $f(x)$ is an APN function. With the help of this characterization, we obtain an infinite family of APN functions for $n=2m$ with $m$ an odd positive integer: $f(x)=a\,\mathrm{Tr}^{n}_{m}(bx^3)+a^q\,\mathrm{Tr}^{n}_{m}(b^3x^9)$, where $a\in \mathbb{F}_{2^n}$ is such that $a+a^q \neq 0$ and $b$ is a non-cube in $\mathbb{F}_{2^n}$.
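As a sanity-check companion to the APN discussion: a function over $\mathbb{F}_{2^n}$ is APN exactly when, for every nonzero difference $a$, the map $x \mapsto f(x+a)+f(x)$ is two-to-one. The brute-force Python sketch below tests this for small $n$; the helper names, the choice of irreducible modulus, and the Gold-function example $x^3$ are my own illustrations, not constructions from the paper.

def gf_mul(x, y, n, mod_poly):
    # Carry-less (GF(2)) multiplication of x and y, reduced modulo
    # the degree-n irreducible polynomial encoded in mod_poly.
    r = 0
    while y:
        if y & 1:
            r ^= x
        y >>= 1
        x <<= 1
        if x >> n:            # degree reached n: reduce
            x ^= mod_poly
    return r

def is_apn(f, n):
    # APN check via differential uniformity: for every a != 0 the
    # derivative x -> f(x ^ a) ^ f(x) must hit no value more than twice.
    size = 1 << n
    for a in range(1, size):
        counts = {}
        for x in range(size):
            d = f(x ^ a) ^ f(x)
            counts[d] = counts.get(d, 0) + 1
        if max(counts.values()) > 2:
            return False
    return True

# Example over GF(2^5) with modulus x^5 + x^2 + 1 (0b100101):
# the Gold function x^3 is APN over GF(2^n) for every n, so expect True.
N, MOD = 5, 0b100101
cube = lambda x: gf_mul(gf_mul(x, x, N, MOD), x, N, MOD)
print(is_apn(cube, N))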
Compressive sensing relies on the sparse prior imposed on the signal of interest to solve the ill-posed recovery problem in an under-determined linear system. The objective function used to enforce the sparse prior should be both effective and easily optimizable. Motivated by the entropy concept from information theory, in this paper we propose the generalized Shannon entropy function and Rényi entropy function of the signal as sparsity-promoting regularizers. Both entropy functions are nonconvex and non-separable; their local minima occur only on the boundaries of the orthants in the Euclidean space. Compared to other popular objective functions, minimizing the generalized entropy functions adaptively promotes multiple high-energy coefficients while suppressing the remaining low-energy coefficients. The corresponding optimization problems can be recast into a series of reweighted $\ell_1$-norm minimization problems and then solved efficiently by adapting FISTA. Sparse signal recovery experiments on both simulated and real data show that the proposed entropy-function minimization approaches outperform other popular approaches and achieve state-of-the-art performance.
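The reduction to reweighted $\ell_1$-norm minimization can be sketched generically. The Python outline below pairs a plain ISTA inner solver (rather than FISTA) with the classical $w_i = 1/(|x_i|+\epsilon)$ reweighting rule as a stand-in, since the exact weights derived from the generalized entropy functions are specific to the paper; all names and parameter values here are illustrative.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the weighted l1 norm (elementwise).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def reweighted_l1(A, y, lam=0.1, outer=5, inner=200, eps=1e-3):
    # Each outer pass solves a weighted l1 problem
    #   min_x 0.5*||Ax - y||^2 + lam * sum_i w_i |x_i|
    # by ISTA, then updates the weights at the new iterate.
    m, n = A.shape
    x = np.zeros(n)
    w = np.ones(n)
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    for _ in range(outer):
        for _ in range(inner):
            grad = A.T @ (A @ x - y)
            x = soft_threshold(x - grad / L, lam * w / L)
        w = 1.0 / (np.abs(x) + eps)      # placeholder reweighting rule
    return x

# Quick demo on a synthetic 3-sparse signal:
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
x_hat = reweighted_l1(A, y=A @ x_true)
print(np.round(x_hat[[5, 37, 80]], 2))   # close to the planted values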
The paper establishes the equality condition in the I-MMSE proof of the entropy power inequality (EPI). This is done by deriving an exact expression for the deficit between the two sides of the EPI. Interestingly, a necessary condition for equality is established by making a connection to the famous Cauchy functional equation.
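For reference, the classical statement being analyzed (standard material, not quoted from the paper): for independent random vectors $X$ and $Y$ in $\mathbb{R}^n$,

$$N(X+Y) \ge N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},$$

with equality if and only if $X$ and $Y$ are Gaussian with proportional covariance matrices; the deficit expression quantifies how far a given pair is from this equality case.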
Eshed Ram, Igal Sason (2016)
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values in $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound on the order-$\alpha$ Rényi entropy power of $S_n$ that, up to a multiplicative constant (which may depend in general on $n$, $\alpha$, and $d$), is equal to the sum of the order-$\alpha$ Rényi entropy powers of the $n$ random vectors $\{X_k\}_{k=1}^n$. For $\alpha=1$, the R-EPI coincides with the well-known entropy power inequality by Shannon. The first improved R-EPI is obtained by tightening the recent R-EPI by Bobkov and Chistyakov, which relies on the sharpened Young's inequality. A further improvement of the R-EPI also relies on convex optimization and results on rank-one modifications of a real-valued diagonal matrix.
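Schematically, with $N_\alpha$ the order-$\alpha$ Rényi entropy power as above, an R-EPI asserts a bound of the form

$$N_\alpha\Big(\sum_{k=1}^{n} X_k\Big) \ge c \sum_{k=1}^{n} N_\alpha(X_k),$$

where the constant $c$ (in general depending on $n$, $\alpha$, and $d$) is exactly what the improvements in this line of work tighten.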