Persistent homology and persistent entropy have recently become useful tools for pattern recognition. In this paper, we establish conditions under which persistent entropy is stable to small perturbations of the input data and is scale invariant. In addition, we describe two new stable summary functions combining persistent entropy and the Betti curve. Finally, we use these summary functions in a material classification task to show their usefulness in machine learning and pattern recognition.
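For reference, persistent entropy has a standard definition in the literature: given a persistence barcode with bars of lengths $\ell_1, \dots, \ell_n$, it is the Shannon entropy of the normalized bar lengths (the notation $E$ here is ours, not necessarily the paper's):
$$
E \;=\; -\sum_{i=1}^{n} \frac{\ell_i}{L} \log \frac{\ell_i}{L}, \qquad L = \sum_{i=1}^{n} \ell_i .
$$
Scale invariance is plausible under this definition because rescaling the input multiplies every $\ell_i$ by the same factor, leaving the ratios $\ell_i / L$ unchanged.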
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to Fradelizi, Madiman, and Wang, the authors derive Rényi entropy power inequalities for log-concave random vectors when the Rényi parameters bel
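As background (notation ours), the Rényi entropy of order $\alpha$ of a random vector $X$ on $\mathbb{R}^d$ with density $f$ is
$$
h_\alpha(X) \;=\; \frac{1}{1-\alpha} \log \int_{\mathbb{R}^d} f(x)^{\alpha} \, dx ,
$$
for $\alpha \in (0,1) \cup (1,\infty)$, with the Shannon entropy $h(X) = -\int f \log f$ recovered in the limit $\alpha \to 1$.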
In 2020, Budaghyan, Helleseth and Kaleyski [IEEE TIT 66(11): 7081-7087, 2020] considered an infinite family of quadrinomials over $\mathbb{F}_{2^{n}}$ of the form $x^3+a(x^{2^s+1})^{2^k}+bx^{3\cdot 2^m}+c(x^{2^{s+m}+2^m})^{2^k}$, where $n=2m$ with $m$
Compressive sensing relies on the sparse prior imposed on the signal of interest to solve the ill-posed recovery problem in an under-determined linear system. The objective function used to enforce the sparse prior information should be both effectiv
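A standard example of such an objective (cited here for context; it is not necessarily the one this paper advocates) is the $\ell_1$ norm of basis pursuit, a convex surrogate for the nonconvex $\ell_0$ count of nonzeros:
$$
\min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{subject to} \quad y = Ax ,
$$
where $A \in \mathbb{R}^{m \times n}$ with $m < n$ is the measurement matrix and $y \in \mathbb{R}^m$ the observations.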
The paper establishes the equality condition in the I-MMSE proof of the entropy power inequality (EPI). This is done by establishing an exact expression for the deficit between the two sides of the EPI. Interestingly, a necessary condition for the eq
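For context, the EPI referred to is the classical Shannon inequality: for independent random vectors $X$ and $Y$ on $\mathbb{R}^d$ with densities,
$$
N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) := \frac{1}{2\pi e}\, e^{2 h(X)/d},
$$
with equality exactly when $X$ and $Y$ are Gaussian with proportional covariance matrices.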
This paper gives improved Rényi entropy power inequalities (R-EPIs). Consider a sum $S_n = \sum_{k=1}^n X_k$ of $n$ independent continuous random vectors taking values on $\mathbb{R}^d$, and let $\alpha \in [1, \infty]$. An R-EPI provides a lower bound
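Writing $N_\alpha(X) := e^{2 h_\alpha(X)/d}$ for the Rényi entropy power, R-EPIs of this kind are typically stated in the schematic form (the constant is what such results sharpen; this is a generic template, not this paper's exact theorem)
$$
N_\alpha(S_n) \;\ge\; c \sum_{k=1}^{n} N_\alpha(X_k),
$$
for an explicit constant $c = c(\alpha, n) \in (0, 1]$.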