Let $T_{\epsilon}$ be the noise operator acting on Boolean functions $f:\{0,1\}^n\to\{0,1\}$, where $\epsilon\in[0,1/2]$ is the noise parameter. Given $\alpha>1$ and fixed mean $\mathbb{E} f$, which Boolean function $f$ has the largest $\alpha$-th moment $\mathbb{E}(T_\epsilon f)^\alpha$? This question has close connections with the noise stability of Boolean functions, the problem of non-interactive correlation distillation, and the Courtade-Kumar conjecture on the most informative Boolean function. In this paper, we characterize the maximizers in several extremal settings, such as low noise ($\epsilon=\epsilon(n)$ close to 0), high noise ($\epsilon=\epsilon(n)$ close to $1/2$), and large $\alpha=\alpha(n)$. Analogous results are also established in more general contexts, such as Boolean functions defined on the discrete torus $(\mathbb{Z}/p\mathbb{Z})^n$ and the problem of noise stability in a tree model.
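To make the quantity in question concrete, the following is a minimal sketch (not from the paper) that computes $\mathbb{E}(T_\epsilon f)^\alpha$ exactly by enumeration for small $n$, using the standard definition $T_\epsilon f(x)=\mathbb{E}[f(y)]$ where $y$ is obtained from $x$ by flipping each coordinate independently with probability $\epsilon$. The function names (`alpha_moment`, `dictator`, `majority`) and the comparison of a dictator against 3-bit majority at $\epsilon=0.1$, $\alpha=2$ are illustrative choices, not taken from the source.

```python
import itertools

def transition_prob(x, y, eps):
    """P(y | x) when each bit of x flips independently with probability eps."""
    flips = sum(xi != yi for xi, yi in zip(x, y))
    return (eps ** flips) * ((1 - eps) ** (len(x) - flips))

def T_eps(f, x, eps, n):
    """(T_eps f)(x) = E[f(y)] over the noisy copy y of x."""
    return sum(transition_prob(x, y, eps) * f(y)
               for y in itertools.product((0, 1), repeat=n))

def alpha_moment(f, eps, alpha, n):
    """E_x[(T_eps f)(x)^alpha] with x uniform on {0,1}^n."""
    pts = list(itertools.product((0, 1), repeat=n))
    return sum(T_eps(f, x, eps, n) ** alpha for x in pts) / len(pts)

# Two balanced functions on {0,1}^3 with the same mean E f = 1/2:
dictator = lambda x: x[0]              # f(x) = x_1
majority = lambda x: int(sum(x) >= 2)  # Maj_3

print(alpha_moment(dictator, 0.1, 2, 3))  # 0.41
print(alpha_moment(majority, 0.1, 2, 3))
```

For $\alpha=2$ this moment is, up to an affine transformation, the noise stability of $f$, which is one way the question above connects to the classical stability literature.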