In this paper we study bounds for the total variation distance between the distributions of two second-degree polynomials in normal random variables, provided that they essentially depend on at least three variables.
Mixtures of high dimensional Gaussian distributions have been studied extensively in statistics and learning theory. While the total variation distance appears naturally in the sample complexity of distribution learning, it is analytically difficult to obtain tight lower bounds for mixtures. Exploiting a connection between total variation distance and the characteristic function of the mixture, we provide fairly tight functional approximations. This enables us to derive new lower bounds on the total variation distance between pairs of two-component Gaussian mixtures that have a shared covariance matrix.
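As a point of reference for the quantity being bounded, the following toy one-dimensional sketch (illustrative only, not the characteristic-function argument of the paper; the helper names are hypothetical) evaluates the total variation distance between two mixtures with a shared variance as half the $L^1$-distance between their densities, computed by direct quadrature:

    from scipy.integrate import quad
    from scipy.stats import norm

    def mixture_pdf(x, weights, means, sigma):
        # Density of a Gaussian mixture with component weights `weights`,
        # component means `means`, and a shared standard deviation `sigma`.
        return sum(w * norm.pdf(x, loc=m, scale=sigma) for w, m in zip(weights, means))

    def tv_distance(weights1, means1, weights2, means2, sigma, lo=-20.0, hi=20.0):
        # TV(P, Q) = (1/2) * integral of |p(x) - q(x)| dx, evaluated by numerical
        # quadrature over a window [lo, hi] wide enough to hold essentially all the mass.
        integrand = lambda x: abs(mixture_pdf(x, weights1, means1, sigma)
                                  - mixture_pdf(x, weights2, means2, sigma))
        value, _ = quad(integrand, lo, hi, limit=200)
        return 0.5 * value

    # Two two-component mixtures with unit variance and slightly different means.
    print(tv_distance([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5], [-1.2, 1.2], sigma=1.0))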
The paper provides an estimate of the total variation distance between the distributions of polynomials defined on a space equipped with a logarithmically concave measure, in terms of the $L^2$-distance between these polynomials.
We consider a random walk on the hyperoctahedral group $B_n$ generated by the signed permutations of the forms $(i,n)$ and $(-i,n)$ for $1\leq i\leq n$. We call this the flip-transpose top with random shuffle on $B_n$. We find the spectrum of the transition probability matrix for this shuffle. We prove that the mixing time for this shuffle is of order $n\log n$. We also show that this shuffle exhibits the cutoff phenomenon. In the appendix, we show that a similar random walk on the demihyperoctahedral group $D_n$ also has a cutoff at $\left(n-\frac{1}{2}\right)\log n$.
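To make the walk concrete for small $n$, here is a brute-force sketch (illustrative only; the helper names are hypothetical, and it assumes the convention that $(i,n)$ swaps the entries in positions $i$ and $n$, that $(-i,n)$ performs the same swap while negating both moved entries, and that each of the $2n$ generators is chosen with equal probability). It computes the exact distribution of the walk after $t$ steps and its total variation distance to the uniform distribution on $B_n$:

    import itertools
    from collections import defaultdict

    def b_n_elements(n):
        # All elements of B_n, encoded as tuples whose absolute values form a permutation of 1..n.
        for perm in itertools.permutations(range(1, n + 1)):
            for signs in itertools.product((1, -1), repeat=n):
                yield tuple(s * p for s, p in zip(signs, perm))

    def apply_generator(w, i, flip, n):
        # Apply (i,n) (flip=False) or (-i,n) (flip=True) under the assumed convention:
        # swap positions i and n; for (-i,n) also negate both moved entries.
        # i = n gives the identity (flip=False) or a sign change in position n (flip=True).
        w = list(w)
        a, b = w[i - 1], w[n - 1]
        if flip:
            a, b = -a, -b
        w[i - 1], w[n - 1] = b, a
        return tuple(w)

    def tv_to_uniform(n, steps):
        # Exact total variation distance to uniform on B_n after `steps` moves, started at the identity.
        group = list(b_n_elements(n))
        size = len(group)                         # |B_n| = 2^n * n!
        gens = [(i, flip) for i in range(1, n + 1) for flip in (False, True)]
        dist = {tuple(range(1, n + 1)): 1.0}      # point mass at the identity
        for _ in range(steps):
            new = defaultdict(float)
            for w, p in dist.items():
                for i, flip in gens:
                    new[apply_generator(w, i, flip, n)] += p / len(gens)
            dist = new
        return 0.5 * sum(abs(dist.get(g, 0.0) - 1.0 / size) for g in group)

    for t in (1, 5, 10, 20):
        print(t, round(tv_to_uniform(3, t), 4))

For $n=3$ the group has $2^3\cdot 3!=48$ elements, so the exact computation is immediate; the decay of the printed distances is the quantity whose $n\log n$ scaling and cutoff the paper analyzes.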
In this paper, we investigate the properties of a random walk on the alternating group $A_n$ generated by $3$-cycles of the form $(i,n-1,n)$ and $(i,n,n-1)$. We call this the transpose top-$2$ with random shuffle. We find the spectrum of the transition matrix of this shuffle. We show that the mixing time is of order $\left(n-\frac{3}{2}\right)\log n$ and prove that there is a total variation cutoff for this shuffle.
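Both of the shuffles above are asserted to exhibit a total variation cutoff; for reference, the standard definition (not specific to these papers) is the following. Writing $d_n(t)=\bigl\|P_n^{t}(\mathrm{id},\cdot)-\pi_n\bigr\|_{\mathrm{TV}}$ for the distance to the stationary distribution $\pi_n$ after $t$ steps, a sequence of chains has a cutoff at $t_n$ if, for every $\varepsilon>0$,
\[
\lim_{n\to\infty} d_n\bigl(\lceil(1+\varepsilon)t_n\rceil\bigr)=0
\qquad\text{and}\qquad
\lim_{n\to\infty} d_n\bigl(\lfloor(1-\varepsilon)t_n\rfloor\bigr)=1 .
\]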
It is shown that functions defined on $\{0,1,\ldots,r-1\}^n$ satisfying certain conditions of bounded differences that guarantee sub-Gaussian tail behavior also satisfy a much stronger ``local'' sub-Gaussian property. For self-bounding and configuration functions we derive analogous locally subexponential behavior. The key tool is Talagrand's [Ann. Probab. 22 (1994) 1576--1587] variance inequality for functions defined on the binary hypercube, which we extend to functions of uniformly distributed random variables defined on $\{0,1,\ldots,r-1\}^n$ for $r\ge 2$.
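For context on the sub-Gaussian tail behavior referred to above, the classical bounded differences inequality (stated here only for orientation, not the stronger local property established in the paper) reads: if $X=(X_1,\ldots,X_n)$ is uniformly distributed on $\{0,1,\ldots,r-1\}^n$ and $|f(x)-f(x')|\le c_i$ whenever $x$ and $x'$ differ only in the $i$-th coordinate, then
\[
\mathbb{P}\bigl(f(X)-\mathbb{E}f(X)\ge t\bigr)\;\le\;\exp\!\left(-\frac{2t^{2}}{\sum_{i=1}^{n}c_i^{2}}\right),\qquad t>0 .
\]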