The problem of constructing pseudorandom generators that fool halfspaces has been studied intensively in recent times. For fooling halfspaces over the hypercube with polynomially small error, the best construction known requires seed-length $O(\log^2 n)$ (MekaZ13). Getting the seed-length down to $O(\log n)$ is a natural challenge in its own right, which needs to be overcome in order to derandomize RL. In this work we make progress towards this goal by obtaining near-optimal generators for two important special cases: 1) We give a near-optimal derandomization of the Chernoff bound for independent, uniformly random bits. Specifically, we show how to generate $x \in \{1,-1\}^n$ using $\tilde{O}(\log(n/\epsilon))$ random bits such that for any unit vector $u$, $\langle u, x\rangle$ matches the sub-Gaussian tail behaviour predicted by the Chernoff bound up to error $\epsilon$. 2) We construct a generator which fools halfspaces with $\{0,1,-1\}$ coefficients with error $\epsilon$ and seed-length $\tilde{O}(\log(n/\epsilon))$. This includes the important special case of majorities. In both cases, the best previous results required seed-length $O(\log n + \log^2(1/\epsilon))$. Technically, our work combines new Fourier-analytic tools with the iterative dimension-reduction techniques and the gradually increasing independence paradigm of previous works (KaneMN11, CelisRSW13, GopalanMRTV12).
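For context on the tail condition in 1): for truly uniform $x \in \{1,-1\}^n$ and any unit vector $u$, the Chernoff--Hoeffding bound gives $\Pr[\langle u, x\rangle \ge t] \le e^{-t^2/2}$. One natural way to read the derandomized guarantee (our paraphrase of the abstract, not a formula quoted from the paper) is that the generator $G$ reproduces this sub-Gaussian tail up to a constant in the exponent and an additive $\epsilon$,
$$\Pr_{x \sim G}\bigl[\langle u, x\rangle \ge t\bigr] \;\le\; e^{-\Omega(t^2)} + \epsilon \qquad \text{for every unit vector } u \in \mathbb{R}^n \text{ and every } t \ge 0,$$
while using only $\tilde{O}(\log(n/\epsilon))$ truly random bits.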
In 1992 Mansour proved that every size-$s$ DNF formula is Fourier-concentrated on $s^{O(\log\log s)}$ coefficients. We improve this to $s^{O(\log\log k)}$ where $k$ is the read number of the DNF. Since $k$ is always at most $s$, our bound matches Mansour
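For reference, the standard notion of Fourier concentration being invoked (a textbook definition; the $\epsilon$-dependence of the bounds is suppressed here, as in the abstract) is: a Boolean function $f:\{-1,1\}^n \to \{-1,1\}$ with Fourier expansion $f = \sum_{S \subseteq [n]} \hat{f}(S)\,\chi_S$ is $\epsilon$-concentrated on a collection $\mathcal{S}$ of subsets if
$$\sum_{S \notin \mathcal{S}} \hat{f}(S)^2 \;\le\; \epsilon.$$
Mansour's theorem provides such a collection of size $s^{O(\log\log s)}$ for any size-$s$ DNF (for fixed $\epsilon$), and the improvement here replaces the $s$ inside the $\log\log$ by the read number $k$.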
Higher order random walks (HD-walks) on high dimensional expanders (HDX) have seen an incredible amount of study and application since their introduction by Kaufman and Mass [KM16], yet their broader combinatorial and spectral properties remain poorly understood.
Finding cliques in random graphs and the closely related planted clique variant, where a clique of size t is planted in a random G(n,1/2) graph, have been the focus of substantial study in algorithm design. Despite much effort, the best known polynom
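To fix notation (ours, for concreteness), the planted variant can be described as the following distribution $G(n, 1/2, t)$: sample $G \sim G(n,1/2)$ by including each of the $\binom{n}{2}$ possible edges independently with probability $1/2$, choose a uniformly random set $S \subseteq [n]$ of $t$ vertices, and add all edges between pairs of vertices in $S$. The algorithmic task is to recover the planted clique $S$, or, in the decision version, to distinguish $G(n,1/2,t)$ from $G(n,1/2)$.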
In this note we prove a spectral gap for various Markov chains on various functional spaces. While proving that a spectral gap exists is relatively common, explicit estimates seem somewhat rare. These estimates are then used to apply the concentration
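As a reminder of the quantity in question (one standard $L^2$ formulation; the note itself may work in more general functional spaces): a Markov operator $P$ with invariant probability measure $\pi$ has spectral gap $\gamma > 0$ if
$$\operatorname{Var}_\pi(Pf) \;\le\; (1-\gamma)^2 \operatorname{Var}_\pi(f) \qquad \text{for all } f \in L^2(\pi),$$
i.e. $P$ contracts the subspace of mean-zero functions by a factor $1-\gamma$. An explicit estimate means a concrete lower bound on $\gamma$, which is the kind of quantitative input that concentration inequalities require.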
We develop a notion of {\em inner rank} as a tool for obtaining lower bounds on the rank of matrix multiplication tensors. We use it to give a short proof that the border rank (and therefore rank) of the tensor associated with $n\times n$ matrix multiplication
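For reference (standard definitions, not specific to this note): the $n\times n$ matrix multiplication tensor is
$$\langle n,n,n\rangle \;=\; \sum_{i,j,k=1}^{n} e_{ij}\otimes e_{jk}\otimes e_{ki} \;\in\; \mathbb{C}^{n\times n}\otimes \mathbb{C}^{n\times n}\otimes \mathbb{C}^{n\times n},$$
its rank $R(T)$ is the least $r$ such that $T$ is a sum of $r$ simple tensors $a\otimes b\otimes c$, and its border rank $\underline{R}(T)$ is the least $r$ such that $T$ is a limit of tensors of rank at most $r$. In particular $\underline{R}(T) \le R(T)$, so lower bounds on border rank imply lower bounds on rank.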