Eluder dimension and information gain are two widely used complexity measures in bandit and reinforcement learning. Eluder dimension was originally proposed as a general complexity measure of function classes, but the common examples where it is known to be small are function spaces (vector spaces). In these cases, the primary tool for upper bounding the eluder dimension is the elliptic potential lemma. Interestingly, the elliptic potential lemma also features prominently in the analysis of linear bandits/reinforcement learning and of their nonparametric generalization, the information gain. We show that this is not a coincidence -- eluder dimension and information gain are equivalent in a precise sense for reproducing kernel Hilbert spaces.
We study the relationship between the eluder dimension for a function class and a generalized notion of rank, defined for any monotone activation $\sigma : \mathbb{R} \to \mathbb{R}$, which corresponds to the minimal dimension required to represent the c
Let $X$ be a geodesic metric space with $H_1(X)$ uniformly generated. If $X$ has asymptotic dimension one then $X$ is quasi-isometric to an unbounded tree. As a corollary, we show that the asymptotic dimension of the curve graph of a compact, oriente
The purpose of this note is to record a consequence, for general metric spaces, of a recent result of David Bate. We prove the following fact: Let $X$ be a compact metric space of topological dimension $n$. Suppose that the $n$-dimensional Hausdorff
The results of [I. Ojeda, Amer. Math. Monthly, 122, pp. 60--64] provide a characterization of Kronecker square roots of matrices in terms of the symmetry and rank of the block vec matrix (rearrangement matrix). In this short note we reformulate the c
Estimators for mutual information are typically biased. However, in the case of the Kozachenko-Leonenko estimator for metric spaces, a type of nearest neighbour estimator, it is possible to calculate the bias explicitly.
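As an illustrative sketch of the estimator the last abstract refers to (not code from the paper; the function name `kl_entropy` and the choice of `k` are ours), the basic Kozachenko-Leonenko k-nearest-neighbour entropy estimate in one dimension can be written as follows. Since both arguments of the digamma function are integers here, $\psi(N) - \psi(k)$ is computed as the harmonic sum $\sum_{j=k}^{N-1} 1/j$, avoiding any special-function dependency:

```python
import math
import random

def kl_entropy(points, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats)
    for 1-D samples:
        H_hat = psi(N) - psi(k) + log(c_1) + (1/N) * sum_i log(eps_i),
    where eps_i is the distance from sample i to its k-th nearest
    neighbour and c_1 = 2 is the volume of the 1-D unit ball.
    Brute-force O(N^2 log N) neighbour search, for illustration only."""
    n = len(points)
    # psi(N) - psi(k) as a harmonic sum over integer arguments
    psi_diff = sum(1.0 / j for j in range(k, n))
    log_eps = 0.0
    for i, x in enumerate(points):
        # sorted distances from x to every other sample
        dists = sorted(abs(x - y) for j, y in enumerate(points) if j != i)
        log_eps += math.log(dists[k - 1])
    return psi_diff + math.log(2.0) + log_eps / n
```

On samples drawn uniformly from $[0,1]$ (true differential entropy $0$ nats) the estimate is close to zero for moderate $N$, with the small residual boundary bias that the abstract notes can, for this family of estimators, be worked out explicitly.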