Learning about learning by many-body systems


Abstract

Many-body systems, from soap bubbles to suspensions to polymers, learn the drives that push them far from equilibrium. This learning has been detected via thermodynamic properties, such as work absorption and strain. We progress beyond these macroscopic properties, which were first defined for equilibrium contexts: We quantify statistical mechanical learning with representation learning, a machine-learning model in which information squeezes through a bottleneck. We identify a structural parallel between representation learning and far-from-equilibrium statistical mechanics. Applying this parallel, we measure four facets of many-body systems' learning: classification ability, memory capacity, discrimination ability, and novelty detection. Numerical simulations of a classical spin glass illustrate our technique. This toolkit exposes self-organization that eludes detection by thermodynamic measures, and it detects and quantifies learning by matter more reliably and more precisely.
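As an illustration of the bottleneck idea mentioned above, the following is a minimal sketch of representation learning on spin configurations: a linear autoencoder compresses ±1 configurations into a few latent variables and reconstructs them. The system size, sample count, network form, and training loop are illustrative assumptions, not the paper's actual (variational-autoencoder) setup or spin-glass data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: binary spin configurations (+1/-1) of a small system.
n_spins, n_samples, n_bottleneck = 64, 500, 4
spins = rng.choice([-1.0, 1.0], size=(n_samples, n_spins))

# Linear bottleneck: encode into n_bottleneck latent variables, then decode.
W_enc = rng.normal(scale=0.1, size=(n_spins, n_bottleneck))
W_dec = rng.normal(scale=0.1, size=(n_bottleneck, n_spins))
lr = 1e-3

for epoch in range(200):
    latent = spins @ W_enc            # squeeze information through the bottleneck
    recon = latent @ W_dec            # reconstruct the configuration
    err = recon - spins               # reconstruction error
    # Gradient descent on mean squared reconstruction error.
    grad_dec = latent.T @ err / n_samples
    grad_enc = spins.T @ (err @ W_dec.T) / n_samples
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = np.mean((spins @ W_enc @ W_dec - spins) ** 2)
print(f"reconstruction MSE after training: {mse:.3f}")
```

In this toy setting, how well the bottleneck variables reconstruct the configurations is a stand-in for the memory and classification diagnostics described in the abstract.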
