Quantifying many-body learning far from equilibrium with representation learning

Bibliographic Details
Published in: arXiv.org 2020-04
Main Authors: Zhong, Weishun, Gold, Jacob M., Marzen, Sarah, England, Jeremy L., Yunger Halpern, Nicole
Format: Article
Language: English
Description
Summary: Far-from-equilibrium many-body systems, from soap bubbles to suspensions to polymers, learn the drives that push them. This learning has been observed via thermodynamic properties, such as work absorption and strain. We move beyond these macroscopic properties that were first defined for equilibrium contexts: We quantify statistical mechanical learning with machine learning. Our toolkit relies on a structural parallel that we identify between far-from-equilibrium statistical mechanics and representation learning, which is undergone by neural networks that contain bottlenecks, including variational autoencoders. We train a variational autoencoder, via unsupervised learning, on configurations assumed by a many-body system during strong driving. We analyze the neural network's bottleneck to measure the many-body system's classification ability, memory capacity, discrimination ability, and novelty detection. Numerical simulations of a spin glass illustrate our technique. This toolkit exposes self-organization that eludes detection by thermodynamic measures, more reliably and more precisely identifying and quantifying learning by matter.
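The pipeline the abstract describes, namely training a variational autoencoder on configurations of a driven many-body system and then probing the bottleneck, can be sketched minimally. The toy NumPy snippet below is an illustration only and not the authors' implementation: the spin data, weight matrices, and dimensions are hypothetical stand-ins, no training is performed, and it shows only the bottleneck structure (encoder to latent mean and log-variance, reparameterized sample, decoder) together with the two terms of the evidence lower bound (ELBO) that such a model optimizes.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SPINS = 16   # spins per toy "many-body" configuration (assumption)
LATENT = 2     # bottleneck dimension (assumption)

# Hypothetical stand-in for configurations sampled during strong driving:
# 100 snapshots of +/-1 Ising-like spins.
configs = rng.choice([-1.0, 1.0], size=(100, N_SPINS))

# Randomly initialized linear encoder/decoder weights (structure only).
W_enc = rng.normal(scale=0.1, size=(N_SPINS, 2 * LATENT))
W_dec = rng.normal(scale=0.1, size=(LATENT, N_SPINS))

def encode(x):
    """Map configurations to bottleneck mean and log-variance."""
    h = x @ W_enc
    return h[:, :LATENT], h[:, LATENT:]

def reparameterize(mu, logvar):
    """Sample latent codes z = mu + sigma * eps (reparameterization trick)."""
    eps = rng.normal(size=mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def elbo_terms(x):
    """Return reconstruction error and KL divergence to the unit Gaussian."""
    mu, logvar = encode(x)
    z = reparameterize(mu, logvar)
    recon = np.tanh(z @ W_dec)  # decoder output in [-1, 1], like spins
    recon_err = np.mean((x - recon) ** 2)
    kl = -0.5 * np.mean(1 + logvar - mu**2 - np.exp(logvar))
    return recon_err, kl

recon_err, kl = elbo_terms(configs)
print(f"reconstruction error = {recon_err:.3f}, KL term = {kl:.3f}")
```

In the paper's framing, the interesting object is the trained bottleneck: the latent codes `z` for new drive configurations are what one would analyze to quantify classification, memory, discrimination, and novelty detection. A practical implementation would use a deep-learning framework with nonlinear encoder/decoder networks and gradient-based training of the ELBO.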
ISSN:2331-8422