Thermodynamic Machine Learning Through Maximum Work Production



Alec Boyd, University of California at Davis


2020.09.22 10:00-11:00





Meeting ID: 965-028-77134

Password: 980552


Adaptive thermodynamic systems—such as a biological organism attempting to gain survival advantage, an autonomous robot performing a functional task, or a motor protein transporting intracellular nutrients—can improve their performance by effectively modeling the regularities and stochasticity in their environments. Analogously, but in a purely computational realm, machine learning algorithms seek to estimate models that capture predictable structure and identify irrelevant noise in training data by optimizing performance measures, such as a model’s log-likelihood of having generated the data. Is there a sense in which these computational models are physically preferred? For adaptive physical systems we introduce the organizing principle that thermodynamic work is the most relevant performance measure of advantageously modeling an environment. Specifically, a physical agent’s model determines how much useful work it can harvest from an environment. We show that when such agents maximize work production they also maximize their environmental model’s log-likelihood, establishing an equivalence between thermodynamics and learning. In this way, work maximization appears as an organizing principle that underlies learning in adaptive thermodynamic systems.
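
The claimed equivalence between work maximization and log-likelihood maximization can be illustrated with a toy calculation. This is a hedged sketch, not the talk's actual derivation: it assumes a memoryless binary environment with bias `p` and uses the standard mismatch-cost form of extractable work, in which an agent modeling the bit with probability `q` extracts on average W(q) = kT(ln 2 + p ln q + (1-p) ln(1-q)) per bit. Since the q-dependent part is exactly the expected log-likelihood, maximizing work over q recovers the environment's true statistics.

```python
import math

kT = 1.0  # work in units of kT; illustrative choice

def avg_work(p, q):
    """Average extractable work per bit for an agent whose model assigns
    probability q to the symbol that the environment emits with probability p.
    Sketch formula: W(q) = kT * (ln 2 + p*ln q + (1-p)*ln(1-q)).
    The q-dependent term is the expected log-likelihood of the model."""
    return kT * (math.log(2) + p * math.log(q) + (1 - p) * math.log(1 - q))

p = 0.8  # hypothetical environmental bias
candidates = [i / 100 for i in range(1, 100)]

# The work-maximizing model coincides with the maximum-likelihood model: q = p.
best_q = max(candidates, key=lambda q: avg_work(p, q))
print(best_q)  # -> 0.8
```

Here the agent that "learns" the environment best (q = p) is also the one that harvests the most work, which is the sense in which the abstract describes work production as a physically grounded performance measure for learning.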


After completing his PhD in physics with Prof. James P. Crutchfield at UC Davis, Alec Boyd was awarded the Templeton World Charity Foundation independent research fellowship in the Power of Information. Under this fellowship, Alec studied the thermodynamics of complex information processing with Prof. Mile Gu at the Complexity Institute at Nanyang Technological University. He is now starting an appointment at the California Institute of Technology, collaborating with Prof. Michael Roukes on experimental tests of the thermodynamics of complexity at the nanoscale.