Machine Learning through the Information Bottleneck

When:
Friday, February 7, 2020, 11:00 am - 12:00 pm PST
Where:
10th Floor-CR#1016
This event is open to the public.
Type:
NL Seminar
Speaker:
Artemy Kolchinsky (Santa Fe Institute)
Video Recording:
https://bluejeans.com/298422226
Description:

Abstract: The information bottleneck (IB) has been proposed as a principled way to compress a random variable while preserving only the information that is relevant for predicting another random variable. More recently, the IB has been proposed, and challenged, as a theoretical framework for understanding why and how deep learning architectures achieve good performance. I will cover: (1) an introduction to the ideas behind the IB; (2) methods for implementing information-theoretic compression in neural networks, along with some possible applications of such methods; (3) the current status of the IB theory of deep learning; and (4) recently discovered caveats that arise for the IB in machine learning scenarios.
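
For context, the trade-off the abstract describes in words corresponds to the standard IB objective (Tishby, Pereira & Bialek, 1999): given an input X and a prediction target Y, one seeks a compressed representation T of X by solving

    min_{p(t|x)}  I(X;T) - β I(T;Y)

where I(·;·) denotes mutual information and β ≥ 0 controls the trade-off between compressing X (small I(X;T)) and retaining information relevant for predicting Y (large I(T;Y)).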

Biography: Artemy Kolchinsky is a postdoctoral fellow at the Santa Fe Institute (Santa Fe, NM). His work lies at the intersection of information theory, statistical physics, and machine learning. He is interested in using tools from statistical physics to derive fundamental bounds on the ability of real-world agents (whether protocells, organisms, or computers) to acquire and exploit information in adaptive ways.
