Seminars and Events

Artificial Intelligence Seminar

Learning to Continually Learn

Event Details

A dominant trend in machine learning is that hand-designed pipelines are replaced by higher-performing learned pipelines once sufficient compute and data are available. I argue that this trend will apply to machine learning itself, and thus that the fastest path to truly powerful AI is to create AI-generating algorithms (AI-GAs) that on their own *learn* to solve the hardest AI problems. This paradigm is an all-in bet on meta-learning. After introducing these ideas, the talk focuses on one example of this paradigm: Learning to Continually Learn. Catastrophic forgetting is a longstanding Achilles' heel of machine learning, wherein our systems learn new tasks by overwriting their knowledge of how to solve previous tasks. To produce agents that can continually learn, we must prevent catastrophic forgetting. I will describe A Neuromodulated Meta-Learning algorithm (ANML), which uses meta-learning to try to solve catastrophic forgetting, producing state-of-the-art results.
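To give a flavor of the neuromodulation idea behind ANML, the sketch below shows a toy version of the core mechanism: a separate neuromodulatory network produces a sigmoid gate that multiplies the prediction network's activations, so that where the gate is near zero, both the forward signal and the backpropagated gradient into those weights are suppressed, shielding old knowledge from being overwritten. This is a minimal NumPy illustration under assumed toy dimensions, not the paper's architecture or training loop (the real method meta-learns the gating network in an outer loop).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes -- for illustration only.
D_IN, D_HID, D_OUT = 4, 8, 2

# Prediction network weights (updated when learning a new task).
W1 = rng.normal(size=(D_IN, D_HID))
W2 = rng.normal(size=(D_HID, D_OUT))

# Neuromodulatory network weights (frozen during task learning;
# in ANML these are meta-learned in an outer loop).
M1 = rng.normal(size=(D_IN, D_HID))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass: the gate multiplies the hidden activations element-wise.
x = rng.normal(size=(D_IN,))
h = np.maximum(0.0, x @ W1)          # prediction-net hidden activations (ReLU)
gate = sigmoid(x @ M1)               # per-unit gate in (0, 1)
h_gated = h * gate                   # gated activations
y_hat = h_gated @ W2                 # network output

# Backward pass into W1's activations: by the chain rule the same gate
# scales the gradient, so near-zero gate values suppress weight updates
# for those units, protecting previously learned behavior.
grad_out = np.ones(D_OUT)                     # stand-in upstream gradient
grad_h = (grad_out @ W2.T) * gate * (h > 0)   # gate attenuates learning signal
```

The key design point is that a single learned signal selectively switches plasticity on and off per input, rather than relying on a hand-designed rule for which weights to protect.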

Speaker Bio

Jeff Clune is an associate professor of computer science at the University of British Columbia and a research manager at OpenAI. Before that, he was a Senior Research Manager and founding member of Uber AI Labs, which was formed after Uber acquired a startup he helped lead. Jeff focuses on deep learning, deep reinforcement learning, and robotics. Prior to Uber, he was the Loy and Edith Harris Associate Professor in Computer Science at the University of Wyoming. Before that, he was a Research Scientist at Cornell University and received degrees from Michigan State University (PhD, master’s) and the University of Michigan (bachelor’s). More on Jeff’s research can be found at JeffClune.com or on Twitter (@jeffclune).