Machine Learning at the Flatiron Institute Seminar: Mengye Ren

Title: Lifelong and Human-like Learning in Foundation Models

Abstract: Real-world agents, including humans, learn from online, lifelong experiences. However, today’s foundation models primarily acquire knowledge through offline, i.i.d. learning, while relying on in-context learning for most online adaptation. It is crucial to equip foundation models with lifelong and human-like learning abilities to enable more flexible use of AI in real-world applications. In this talk, I will discuss recent work exploring interesting phenomena in foundation models when learning in online, structured environments. Notably, foundation models exhibit anticipatory and semantically aware memorization and forgetting behaviors. Furthermore, I will introduce a new method that combines pretraining and meta-learning for learning and consolidating new concepts in large language models. This approach has the potential to lead to future foundation models with incremental consolidation and abstraction capabilities.
