
Mehrdad Farajtabar

Google DeepMind

Title: Dealing with Catastrophic Forgetting in Continual Learning of Neural Networks
Abstract

Neural networks are achieving state-of-the-art, and sometimes superhuman, performance on learning tasks across a variety of domains. Whenever these problems require learning in a continual or sequential manner, however, neural networks suffer from catastrophic forgetting: they forget how to solve previous tasks after being trained on a new task, despite having sufficient capacity to solve both tasks if trained on them simultaneously. In this talk, we introduce this phenomenon and propose several methods to address it from a variety of perspectives, ranging from regularization in the parameter space to optimization and loss-landscape approaches.
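To make the parameter-space regularization idea concrete, below is a minimal, hedged sketch of one well-known approach in that family (an EWC-style quadratic penalty); it is not necessarily the speaker's method, and all function and variable names (e.g. quadratic_forgetting_penalty) are illustrative. The penalty discourages parameters from drifting far from the values learned on a previous task, weighted by an estimate of how important each parameter was for that task.

```python
# Sketch of a parameter-space regularizer against forgetting (EWC-style).
# Illustrative only; names and the importance estimate are placeholders.
import torch
import torch.nn as nn

def quadratic_forgetting_penalty(model, old_params, importance, strength=1.0):
    """Importance-weighted squared distance to the previous task's parameters."""
    penalty = torch.tensor(0.0)
    for name, p in model.named_parameters():
        penalty = penalty + (importance[name] * (p - old_params[name]) ** 2).sum()
    return strength * penalty

# Usage: after training on task A, snapshot the parameters and an importance
# estimate (e.g., squared gradients); add the penalty to task B's loss.
model = nn.Linear(10, 2)
old_params = {n: p.detach().clone() for n, p in model.named_parameters()}
importance = {n: torch.ones_like(p) for n, p in model.named_parameters()}  # placeholder weights

x, y = torch.randn(32, 10), torch.randint(0, 2, (32,))
task_b_loss = nn.functional.cross_entropy(model(x), y)
total_loss = task_b_loss + quadratic_forgetting_penalty(model, old_params, importance, strength=10.0)
total_loss.backward()
```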

Bio

Mehrdad Farajtabar is a research scientist at Google DeepMind working on machine learning and its applications. His recent research interests are continual learning of neural networks, learning under evolving data distributions, and reinforcement learning. Before joining DeepMind, he received his PhD in computational science and engineering from Georgia Tech in 2018; he also holds M.Sc. and B.Sc. degrees in Artificial Intelligence and Software Engineering from Sharif University of Technology.