Continual Learning: A Deep Dive Into Elastic Weight Consolidation Loss

One of the most significant challenges in training artificial neural networks is catastrophic forgetting. This problem arises when a neural network trained on one task (Task A) subsequently learns a new task (Task B) and, in the process, forgets how to perform the original task. In this article, we will explore a method to address this issue: Elastic Weight Consolidation (EWC). EWC offers a promising approach to mitigating catastrophic forgetting, enabling neural networks to retain knowledge of previously learned tasks while acquiring new skills.
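As a preview of the loss we will dig into, the EWC objective from Kirkpatrick et al. (2017) adds a quadratic penalty to the new task's loss, anchoring each parameter to its task-A optimum in proportion to how important that parameter was for task A:

$$\mathcal{L}(\theta) = \mathcal{L}_B(\theta) + \sum_i \frac{\lambda}{2} F_i \left(\theta_i - \theta^*_{A,i}\right)^2$$

Here $\mathcal{L}_B$ is the loss on task B, $F_i$ is the $i$-th diagonal element of the Fisher information matrix, $\theta^*_{A,i}$ is the value of parameter $i$ at the end of training on task A, and $\lambda$ balances learning the new task against retaining the old one.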

All figures in this article are by the author unless otherwise specified.

It has been shown that there exist many configurations of optimal parameters with a desirably low error on a given task (the gray and yellow regions for tasks A and B, respectively, in the figure above). Assuming we have found one such configuration, $\theta^*_A$, for task A, there are three different scenarios when we continue training the model from that configuration on a new task B (a minimal code sketch follows the list):

- No penalty: gradient descent on task B alone moves the parameters into the low-error region for B but out of the low-error region for A, so task A is catastrophically forgotten.
- L2 penalty: pulling every parameter back toward $\theta^*_A$ with equal strength is too restrictive; the model stays close to the task-A solution and fails to learn task B properly.
- EWC penalty: constraining only the parameters that matter most for task A lets the model move into the overlap of the two regions, achieving low error on both tasks.
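To make the EWC penalty from the third scenario concrete, here is a minimal PyTorch sketch. The names `compute_fisher_diagonal`, `ewc_penalty`, `star_params`, and the default `lam` value are illustrative choices, not the paper's code; and the Fisher diagonal is approximated here by batch-averaged squared gradients of the task-A loss, whereas the paper derives it from the gradients of the model's own log-likelihood.

```python
import torch


def compute_fisher_diagonal(model, data_loader, loss_fn, device="cpu"):
    # Approximate the diagonal of the Fisher information matrix by
    # averaging squared gradients of the task-A loss over batches.
    # (Assumption: the paper uses squared gradients of the model's
    # log-likelihood; this batch-loss version is a common shortcut.)
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    num_batches = 0
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        model.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
        num_batches += 1
    return {n: f / num_batches for n, f in fisher.items()}


def ewc_penalty(model, fisher, star_params, lam=1000.0):
    # (lambda / 2) * sum_i F_i * (theta_i - theta*_A,i)^2
    # `lam` is a hyperparameter; 1000.0 is an arbitrary placeholder.
    penalty = 0.0
    for n, p in model.named_parameters():
        penalty = penalty + (fisher[n] * (p - star_params[n]) ** 2).sum()
    return 0.5 * lam * penalty


# Usage sketch: after training on task A,
#   star_params = {n: p.detach().clone() for n, p in model.named_parameters()}
#   fisher = compute_fisher_diagonal(model, task_a_loader, loss_fn)
# then, while training on task B, add the penalty to every step's loss:
#   loss = loss_fn(model(x), y) + ewc_penalty(model, fisher, star_params)
```

Note that $\theta^*_A$ and the Fisher estimate are computed once, at the end of task-A training, and stay frozen while task B is learned; only the penalty term ties the two tasks together.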
