Catastrophic Forgetting in Continual Concept Bottleneck Models
Sep 5, 2022
Emanuele Marconato

Gianpaolo Bontempo
Stefano Teso
Elisa Ficarra
Simone Calderara
Andrea Passerini
Abstract
Almost all Deep Learning models are dramatically affected by Catastrophic Forgetting when learning over continual streams of data. To mitigate this problem, several Continual Learning strategies have been proposed, yet the extent of the forgetting they incur remains unclear. In this paper, we analyze Concept Bottleneck (CB) models in the Continual Learning setting and investigate the effect of high-level feature supervision on Catastrophic Forgetting at the representation layer. Building on this analysis, we introduce two metrics to evaluate the loss of information on the learned concepts as new experiences are encountered. We also show that the resulting saliency maps remain more stable under attribute supervision.
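For readers unfamiliar with the architecture, here is a minimal sketch of a Concept Bottleneck model with joint concept supervision, assuming PyTorch; all names, dimensions, and the loss weighting are hypothetical illustrations, not the paper's exact setup:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConceptBottleneckModel(nn.Module):
    """Two-stage model: inputs -> concept logits -> task logits.

    The task head only sees the predicted concepts, so the concept
    layer acts as an interpretable bottleneck (hypothetical sizes).
    """

    def __init__(self, in_dim=784, n_concepts=10, n_classes=5):
        super().__init__()
        # Backbone maps raw inputs to concept logits (the bottleneck).
        self.concept_net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, n_concepts)
        )
        # Task head predicts the label from concepts alone.
        self.task_net = nn.Linear(n_concepts, n_classes)

    def forward(self, x):
        c_logits = self.concept_net(x)                 # concept predictions
        y_logits = self.task_net(torch.sigmoid(c_logits))
        return c_logits, y_logits

def joint_loss(c_logits, y_logits, c_true, y_true, lam=1.0):
    """Task loss plus a supervised concept term (float labels in [0, 1]).

    The concept term is the kind of high-level attribute supervision
    whose effect on forgetting the paper studies.
    """
    task = F.cross_entropy(y_logits, y_true)
    concept = F.binary_cross_entropy_with_logits(c_logits, c_true)
    return task + lam * concept
```

In a continual stream, this model would be trained on one experience at a time with the joint loss above; the paper's metrics then measure how much information about previously learned concepts the bottleneck retains.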
Publication
In Image Analysis and Processing 2023