Continual Learning Objective for Analyzing Complex Knowledge Representations

Asad Mansoor Khan, Taimur Hassan, Muhammad Usman Akram, Norah Saleh Alghamdi, Naoufel Werghi

Research output: Contribution to journal › Article › peer-review

16 Scopus citations


Human beings tend to learn incrementally from a rapidly changing environment without compromising or forgetting already learned representations. Although deep learning has the potential to mimic such human behavior to some extent, it suffers from catastrophic forgetting, in which performance on already learned tasks drastically decreases while learning new knowledge. Many researchers have proposed promising solutions to eliminate such catastrophic forgetting during the knowledge distillation process. However, to the best of our knowledge, no literature to date exploits the complex relationships between these solutions and utilizes them for effective learning that spans multiple datasets and even multiple domains. In this paper, we propose a continual learning objective that encompasses a mutual distillation loss to understand such complex relationships and allows deep learning models to effectively retain prior knowledge while adapting to new classes, new datasets, and even new applications. The proposed objective was rigorously tested on nine publicly available, multi-vendor, and multimodal datasets spanning three applications, and it achieved a top-1 accuracy of 0.9863 and an F1-score of 0.9930.
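The abstract does not spell out the form of the mutual distillation loss, but the idea can be illustrated with a minimal NumPy sketch. The sketch below assumes the standard temperature-softened KL-divergence distillation term (Hinton-style), applied symmetrically so that two models (or an old and a new snapshot of one model) distil knowledge from each other; the function names and the 0.5-weighted symmetric combination are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 as in standard knowledge distillation.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((p * (np.log(p) - np.log(q))).sum(axis=-1).mean() * T * T)

def mutual_distillation_loss(logits_a, logits_b, T=2.0):
    # Hypothetical symmetric ("mutual") variant: each network acts as
    # teacher for the other, so prior and new knowledge constrain each other.
    return 0.5 * (distillation_loss(logits_a, logits_b, T)
                  + distillation_loss(logits_b, logits_a, T))
```

When the two sets of logits agree, the loss is zero; it grows as their softened predictions diverge, which is the property a continual-learning objective exploits to penalize drift away from previously learned representations.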

Original language: British English
Article number: 1667
Journal: Sensors (Switzerland)
Issue number: 4
State: Published - 1 Feb 2022


  • Catastrophic forgetting
  • Complex knowledge representations
  • Continual learning
  • Multimodal datasets


