Continual Learning with Differential Privacy

Pradnya Desai, Phung Lai, Nhat Hai Phan, My T. Thai

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



In this paper, we focus on preserving differential privacy (DP) in continual learning (CL), in which ML models are trained to learn a sequence of new tasks while retaining knowledge of previous ones. We first introduce a notion of continual adjacent databases to bound the sensitivity of any data record participating in the training process of CL. Based upon that, we develop a new DP-preserving algorithm for CL with a data sampling strategy to quantify the privacy risk of training data in the well-known Averaged Gradient Episodic Memory (A-GEM) approach by applying a moments accountant. Our algorithm provides formal privacy guarantees for data records across tasks in CL. Preliminary theoretical analysis and evaluations show that our mechanism tightens the privacy loss while maintaining promising model utility.
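The abstract's general recipe combines DP-SGD-style per-example gradient clipping and Gaussian noise (so the moments accountant can track privacy loss) with A-GEM's projection of the update onto a direction that does not increase loss on the episodic memory of previous tasks. A minimal illustrative sketch of one such update step follows; the function name, parameters, and noise calibration here are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def dp_agem_step(per_example_grads, ref_grad, clip_norm=1.0,
                 noise_mult=1.1, rng=None):
    """One DP A-GEM-style update direction (illustrative sketch only).

    per_example_grads: (batch, dim) gradients of the current task's loss.
    ref_grad: (dim,) gradient on the episodic memory of previous tasks.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Clip each per-example gradient to bound the sensitivity of any
    # single record (DP-SGD-style clipping).
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    # Sum, add Gaussian noise calibrated to the clipping bound, average.
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        0.0, noise_mult * clip_norm, size=clipped.shape[1])
    g = noisy_sum / len(per_example_grads)
    # A-GEM projection: if the private gradient conflicts with the memory
    # gradient, project it onto the half-space where the loss on previous
    # tasks does not increase.
    dot = g @ ref_grad
    if dot < 0:
        g = g - (dot / (ref_grad @ ref_grad)) * ref_grad
    return g
```

In a full training loop, the privacy cost of repeatedly sampling batches this way would be accumulated across tasks with a moments accountant, as the paper describes.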

Original language: English (US)
Title of host publication: Neural Information Processing - 28th International Conference, ICONIP 2021, Proceedings
Editors: Teddy Mantoro, Minho Lee, Media Anugerah Ayu, Kok Wai Wong, Achmad Nizar Hidayanto
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 10
ISBN (Print): 9783030923099
State: Published - 2021
Event: 28th International Conference on Neural Information Processing, ICONIP 2021 - Virtual, Online
Duration: Dec 8, 2021 - Dec 12, 2021

Publication series

Name: Communications in Computer and Information Science
Volume: 1517 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937


Conference: 28th International Conference on Neural Information Processing, ICONIP 2021
City: Virtual, Online

All Science Journal Classification (ASJC) codes

  • General Computer Science
  • General Mathematics


Keywords

  • Continual learning
  • Deep learning
  • Differential privacy


