Abstract
In this paper, we focus on preserving differential privacy (DP) in continual learning (CL), in which ML models are trained on a sequence of new tasks while retaining knowledge of previous ones. We first introduce a notion of continual adjacent databases to bound the sensitivity of any data record participating in the training process of CL. Building on this notion, we develop a new DP-preserving algorithm for CL with a data sampling strategy that quantifies the privacy risk of training data in the well-known Averaged Gradient Episodic Memory (A-GEM) approach by applying a moments accountant. Our algorithm provides formal privacy guarantees for data records across tasks in CL. Preliminary theoretical analysis and evaluations show that our mechanism tightens the privacy loss while maintaining promising model utility.
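The abstract's approach bounds each record's sensitivity and accounts for privacy via a moments accountant, in the spirit of DP-SGD. As a rough illustration only (not the paper's algorithm), the per-record sensitivity bound is typically enforced by clipping per-sample gradients and adding calibrated Gaussian noise; a minimal sketch, where `clip_norm` and `noise_multiplier` are hypothetical illustrative parameters:

```python
import numpy as np

def dp_noisy_gradient(per_sample_grads, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip each per-sample gradient to clip_norm, average, add Gaussian noise.

    A generic DP-SGD-style step, shown only to illustrate sensitivity bounding;
    it is not the algorithm proposed in this paper.
    """
    rng = rng or np.random.default_rng()
    clipped = []
    for g in per_sample_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds the clipping bound,
        # so each record's influence (sensitivity) is at most clip_norm.
        scale = min(1.0, clip_norm / (norm + 1e-12))
        clipped.append(g * scale)
    mean_grad = np.mean(clipped, axis=0)
    # Noise std is calibrated to the clipping bound (the per-record sensitivity).
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(clipped),
                       size=mean_grad.shape)
    return mean_grad + noise
```

The moments accountant then composes the privacy loss of many such noisy steps to obtain a tighter overall (epsilon, delta) bound than basic composition.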
Original language | English (US) |
---|---|
Title of host publication | Neural Information Processing - 28th International Conference, ICONIP 2021, Proceedings |
Editors | Teddy Mantoro, Minho Lee, Media Anugerah Ayu, Kok Wai Wong, Achmad Nizar Hidayanto |
Publisher | Springer Science and Business Media Deutschland GmbH |
Pages | 334-343 |
Number of pages | 10 |
ISBN (Print) | 9783030923099 |
State | Published - 2021 |
Event | 28th International Conference on Neural Information Processing, ICONIP 2021 - Virtual, Online |
Duration | Dec 8 2021 → Dec 12 2021 |
Publication series
Name | Communications in Computer and Information Science |
---|---|
Volume | 1517 CCIS |
ISSN (Print) | 1865-0929 |
ISSN (Electronic) | 1865-0937 |
Conference
Conference | 28th International Conference on Neural Information Processing, ICONIP 2021 |
---|---|
City | Virtual, Online |
Period | 12/8/21 → 12/12/21 |
All Science Journal Classification (ASJC) codes
- General Computer Science
- General Mathematics
Keywords
- Continual learning
- Deep learning
- Differential privacy