Pace control via adaptive dropout for federated training: A work-in-progress report

Feiyang Wang, Xiaowei Shang, Jianchen Shan, Xiaoning Ding

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper proposes a neuron dropout mechanism to control the training pace of mobile devices in federated deep learning. The aim is to accelerate local training on slow mobile devices with minimal impact on training quality, so that slow devices can catch up with fast devices in each training round and the overall training speed increases. The basic idea is to skip the computation of neurons with low activation values (i.e., neuron dropout) and to dynamically adjust dropout rates based on the training progress on each mobile device. The paper introduces two techniques for selecting neurons, one based on locality-sensitive hashing (LSH) and one based on a max heap, along with a method for dynamically adjusting dropout rates. It also discusses a few other approaches that can be used to control training paces.
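The abstract's idea can be illustrated with a minimal sketch: keep only the highest-activation neurons (here via a max-heap selection, one of the two techniques the paper names) and raise or lower the dropout rate depending on whether a device is falling behind the round deadline. All function names, the update rule, and its parameters (`step`, `r_min`, `r_max`) are hypothetical illustrations, not the paper's actual algorithm.

```python
import heapq
import numpy as np

def heap_select_active(activations, dropout_rate):
    """Keep only the top-(1 - dropout_rate) fraction of neurons by
    activation value and zero the rest (max-heap based selection;
    the paper's exact criterion may differ)."""
    n = activations.size
    keep = max(1, int(round(n * (1.0 - dropout_rate))))
    # heapq.nlargest uses a heap internally to find the indices of
    # the `keep` largest activations.
    top_idx = heapq.nlargest(keep, range(n), key=lambda i: activations[i])
    mask = np.zeros(n, dtype=bool)
    mask[list(top_idx)] = True
    return np.where(mask, activations, 0.0)

def adjust_rate(rate, local_time, round_deadline,
                step=0.05, r_min=0.0, r_max=0.9):
    """Hypothetical update rule: raise the dropout rate when local
    training overruns the round deadline, lower it when there is slack."""
    if local_time > round_deadline:
        return min(r_max, rate + step)
    return max(r_min, rate - step)
```

For example, a slow device whose last local round took longer than the deadline would call `adjust_rate` to increase its dropout rate, so its next round computes fewer neurons and finishes closer in time to the fast devices.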

Original language: English (US)
Title of host publication: Proceedings - 2020 IEEE Cloud Summit, Cloud Summit 2020
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 176-179
Number of pages: 4
ISBN (Electronic): 9781728182667
DOIs
State: Published - Oct 2020
Event: 2020 IEEE Cloud Summit, Cloud Summit 2020 - Virtual, Harrisburg, United States
Duration: Oct 21, 2020 - Oct 22, 2020

Publication series

Name: Proceedings - 2020 IEEE Cloud Summit, Cloud Summit 2020

Conference

Conference: 2020 IEEE Cloud Summit, Cloud Summit 2020
Country: United States
City: Virtual, Harrisburg
Period: 10/21/20 - 10/22/20

All Science Journal Classification (ASJC) codes

  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems and Management
  • Control and Optimization

