TY - GEN
T1 - FedMTL
T2 - 27th European Conference on Artificial Intelligence, ECAI 2024
AU - Sen, Pritam
AU - Borcea, Cristian
N1 - Publisher Copyright:
© 2024 The Authors.
PY - 2024/10/16
Y1 - 2024/10/16
N2 - Multi-task learning (MTL) enables simultaneous learning of related tasks, enhancing the generalization performance of each task and facilitating faster training and inference on resource-constrained devices. Federated Learning (FL) can further improve performance by enabling collaboration among devices, effectively leveraging distributed data while ensuring that the raw data remains on the respective devices. However, conventional FL is inadequate for handling MTL models trained on different sets of tasks. This paper proposes FedMTL, a new FL aggregation technique that handles task heterogeneity across users. FedMTL generates personalized MTL models based on task similarities, which are determined by analyzing the parameters of the task-specific layers of the trained models. To prevent privacy leakage through these model parameters and to protect the privacy of the task types, FedMTL employs low-overhead algorithms that are adaptable to existing techniques for secure aggregation. Extensive experiments on three datasets demonstrate that FedMTL performs better than state-of-the-art approaches. Additionally, we implement the FedMTL aggregation algorithm using secure multi-party computation, showing that it can achieve the same accuracy as the plain-text version while preserving privacy.
AB - Multi-task learning (MTL) enables simultaneous learning of related tasks, enhancing the generalization performance of each task and facilitating faster training and inference on resource-constrained devices. Federated Learning (FL) can further improve performance by enabling collaboration among devices, effectively leveraging distributed data while ensuring that the raw data remains on the respective devices. However, conventional FL is inadequate for handling MTL models trained on different sets of tasks. This paper proposes FedMTL, a new FL aggregation technique that handles task heterogeneity across users. FedMTL generates personalized MTL models based on task similarities, which are determined by analyzing the parameters of the task-specific layers of the trained models. To prevent privacy leakage through these model parameters and to protect the privacy of the task types, FedMTL employs low-overhead algorithms that are adaptable to existing techniques for secure aggregation. Extensive experiments on three datasets demonstrate that FedMTL performs better than state-of-the-art approaches. Additionally, we implement the FedMTL aggregation algorithm using secure multi-party computation, showing that it can achieve the same accuracy as the plain-text version while preserving privacy.
UR - http://www.scopus.com/inward/record.url?scp=85213391952&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85213391952&partnerID=8YFLogxK
U2 - 10.3233/FAIA240715
DO - 10.3233/FAIA240715
M3 - Conference contribution
AN - SCOPUS:85213391952
T3 - Frontiers in Artificial Intelligence and Applications
SP - 1993
EP - 2002
BT - ECAI 2024 - 27th European Conference on Artificial Intelligence, Including 13th Conference on Prestigious Applications of Intelligent Systems, PAIS 2024, Proceedings
A2 - Endriss, Ulle
A2 - Melo, Francisco S.
A2 - Bach, Kerstin
A2 - Bugarin-Diz, Alberto
A2 - Alonso-Moral, Jose M.
A2 - Barro, Senen
A2 - Heintz, Fredrik
PB - IOS Press BV
Y2 - 19 October 2024 through 24 October 2024
ER -