Private and Secure Distributed Matrix Multiplication with Flexible Communication Load

Malihe Aliasgari, Osvaldo Simeone, Jorg Kliewer

Research output: Contribution to journal › Article › peer-review

66 Scopus citations

Abstract

Large matrix multiplications are central to large-scale machine learning applications. These operations are often carried out on a distributed computing platform with a master server and multiple workers in the cloud operating in parallel. For such distributed platforms, it has been recently shown that coding over the input data matrices can reduce the computational delay, yielding a trade-off between recovery threshold, i.e., the number of workers required to recover the matrix product, and communication load, i.e., the total amount of data to be downloaded from the workers. In this paper, in addition to exact recovery requirements, we impose security and privacy constraints on the data matrices, and study the recovery threshold as a function of the communication load. We first assume that both matrices contain private information and that workers can collude to eavesdrop on the content of these data matrices. For this problem, we introduce a novel class of secure codes, referred to as secure generalized PolyDot (SGPD) codes, that generalize state-of-the-art non-secure codes for matrix multiplication. SGPD codes allow a flexible trade-off between recovery threshold and communication load for a fixed maximum number of colluding workers while providing perfect secrecy for the two data matrices. We then study a connection between secure matrix multiplication and private information retrieval. We specifically assume that one of the data matrices is taken from a public set known to all the workers. In this setup, the identity of the matrix of interest should be kept private from the workers. For this model, we present a variant of generalized PolyDot codes that can guarantee both secrecy of one matrix and privacy for the identity of the other matrix for the case of no colluding workers.
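The SGPD construction itself is given in the paper; as a rough, self-contained illustration of the principle it builds on (polynomial encoding of the inputs plus uniformly random masking, decoded by interpolation at the master), the Python sketch below performs secure distributed matrix multiplication that tolerates a single non-colluding worker. It omits the block partitioning that gives SGPD codes their recovery-threshold/communication-load trade-off; the field size p, the evaluation points, and all function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative prime field size (assumption; the paper works over a general finite field).
p = 1_000_003

def rand_mat(rows, cols, rng):
    return rng.integers(0, p, size=(rows, cols), dtype=np.int64)

def encode(M, R, x):
    # Shamir-style share of M with one random mask R: f(x) = M + R*x (mod p).
    # A single worker observing f(x_i) alone learns nothing about M.
    return (M + R * x) % p

def lagrange_at_zero(xs, ys):
    # Interpolate the matrix-valued polynomial through (xs[i], ys[i]) and evaluate it at 0 (mod p).
    total = np.zeros_like(ys[0])
    for i, xi in enumerate(xs):
        num, den = 1, 1
        for j, xj in enumerate(xs):
            if j != i:
                num = (num * (-xj)) % p
                den = (den * (xi - xj)) % p
        coeff = (num * pow(den, -1, p)) % p
        total = (total + ys[i] * coeff) % p
    return total

rng = np.random.default_rng(0)
A, B = rand_mat(4, 5, rng), rand_mat(5, 3, rng)   # the two private input matrices
R, S = rand_mat(4, 5, rng), rand_mat(5, 3, rng)   # uniformly random masks (secret-sharing randomness)

# Each of the three workers receives only its masked shares and multiplies them.
xs = [1, 2, 3]                                    # distinct nonzero evaluation points
ys = [(encode(A, R, x) @ encode(B, S, x)) % p for x in xs]

# f(x) * g(x) has degree 2 and equals A @ B at x = 0, so any three worker results
# (the "recovery threshold" of this toy code) let the master decode the product.
assert np.array_equal(lagrange_at_zero(xs, ys), (A @ B) % p)
```

In the paper's construction, the inputs are additionally partitioned into blocks and padded with structured random matrices, which is what allows the communication load to be traded flexibly against the recovery threshold for any fixed number of colluding workers; the sketch above only captures the masking-plus-interpolation mechanism.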

Original language: English (US)
Article number: 8985291
Pages (from-to): 2722-2734
Number of pages: 13
Journal: IEEE Transactions on Information Forensics and Security
Volume: 15
State: Published - 2020

All Science Journal Classification (ASJC) codes

  • Safety, Risk, Reliability and Quality
  • Computer Networks and Communications

Keywords

  • Coded distributed computation
  • distributed learning
  • information theoretic security
  • private information retrieval
  • secret sharing

