Improved Large-Scale Graph Learning through Ridge Spectral Sparsification

  • Daniele Calandriello
  • Ioannis Koutis
  • Alessandro Lazaric
  • Michal Valko

Research output: Contribution to journal › Conference article › peer-review

19 Scopus citations

Abstract

The representation and learning benefits of methods based on graph Laplacians, such as Laplacian smoothing or the harmonic function solution for semi-supervised learning (SSL), are empirically and theoretically well supported. Nonetheless, the exact versions of these methods scale poorly with the number of nodes n of the graph. In this paper, we combine a spectral sparsification routine with Laplacian learning. Given a graph G as input, our algorithm computes a sparsifier in a distributed way in O(n log³(n)) time, O(m log³(n)) work and O(n log(n)) memory, using only log(n) rounds of communication. Furthermore, motivated by the regularization often employed in learning algorithms, we show that constructing sparsifiers that preserve the spectrum of the Laplacian only up to the regularization level may drastically reduce the size of the final graph. By constructing a spectrally-similar graph, we are able to bound the error induced by the sparsification for a variety of downstream tasks (e.g., SSL). We empirically validate the theoretical guarantees on the Amazon co-purchase graph and compare to the state-of-the-art heuristics.
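The core idea of spectral sparsification is to sample edges with probability proportional to their (weighted) effective resistance and reweight the survivors so the Laplacian spectrum is approximately preserved. The sketch below is an illustrative, exact-resistance version in NumPy (Spielman–Srivastava style sampling on a small dense graph); it is not the paper's distributed O(n log³(n)) routine, which relies on fast approximate resistances, and the sample budget `q` is an arbitrary choice for the demo.

```python
import itertools
import numpy as np


def laplacian(n, edges, weights):
    """Assemble the weighted graph Laplacian L = D - W."""
    L = np.zeros((n, n))
    for (i, j), w in zip(edges, weights):
        L[i, i] += w
        L[j, j] += w
        L[i, j] -= w
        L[j, i] -= w
    return L


rng = np.random.default_rng(0)
n = 60
# Dense input: the complete graph on n nodes, unit weights.
edges = list(itertools.combinations(range(n), 2))
w = np.ones(len(edges))
L = laplacian(n, edges, w)

# Effective resistance of edge (i, j) from the pseudoinverse:
# r_ij = L^+_ii + L^+_jj - 2 L^+_ij  (exact; the paper approximates this).
Lp = np.linalg.pinv(L)
r = np.array([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in edges])

# Sample q edges with probability proportional to w * r, then reweight
# by 1 / (q * p) so the sparsifier's Laplacian is unbiased for L.
p = (w * r) / np.sum(w * r)
q = int(2 * n * np.log(n))  # illustrative budget, roughly O(n log n) edges
counts = rng.multinomial(q, p)
keep = counts > 0
w_sparse = (counts * w / (q * p))[keep]
H = laplacian(n, [e for e, k in zip(edges, keep) if k], w_sparse)

print(f"kept {keep.sum()} of {len(edges)} edges")
```

The sparsifier `H` can then stand in for `L` in downstream Laplacian learning (e.g., solving the harmonic SSL system), with the spectral similarity controlling the induced error.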

Original language: English (US)
Pages (from-to): 688-697
Number of pages: 10
Journal: Proceedings of Machine Learning Research
Volume: 80
State: Published - 2018
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: Jul 10, 2018 - Jul 15, 2018

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
