Perturbing Eigenvalues with Residual Learning in Graph Convolutional Neural Networks

Shibo Yao, Dantong Yu, Xiangmin Jiao

Research output: Contribution to journal › Conference article › peer-review

2 Scopus citations


Network-structured data is ubiquitous in natural and social science applications. The Graph Convolutional Neural Network (GCN) has attracted significant attention recently due to its success in representing, modeling, and predicting large-scale network data. Various graph convolutional filters have been proposed to process graph signals and boost the performance of graph-based semi-supervised learning. This paper introduces a novel spectral learning technique called EigLearn, which uses residual learning to perturb the eigenvalues of the graph filter matrix and thereby optimize its filtering capability. EigLearn is relatively easy to implement, yet thorough experimental studies reveal that it is more effective and efficient than prior works addressing the same problem, such as LanczosNet and FisherGCN. EigLearn perturbs only a small number of eigenvalues and does not require a complete eigendecomposition. Our investigation shows that EigLearn reaches its maximal performance improvement by perturbing about 30 to 40 eigenvalues, and that the EigLearn-based GCN has efficiency comparable to the standard GCN. Furthermore, EigLearn admits a clear interpretation in the spectral domain of the graph filter and shows aggregation effects in performance improvement when coupled with different graph filters. Hence, we anticipate that EigLearn may serve as a useful neural unit in various graph-involved neural network architectures.
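The abstract's core mechanism, adding a learned residual to a few leading eigenvalues of the graph filter using only a partial eigendecomposition, can be sketched in NumPy/SciPy. This is an illustrative sketch, not the paper's implementation: the function names, the choice of the standard GCN normalized adjacency as the filter, and the low-rank update S + U diag(delta) U^T are assumptions for illustration, with `delta` standing in for the trainable residual parameters.

```python
import numpy as np
from scipy.sparse.linalg import eigsh

def normalized_adjacency(A):
    """Standard GCN filter: D^{-1/2} (A + I) D^{-1/2}, with self-loops added."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def eiglearn_filter(S, delta):
    """Perturb the len(delta) largest eigenvalues of the filter S by residuals delta.

    Only a partial eigendecomposition is computed (k eigenpairs, not all n).
    The perturbed filter is the low-rank update S + U diag(delta) U^T, so each
    selected eigenvalue lam_i becomes lam_i + delta_i while the remaining
    spectrum is untouched. In the paper's setting delta would be trainable
    parameters updated by backpropagation; here it is a fixed vector.
    """
    k = len(delta)
    lam, U = eigsh(S, k=k, which="LA")  # k algebraically largest eigenpairs
    return S + (U * delta) @ U.T        # U * delta scales column i by delta[i]
```

With `delta = 0` the filter is unchanged, so the layer starts as a plain GCN and learns only the deviation from it, which is the residual-learning view described in the abstract.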

Original language: English (US)
Pages (from-to): 1569-1584
Number of pages: 16
Journal: Proceedings of Machine Learning Research
State: Published - 2021
Event: 13th Asian Conference on Machine Learning, ACML 2021 - Virtual, Online
Duration: Nov 17 2021 - Nov 19 2021

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability


Keywords

  • Eigenvalue perturbation
  • Graph convolution
  • Semi-supervised learning
  • Spectral learning


