TY - GEN
T1 - Node Similarity Preserving Graph Convolutional Networks
AU - Jin, Wei
AU - Derr, Tyler
AU - Wang, Yiqi
AU - Ma, Yao
AU - Liu, Zitao
AU - Tang, Jiliang
N1 - Publisher Copyright:
© 2021 ACM.
PY - 2021/8/3
Y1 - 2021/8/3
N2 - Graph Neural Networks (GNNs) have achieved tremendous success in various real-world applications due to their strong ability in graph representation learning. GNNs explore the graph structure and node features by aggregating and transforming information within node neighborhoods. However, through theoretical and empirical analysis, we reveal that the aggregation process of GNNs tends to destroy node similarity in the original feature space. Since there are many scenarios where node similarity plays a crucial role, this motivates the proposed framework SimP-GCN, which can effectively and efficiently preserve node similarity while exploiting graph structure. Specifically, to balance information from graph structure and node features, we propose a feature similarity preserving aggregation that adaptively integrates graph structure and node features. Furthermore, we employ self-supervised learning to explicitly capture the complex feature similarity and dissimilarity relations between nodes. We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs. The results demonstrate that SimP-GCN outperforms representative baselines, and further probing reveals various advantages of the proposed framework. The implementation of SimP-GCN is available at https://github.com/ChandlerBang/SimP-GCN.
AB - Graph Neural Networks (GNNs) have achieved tremendous success in various real-world applications due to their strong ability in graph representation learning. GNNs explore the graph structure and node features by aggregating and transforming information within node neighborhoods. However, through theoretical and empirical analysis, we reveal that the aggregation process of GNNs tends to destroy node similarity in the original feature space. Since there are many scenarios where node similarity plays a crucial role, this motivates the proposed framework SimP-GCN, which can effectively and efficiently preserve node similarity while exploiting graph structure. Specifically, to balance information from graph structure and node features, we propose a feature similarity preserving aggregation that adaptively integrates graph structure and node features. Furthermore, we employ self-supervised learning to explicitly capture the complex feature similarity and dissimilarity relations between nodes. We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs. The results demonstrate that SimP-GCN outperforms representative baselines, and further probing reveals various advantages of the proposed framework. The implementation of SimP-GCN is available at https://github.com/ChandlerBang/SimP-GCN.
KW - graph neural networks
KW - node similarity preserving
KW - semi-supervised learning
UR - http://www.scopus.com/inward/record.url?scp=85103025230&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85103025230&partnerID=8YFLogxK
U2 - 10.1145/3437963.3441735
DO - 10.1145/3437963.3441735
M3 - Conference contribution
AN - SCOPUS:85103025230
T3 - WSDM 2021 - Proceedings of the 14th ACM International Conference on Web Search and Data Mining
SP - 148
EP - 156
BT - WSDM 2021 - Proceedings of the 14th ACM International Conference on Web Search and Data Mining
PB - Association for Computing Machinery, Inc
T2 - 14th ACM International Conference on Web Search and Data Mining, WSDM 2021
Y2 - 8 March 2021 through 12 March 2021
ER -