
CLC number: TP18

On-line Access: 2024-08-27

Received: 2023-10-17

Revision Accepted: 2024-05-08

Crosschecked: 2023-06-08


ORCID:

Kai LU

https://orcid.org/0000-0002-6378-7002

Xugang WU

https://orcid.org/0000-0003-4715-6785

Ruibo WANG

https://orcid.org/0000-0001-7952-3784


Frontiers of Information Technology & Electronic Engineering  2024 Vol.25 No.3 P.369-383

http://doi.org/10.1631/FITEE.2300194


Towards adaptive graph neural networks via solving prior-data conflicts


Author(s):  Xugang WU, Huijun WU, Ruibo WANG, Xu ZHOU, Kai LU

Affiliation(s):  College of Computer, National University of Defense Technology, Changsha 410073, China

Corresponding email(s):   wuxugang13@nudt.edu.cn, ruibo@nudt.edu.cn, lukainudt@163.com

Key Words:  Graph neural networks, Heterophily, Prior-data conflict


Xugang WU, Huijun WU, Ruibo WANG, Xu ZHOU, Kai LU. Towards adaptive graph neural networks via solving prior-data conflicts[J]. Frontiers of Information Technology & Electronic Engineering, 2024, 25(3): 369-383.

@article{title="Towards adaptive graph neural networks via solving prior-data conflicts",
author="Xugang WU, Huijun WU, Ruibo WANG, Xu ZHOU, Kai LU",
journal="Frontiers of Information Technology & Electronic Engineering",
volume="25",
number="3",
pages="369-383",
year="2024",
publisher="Zhejiang University Press & Springer",
doi="10.1631/FITEE.2300194"
}



Abstract: 
Graph neural networks (GNNs) have achieved remarkable performance in a variety of graph-related tasks. Recent evidence in the GNN community shows that such good performance can be attributed to the homophily prior, i.e., connected nodes tend to have similar features and labels. However, in heterophilic settings, where the features of connected nodes may vary significantly, GNN models exhibit notable performance deterioration. In this work, we formulate this problem as a prior-data conflict and propose a model called the mixture-prior graph neural network (MPGNN). First, to address the mismatch of the homophily prior on heterophilic graphs, we introduce a non-informative prior, which makes no assumptions about the relationship between connected nodes and instead learns this relationship from the data. Second, to avoid performance degradation on homophilic graphs, we implement a soft switch that balances the effects of the homophily prior and the non-informative prior via learnable weights. We evaluate the performance of MPGNN on both synthetic and real-world graphs. The results show that MPGNN effectively captures the relationship between connected nodes, while the soft switch helps select a suitable prior according to the graph's characteristics. With these two designs, MPGNN outperforms state-of-the-art methods on heterophilic graphs without sacrificing performance on homophilic graphs.
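The two-prior idea described in the abstract can be sketched as a single layer that blends a homophily-driven branch with an assumption-free branch. This is a minimal illustrative sketch, not the authors' implementation: the GCN-style normalized averaging stands in for the homophily prior, a plain feature transform stands in for the non-informative prior (the paper learns the node relationship from data), and a sigmoid-gated scalar plays the role of the learnable soft switch. All names (`softswitch_layer`, `W_hom`, `W_free`, `alpha_logit`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softswitch_layer(X, A, W_hom, W_free, alpha_logit):
    """One illustrative layer blending a homophily prior with a learned one."""
    # Homophily branch: symmetric-normalized neighbor averaging, GCN-style,
    # which encodes the assumption that connected nodes are similar.
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d = A_hat.sum(axis=1)
    A_norm = A_hat / np.sqrt(np.outer(d, d))    # D^{-1/2} (A + I) D^{-1/2}
    h_hom = A_norm @ X @ W_hom
    # Non-informative branch: no structural assumption baked in; a plain
    # linear transform of the node's own features stands in here.
    h_free = X @ W_free
    # Soft switch: a learnable weight in (0, 1) balancing the two priors.
    alpha = 1.0 / (1.0 + np.exp(-alpha_logit))  # sigmoid
    return np.tanh(alpha * h_hom + (1.0 - alpha) * h_free)

# Toy graph: 4 nodes, 3-dim input features, 2-dim output.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
H = softswitch_layer(X, A, rng.normal(size=(3, 2)),
                     rng.normal(size=(3, 2)), alpha_logit=0.0)
print(H.shape)
```

In a trained model, `alpha_logit` would be a parameter updated by gradient descent, so on homophilic graphs the switch can drift toward the homophily branch and on heterophilic graphs toward the learned branch.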





Journal of Zhejiang University-SCIENCE, 38 Zheda Road, Hangzhou 310027, China
Tel: +86-571-87952783; E-mail: cjzhang@zju.edu.cn
Copyright © 2000 - 2024 Journal of Zhejiang University-SCIENCE