

On-line Access: 2025-08-15

Received: 2025-03-14

Revision Accepted: 2025-06-03


Citation formats: GB/T7714, BibTeX, EndNote, RefMan (RIS)


Frontiers of Information Technology & Electronic Engineering, 2025 (in press)

https://doi.org/10.1631/FITEE.2500162


E-CGL: an efficient continual graph learner


Author(s):  Jianhao GUO, Zixuan NI, Yun ZHU, Siliang TANG

Affiliation(s):  DCD Lab, College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China

Corresponding email(s):   guojianhao@zju.edu.cn, zixuan2i@zju.edu.cn, zhuyun_dcd@zju.edu.cn, siliang@zju.edu.cn

Key Words:  Graph neural networks (GNN), Continual learning (CL), Dynamic graphs, Continual graph learning (CGL), Graph acceleration


Jianhao GUO, Zixuan NI, Yun ZHU, Siliang TANG. E-CGL: an efficient continual graph learner[J]. Frontiers of Information Technology & Electronic Engineering, 2025 (in press). https://doi.org/10.1631/FITEE.2500162

@article{Guo2025ECGL,
title="E-CGL: an efficient continual graph learner",
author="Jianhao GUO and Zixuan NI and Yun ZHU and Siliang TANG",
journal="Frontiers of Information Technology \& Electronic Engineering",
year="2025",
note="in press",
publisher="Zhejiang University Press \& Springer",
doi="10.1631/FITEE.2500162"
}

%0 Journal Article
%T E-CGL: an efficient continual graph learner
%A Jianhao GUO
%A Zixuan NI
%A Yun ZHU
%A Siliang TANG
%J Frontiers of Information Technology & Electronic Engineering
%@ 2095-9184
%D 2025
%I Zhejiang University Press & Springer
%R 10.1631/FITEE.2500162

TY  - JOUR
T1  - E-CGL: an efficient continual graph learner
A1  - Jianhao GUO
A1  - Zixuan NI
A1  - Yun ZHU
A1  - Siliang TANG
JO  - Frontiers of Information Technology & Electronic Engineering
SN  - 2095-9184
Y1  - 2025
PB  - Zhejiang University Press & Springer
DO  - 10.1631/FITEE.2500162
ER  -


Abstract: 
Continual learning (CL) has emerged as a crucial paradigm for learning from sequential data while retaining previously acquired knowledge. In continual graph learning (CGL), where graphs change continually based on streaming data, unique challenges arise: methods must be adaptive and efficient while also addressing catastrophic forgetting. The first challenge stems from the interdependencies between graph data, in which previous graphs influence the distributions of new data. The second challenge is handling large graphs efficiently. To address these challenges, we propose an efficient continual graph learner (E-CGL). We address the interdependence issue by demonstrating the effectiveness of replay strategies and introducing a combined sampling approach that considers both node importance and diversity. To improve efficiency, E-CGL leverages a simple yet effective multi-layer perceptron (MLP) model that shares weights with a graph neural network (GNN) during training, thereby accelerating computation by circumventing the expensive message-passing process. Our method achieves state-of-the-art results on four CGL datasets under two settings, while significantly lowering catastrophic forgetting to an average of -1.1%. Additionally, E-CGL accelerates training and inference by an average of 15.83× and 4.89×, respectively, across the four datasets. These results indicate that E-CGL not only effectively manages the correlations between different graph data during continual training but also enhances efficiency in large-scale CGL.
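To make the replay strategy concrete, the following Python/PyTorch sketch shows one way a replay buffer could combine node importance with diversity, in the spirit of the combined sampling approach described in the abstract. It is a minimal illustration under stated assumptions, not the authors' implementation: the function name sample_replay_nodes, the degree-based importance proxy, the greedy farthest-point diversity term, and the alpha trade-off are all hypothetical.

import torch

def sample_replay_nodes(features, degrees, budget, alpha=0.5):
    # features: (n, d) node embeddings; degrees: (n,) node degrees.
    # Greedily picks `budget` nodes, trading off an importance score
    # (here: normalized degree, an assumed proxy) against diversity
    # (distance to the nodes already selected).
    n = features.size(0)
    importance = degrees.float()
    importance = importance / (importance.max() + 1e-12)
    min_dist = torch.full((n,), float("inf"))
    selected = []
    for _ in range(budget):
        if selected:
            last = features[selected[-1]].unsqueeze(0)     # (1, d)
            dist = torch.cdist(features, last).squeeze(1)  # (n,)
            min_dist = torch.minimum(min_dist, dist)
            diversity = min_dist / (min_dist.max() + 1e-12)
        else:
            diversity = torch.ones(n)  # first pick is driven by importance alone
        score = alpha * importance + (1.0 - alpha) * diversity
        if selected:
            score[torch.tensor(selected)] = -float("inf")  # no re-picks
        selected.append(int(score.argmax()))
    return torch.tensor(selected)

# Toy usage: sample a 32-node replay buffer from 500 nodes.
feats = torch.randn(500, 16)
degs = torch.randint(1, 20, (500,))
buffer_idx = sample_replay_nodes(feats, degs, budget=32)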
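The efficiency gain rests on weight sharing between the GNN and an MLP. Below is a minimal sketch of that idea, assuming a GCN-style layer of the form relu(A_hat @ H @ W): with an adjacency supplied, the layer aggregates neighbor features before the linear transform, while without one the same weight matrix W is applied directly, so the model runs as a plain MLP. The class name SharedLayer and every detail beyond what the abstract states are assumptions.

import torch
import torch.nn as nn

class SharedLayer(nn.Module):
    # One layer whose weight matrix W is shared between a GCN-style
    # pass (with neighborhood aggregation) and a pure MLP pass.
    def __init__(self, d_in, d_out):
        super().__init__()
        self.lin = nn.Linear(d_in, d_out)  # the shared W (and bias)

    def forward(self, h, adj=None):
        if adj is not None:
            # GNN path: sparse message passing over the adjacency.
            h = torch.sparse.mm(adj, h)
        return torch.relu(self.lin(h))     # both paths reuse the same W

layer = SharedLayer(64, 32)
x = torch.randn(1000, 64)
edges = torch.randint(0, 1000, (2, 5000))  # unnormalized toy adjacency, for illustration
adj = torch.sparse_coo_tensor(edges, torch.ones(5000), (1000, 1000))
gnn_out = layer(x, adj)  # training-time path, with message passing
mlp_out = layer(x)       # inference-time path: no graph access needed

Skipping the sparse aggregation at inference is what would yield the reported speedups on large graphs, since the forward cost no longer scales with the number of edges.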


