
Record Details

Temporal Graph Transformer for Dynamic Network (indexed in CPCI-S and EI)   Cited by: 5

Document type: Journal article

English title: Temporal Graph Transformer for Dynamic Network

Authors: Wang, Zehong[1]; Li, Qi[1]; Yu, Donghua[1,2]; Han, Xiaolong[1]

Affiliations: [1] Shaoxing Univ, Dept Comp Sci & Engn, Shaoxing 312000, Peoples R China; [2] Soochow Univ, Sch Comp Sci & Technol, Suzhou 215000, Peoples R China

Year: 2022

Volume: 13530

Pages: 694

Source: ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT II

Indexed in: CPCI-S (Accession No. WOS:000866212300057), EI (Accession No. 20223912808361), Scopus (Accession No. 2-s2.0-85138745550), WOS

Funding: This work was supported in part by the Zhejiang Natural Science Foundation of China under Grant No. LY22F020003 and by the National Natural Science Foundation of China under Grants No. 62002226 and No. 62002227.

Language: English

Keywords: Temporal graph; Continuous-time dynamic graph; Graph neural network; Graph embedding

Abstract: Graph neural networks (GNNs) have received great attention in recent years due to their unique role in mining graph-based data. Although most work focuses on learning low-dimensional node representations in static graphs, the dynamic nature of real-world networks makes temporal graphs more practical and significant. The continuous-time dynamic graph (CTDG) is a general approach to express temporal networks at fine granularity. Owing to the high time cost of training and inference, existing CTDG-based algorithms capture information only from 1-hop neighbors, ignoring the messages from higher-order neighbors, which inevitably leads to model degradation. To overcome this challenge, we propose the Temporal Graph Transformer (TGT) to efficiently capture the evolving and semantic information of high-order neighborhoods in dynamic graphs. The proposed TGT consists of three modules, i.e., an update module, an aggregation module, and a propagation module. Different from previous works that aggregate messages layer by layer, the model captures messages from 1-hop and 2-hop neighbors in a single layer. In particular, (1) the update module learns from messages derived from interactions; (2) the aggregation module aggregates 1-hop temporal neighbors to compute node embeddings; (3) the propagation module re-updates the hidden states of temporal neighbors to introduce 2-hop information. Experimental results on three real-world networks demonstrate the superiority of TGT in both efficacy and efficiency.
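
To make the three-module pipeline described in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a single TGT-style layer with an update module, an aggregation module, and a propagation module. The GRU-based update, the attention-based aggregation, the form of the time encoder, and all tensor shapes are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of the update / aggregation / propagation structure
# named in the abstract. All design details below are assumptions.
import torch
import torch.nn as nn

class TimeEncoder(nn.Module):
    """Maps a time gap to a vector (assumed cosine-of-linear encoding)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(1, dim)

    def forward(self, dt):                      # dt: [..., 1]
        return torch.cos(self.lin(dt))

class TGTSketch(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.time_enc = TimeEncoder(dim)
        # (1) Update module: refresh a node's hidden state from an interaction message.
        self.update = nn.GRUCell(input_size=3 * dim, hidden_size=dim)
        # (2) Aggregation module: combine 1-hop temporal neighbors into a node embedding.
        self.aggregate = nn.MultiheadAttention(dim, num_heads=2, batch_first=True)
        # (3) Propagation module: re-update neighbor states so 2-hop information
        #     becomes visible without stacking a second layer.
        self.propagate = nn.Linear(2 * dim, dim)

    def forward(self, h_src, h_dst, h_neighbors, dt_event, dt_neighbors):
        # h_src, h_dst: [B, dim]; h_neighbors: [B, K, dim]
        # dt_event: [B, 1]; dt_neighbors: [B, K, 1]
        msg = torch.cat([h_dst, self.time_enc(dt_event), h_src], dim=-1)
        h_src_new = self.update(msg, h_src)               # update module

        keys = h_neighbors + self.time_enc(dt_neighbors)  # inject neighbor time gaps
        z, _ = self.aggregate(h_src_new.unsqueeze(1), keys, keys)
        z = z.squeeze(1)                                   # aggregation module -> embedding

        # Propagation module: push the fresh source state back into the
        # 1-hop neighbors' hidden states.
        h_neighbors_new = self.propagate(
            torch.cat([h_neighbors,
                       h_src_new.unsqueeze(1).expand_as(h_neighbors)], dim=-1)
        )
        return h_src_new, z, h_neighbors_new

# Toy usage with random tensors (batch of 4 events, 8 sampled neighbors, dim 16).
model = TGTSketch(dim=16)
B, K, D = 4, 8, 16
out = model(torch.randn(B, D), torch.randn(B, D), torch.randn(B, K, D),
            torch.rand(B, 1), torch.rand(B, K, 1))

In this sketch, the propagation step writes the updated source state back into its 1-hop neighbors' hidden states, which is one way a single layer could expose 2-hop context at the next event, in the spirit of the abstract's claim of avoiding layer-by-layer aggregation.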

