Details
Document Type: Journal Article
Title (English): Relational Context Modeling for Improved Knowledge Graph Completion
Authors: Lin, Guoqi[1]; Li, Qi[1]
Affiliations: [1] Shaoxing Univ, Shaoxing 312000, Zhejiang, Peoples R China
Year: 2025
Volume: 33
Issue: 6
Pages: 2037
Journal: ENGINEERING LETTERS
Indexed by: EI (Accession No. 20252418600409); ESCI (Accession No. WOS:001506721900028); Scopus (Accession No. 2-s2.0-105007727735); WOS
Language: English
Keywords: knowledge graphs; knowledge completion; triple classification; link prediction
Abstract: Knowledge graphs (KGs) structure knowledge but are typically incomplete. Link prediction, or knowledge graph completion (KGC), builds on existing KG facts to infer missing ones. Previous embedding models cannot capture the expressiveness of deeper, multi-layered models. They also assign a static embedding to each entity and relation, ignoring the fact that these behave differently in different graph contexts, which limits their performance. In this paper, we propose a method, called RCME, that merges the Receptance Weighted Key Value (RWKV) model with the TuckER model to overcome these limitations. RWKV models sequential information and allows dynamic embeddings, while TuckER provides robust relational decoding; the combined embedding yields more expressive representations. Our approach outperforms several state-of-the-art models on link prediction and triple classification on benchmark datasets.
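The abstract describes pairing a contextual sequence encoder (RWKV) with a TuckER relational decoder. As a minimal sketch only, not the authors' code, the PyTorch snippet below shows how a contextual subject representation (e.g. produced by an RWKV encoder over the subject's graph neighborhood) could feed a standard TuckER-style bilinear scorer; the class name TuckERDecoder, the dimensions d_e and d_r, and the interface are illustrative assumptions.

```python
import torch
import torch.nn as nn

class TuckERDecoder(nn.Module):
    """TuckER-style bilinear decoder: score(s, r, o) = W x1 e_s x2 w_r x3 e_o.

    Hypothetical sketch; not the RCME implementation from the paper.
    """
    def __init__(self, n_entities: int, n_relations: int, d_e: int = 200, d_r: int = 200):
        super().__init__()
        self.ent = nn.Embedding(n_entities, d_e)   # object-side entity embeddings
        self.rel = nn.Embedding(n_relations, d_r)  # relation embeddings
        # Core tensor W with shape (d_r, d_e, d_e)
        self.W = nn.Parameter(torch.randn(d_r, d_e, d_e) * 0.01)

    def forward(self, subj_vec: torch.Tensor, rel_idx: torch.Tensor) -> torch.Tensor:
        """Score a batch of (subject, relation) pairs against all candidate objects.

        subj_vec: (batch, d_e) contextual subject embedding, e.g. from an RWKV encoder
        rel_idx:  (batch,) relation indices
        returns:  (batch, n_entities) plausibility scores for every object
        """
        w_r = self.rel(rel_idx)                              # (batch, d_r)
        # Contract the core tensor with the relation vector -> (batch, d_e, d_e)
        W_r = torch.einsum('bd,dij->bij', w_r, self.W)
        # Transform the subject representation -> (batch, d_e)
        x = torch.einsum('bi,bij->bj', subj_vec, W_r)
        # Score against all object embeddings -> (batch, n_entities)
        return x @ self.ent.weight.t()
```

In a plain TuckER setup, subj_vec would simply be the static embedding of the subject entity; in a context-aware variant like the one the abstract describes, it would instead come from encoding the subject's neighborhood sequence, so the same entity can receive different representations in different graph contexts.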