Abstract
Extrapolation on Temporal Knowledge Graphs (TKGs) is a challenging problem that has attracted significant attention owing to its broad applicability across domains. Predicting future events from historical data requires integrating structural patterns in historical graphs with temporal dynamics, and this integration has been the focus of much recent research. However, existing methods face significant limitations. Many fail to differentiate the importance of historical knowledge, leading to suboptimal message passing; others cannot capture local and global temporal dependencies simultaneously, yielding incomplete temporal representations; and conventional embedding techniques often overlook dynamic positional information, which is crucial for robust forecasting. To address these issues, we introduce the History Graph Convolution Transformer (HGCT), a forecasting framework for TKGs that enhances graph embedding by integrating a time-aware self-attention mechanism with a convolution operation. The framework combines a Fact Graph Transformer, which organizes historical knowledge into a structured representation, with a Temporal Convolutional Transformer that applies a novel positional encoding strategy to distill temporal patterns from historical snapshots, thereby enriching the representation of time-series data. We further introduce Query-ConvTransE, a refined ConvTransE variant optimized for query-based information processing, which serves as the decoding component. Evaluation on six benchmark datasets shows consistent gains over the state of the art: a 4.33% improvement in Hits@k and a 3.84% improvement in Mean Reciprocal Rank.
Dao H.; Phan N.; Le T.; Nguyen N.-T.
https://doi.org/10.1016/j.knosys.2025.113358

