Rethinking temporal graph transformers for outlier detection
2024
Graph outlier detection identifies substructures in graphs that significantly deviate from normal patterns. Traditional graph outlier detection methods are mostly limited to static graphs, which overlooks the dynamic nature of real-world graphs and ignores temporal signals that provide critical information for detecting outliers. Recently, Transformers have revolutionized machine learning on time-series data. However, existing Transformers on temporal graphs face limitations due to their reliance on temporal subgraph extraction, restricted receptive fields, and suboptimal generalization beyond link prediction. To address these challenges, we propose TGFormer, a novel Temporal Graph Transformer for outlier detection. TGFormer leverages global attention to model both structural and temporal dependencies within temporal graphs. To improve scalability, TGFormer partitions large temporal graphs into spatiotemporal patches, which are processed by a hierarchical Transformer architecture comprising intra-patch, inter-patch, and temporal Transformers. We conduct experiments on two public datasets, comparing TGFormer against a set of baselines that includes graph neural networks, graph outlier detectors, and Transformer-based methods. Experimental results demonstrate the superiority of TGFormer, and our analysis and efficiency experiments further confirm that TGFormer is efficient.
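To make the hierarchical design concrete, below is a minimal PyTorch sketch of the three-level attention scheme the abstract describes: an intra-patch Transformer over nodes within each spatiotemporal patch, an inter-patch Transformer over patch summaries within a time window, and a temporal Transformer across windows. The patching layout, mean-pooling, dimensions, and the per-node scoring head are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn


class TGFormerSketch(nn.Module):
    """Hedged sketch of a hierarchical patch-based temporal graph Transformer.

    Input x has shape (T, P, N, D): T time windows, P patches per window,
    N nodes per patch, D feature dim. All specifics (pooling, score head)
    are assumptions made for illustration.
    """

    def __init__(self, dim=64, heads=4, layers=1):
        super().__init__()

        def encoder():
            layer = nn.TransformerEncoderLayer(
                d_model=dim, nhead=heads, batch_first=True)
            return nn.TransformerEncoder(layer, num_layers=layers)

        self.intra = encoder()     # attention over nodes within each patch
        self.inter = encoder()     # attention over patch tokens within a window
        self.temporal = encoder()  # attention over window tokens across time
        self.score = nn.Linear(dim, 1)  # per-node outlier score (assumed head)

    def forward(self, x):
        T, P, N, D = x.shape
        # 1) Intra-patch: contextualize nodes inside each spatiotemporal patch.
        h = self.intra(x.reshape(T * P, N, D))            # (T*P, N, D)
        patch_tok = h.mean(dim=1).reshape(T, P, D)        # pool nodes -> patch tokens
        # 2) Inter-patch: global attention across patches of the same window.
        g = self.inter(patch_tok)                         # (T, P, D)
        win_tok = g.mean(dim=1).unsqueeze(0)              # (1, T, D) window tokens
        # 3) Temporal: attention along the time axis over window tokens.
        t = self.temporal(win_tok).squeeze(0)             # (T, D)
        # Broadcast temporal context back to node states and score each node.
        h = h.reshape(T, P, N, D) + t[:, None, None, :]
        return self.score(h).squeeze(-1)                  # (T, P, N) outlier scores


# Toy usage: 3 windows, 4 patches per window, 8 nodes per patch, 64-dim features.
scores = TGFormerSketch()(torch.randn(3, 4, 8, 64))
print(scores.shape)  # torch.Size([3, 4, 8])
```

The key scalability point this sketch mirrors is that attention is never computed over the full graph at once: each level attends over a short sequence (nodes in a patch, patches in a window, or windows over time), keeping the cost of global context manageable on large temporal graphs.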