Submitted:
30 December 2025
Posted:
31 December 2025
Abstract
To address the practical challenges posed by diverse anomaly patterns, strongly coupled dependencies, and high labeling costs in large-scale complex infrastructures, this paper presents an unsupervised anomaly detection method that integrates graph neural networks with Transformer models. The approach learns normal system behavior and identifies deviations from it without relying on anomaly labels. Infrastructure components are abstracted as nodes in a dependency graph, with each node characterized by multi-source observability signals. A graph encoder aggregates neighborhood information to produce structure-enhanced node representations, and self-attention along the temporal dimension captures long-range dynamic dependencies, enabling joint modeling of structural relations and temporal evolution. A reconstruction-based training strategy constrains the model to learn normal patterns, and reconstruction error is used to derive anomaly scores for detection. To ensure reproducibility and ease of deployment, complete specifications of the data organization, training procedure, and key hyperparameter settings are provided. Comparative experiments on public benchmarks demonstrate overall advantages across multiple evaluation metrics and confirm the effectiveness of the proposed framework in capturing anomaly propagation and temporal drift in complex systems.
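The pipeline the abstract outlines can be illustrated with a minimal NumPy sketch: mean neighborhood aggregation stands in for the graph encoder, scaled dot-product self-attention runs along the time axis, and per-node mean squared reconstruction error serves as the anomaly score. All function names, the toy chain topology, and the injected anomaly are illustrative assumptions, not the paper's actual architecture or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_aggregate(X, A):
    """Mean-aggregate neighbor features with self-loops: (N, T, F) -> (N, T, F)."""
    A_hat = A + np.eye(A.shape[0])
    deg = A_hat.sum(axis=1)
    return np.einsum("ij,jtf->itf", A_hat, X) / deg[:, None, None]

def temporal_self_attention(H):
    """Scaled dot-product self-attention along the time axis, per node."""
    d = H.shape[-1]
    scores = np.einsum("ntf,nsf->nts", H, H) / np.sqrt(d)
    return np.einsum("nts,nsf->ntf", softmax(scores), H)

def anomaly_scores(X, A):
    """Per-node anomaly score: mean squared reconstruction error vs. the input."""
    recon = temporal_self_attention(graph_aggregate(X, A))
    return ((X - recon) ** 2).mean(axis=(1, 2))

# Toy example: 4 components on a dependency chain 0-1-2-3,
# 8 time steps, 3 observability metrics per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 8, 3)) * 0.1
X[2, 5:] += 5.0  # inject a sustained anomaly on node 2

scores = anomaly_scores(X, A)
print(scores.argmax())  # the anomalous node receives the highest score
```

Because neighborhood aggregation smooths node 2's spike toward its (normal) neighbors, the reconstruction deviates most where the anomaly was injected, so the score isolates the faulty component. A trained model would replace the fixed aggregation and attention with learned parameters fitted to normal data only.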