Abstract: To enhance anomaly recognition in traffic time series data, a hybrid model was constructed. First, multi-head attention, residual connections, and probabilistic sparse self-attention were combined into a global feature recognition (GFR) module, which strengthens global feature extraction while reducing computational complexity. Second, dilated convolution and multi-scale convolution were combined into a local feature recognition (LFR) module to further improve local feature extraction. Third, a FreeRunning training strategy was adopted to improve model robustness. Fourth, these modules and the training strategy were integrated with LSTM, with the output of the self-attention mechanism replacing the LSTM input gate, so as to strengthen long-sequence memory while reducing computational complexity. Finally, a multivariate Gaussian probability function was used to discriminate anomalies. The results show that adding each module to the LSTM baseline significantly improves the model's prediction and anomaly detection performance. Compared with the general hybrid model Transformer-Bi-LSTM, the proposed model achieves stronger prediction ability with lower computational complexity. The proposed model effectively recognizes both global and local anomalies in traffic time series data, providing a reference for improving the operational efficiency and safety of traffic systems.
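
The final step described in the abstract, discriminating anomalies with a multivariate Gaussian over the model's prediction errors, can be illustrated with a minimal sketch. The sketch below assumes the hybrid model produces point forecasts whose residuals on normal data are roughly Gaussian; the function names, data shapes, and the quantile threshold are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_error_distribution(errors):
    """Fit a multivariate Gaussian to prediction residuals computed on
    normal (non-anomalous) traffic data. errors: (n_samples, n_features)."""
    mu = errors.mean(axis=0)
    sigma = np.cov(errors, rowvar=False)
    return mu, sigma

def anomaly_scores(errors, mu, sigma):
    """Negative log-likelihood of each residual vector under the fitted
    Gaussian; higher scores indicate more anomalous time steps."""
    dist = multivariate_normal(mean=mu, cov=sigma, allow_singular=True)
    return -dist.logpdf(errors)

# Hypothetical usage: residuals = observed values minus model forecasts.
# train_errors = y_true_train - y_pred_train
# mu, sigma = fit_error_distribution(train_errors)
# scores = anomaly_scores(y_true_test - y_pred_test, mu, sigma)
# anomalies = scores > np.quantile(scores, 0.99)  # threshold is user-chosen
```

In this reading, the forecasting network (GFR, LFR, and the attention-gated LSTM) supplies the residuals, and the Gaussian density converts them into a single anomaly score per time step that can be thresholded for detection.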