CASAformer: Congestion-aware sparse attention transformer for traffic speed prediction

10 Apr 2025

The team led by Assistant Professor Yifan Zhang has proposed CASAformer, a novel congestion-aware sparse attention Transformer aimed at improving the accuracy of traffic speed prediction under congested conditions. To address the shortcomings of existing models in low-speed prediction, CASAformer introduces a sparse attention mechanism that focuses on key congestion nodes, together with an adaptive loss function that counters the imbalance between free-flow and congested data. Validation on public datasets shows that the model significantly outperforms existing state-of-the-art models, especially when predicting traffic speeds below 40 mph, providing a more reliable basis for intelligent traffic management and control decisions.

Source: Zhang, Y., Zhou, Q., Wang, J., Kouvelas, A., & Makridis, M. A. (2025). CASAformer: Congestion-aware sparse attention transformer for traffic speed prediction. Communications in Transportation Research, 5, Article 100174. Advance online publication. https://doi.org/10.1016/j.commtr.2025.100174
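To illustrate the data-imbalance idea in concrete terms, the sketch below shows one common way such a congestion-aware loss can be built: errors on low-speed (congested) samples are weighted more heavily than errors on free-flow samples. The function name, the 40 mph threshold, and the fixed weighting scheme are assumptions for illustration only, not the paper's exact adaptive formulation.

```python
import numpy as np

def congestion_weighted_mae(pred, target, threshold=40.0, congestion_weight=2.0):
    """Mean absolute error that up-weights congested (low-speed) samples.

    Illustrative sketch only: the paper's adaptive loss is more elaborate;
    here congestion is simply defined as target speed below `threshold` mph,
    and those samples receive a fixed extra weight `congestion_weight`.
    """
    pred = np.asarray(pred, dtype=float)
    target = np.asarray(target, dtype=float)
    # Samples below the speed threshold count as congested and get extra weight.
    weights = np.where(target < threshold, congestion_weight, 1.0)
    return float(np.mean(weights * np.abs(pred - target)))
```

With such a weighting, a 5 mph error at a congested sensor contributes twice as much to the loss as the same error in free flow, pushing the model to fit the rare low-speed regime rather than averaging it away.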