CASAformer: Congestion-aware sparse attention transformer for traffic speed prediction

10 Apr 2025

The team led by Assistant Professor Yifan Zhang has proposed an innovative model, CASAformer, a novel congestion-aware sparse attention Transformer aimed at improving the accuracy of traffic speed prediction under congestion. To address the weakness of existing models in low-speed prediction, CASAformer uses a sparse attention mechanism that focuses on key congested nodes and introduces an adaptive loss function to counter the imbalance between congestion and free-flow data. Validation on public datasets shows that the model significantly outperforms existing state-of-the-art models, especially when predicting traffic speeds below 40 mph, providing a more reliable basis for intelligent traffic management and control.

Source: Zhang, Y., Zhou, Q., Wang, J., Kouvelas, A., & Makridis, M. A. (2025). CASAformer: Congestion-aware sparse attention transformer for traffic speed prediction. Communications in Transportation Research, 5, Article 100174. Advance online publication. https://doi.org/10.1016/j.commtr.2025.100174
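To make the two ideas concrete, the sketch below shows one plausible reading of them: attention restricted to low-speed (congested) nodes, and a loss that upweights congested samples. This is a minimal illustration only; the function names, the 40 mph threshold taken from the article, and the weighting scheme are assumptions for exposition, not the architecture defined in the cited paper.

```python
import numpy as np

# Hypothetical sketch of the mechanisms described in the article; the actual
# CASAformer design is specified in Zhang et al. (2025). All names and
# parameters below are illustrative assumptions.

CONGESTION_MPH = 40.0  # the article highlights speeds below 40 mph


def sparse_congestion_attention(q, k, v, speeds, thresh=CONGESTION_MPH):
    """Scaled dot-product attention in which keys at free-flowing nodes are
    masked out, so every query attends only to congested nodes."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)          # (n, n) attention logits
    free_flow = speeds >= thresh           # True = non-congested node
    scores[:, free_flow] = -1e9            # drop free-flow keys before softmax
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v


def congestion_weighted_mse(pred, target, thresh=CONGESTION_MPH, w_cong=4.0):
    """Adaptive-loss sketch: errors on congested (low-speed) samples are
    upweighted to counter the scarcity of congestion data in training sets."""
    weights = np.where(target < thresh, w_cong, 1.0)
    return float(np.mean(weights * (pred - target) ** 2))


# Usage: four nodes, two of them congested (20 and 30 mph).
rng = np.random.default_rng(0)
q, k, v = (rng.normal(size=(4, 8)) for _ in range(3))
speeds = np.array([20.0, 55.0, 30.0, 65.0])
out = sparse_congestion_attention(q, k, v, speeds)   # shape (4, 8)
```

The masking step is what makes the attention "sparse": softmax over the masked logits assigns essentially zero weight to free-flow nodes, concentrating the model's capacity on the congestion pattern the article says existing models underfit.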