Traditional lattice Boltzmann methods (LBM), constrained by localized collision-streaming rules, face challenges in modeling multiscale flow interactions such as vortex merging and pressure-wave propagation. We propose a hybrid deep learning framework that combines convolutional neural networks (CNNs) with self-attention mechanisms to augment LBM. The CNN compresses the distribution functions into low-dimensional features, and the attention weights are scaled by the local vorticity magnitude. The method is validated on two-dimensional Poiseuille flow and flow around a cylinder. The results show that the method converges to the steady state faster and predicts vortex-shedding frequencies more accurately than standard LBM. The vorticity-guided attention mechanism enables LBM to resolve long-range flow interactions without sacrificing parallel efficiency, showing promise for complex flow problems. Overall, this research aims to advance LBM in high-performance computing.
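The vorticity-guided attention described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the dot-product attention form, and the specific scaling factor `1 + |ω| / max|ω|` are all assumptions, since the abstract does not specify how the vorticity magnitude enters the attention weights.

```python
import numpy as np

def vorticity_magnitude(u, v, dx=1.0):
    # Central-difference curl of a 2D velocity field: omega = dv/dx - du/dy.
    dv_dx = np.gradient(v, dx, axis=1)
    du_dy = np.gradient(u, dx, axis=0)
    return np.abs(dv_dx - du_dy)

def vorticity_scaled_attention(features, vort, eps=1e-8):
    # features: (N, d) CNN-compressed descriptors, one per lattice site.
    # vort:     (N,)  local vorticity magnitude at each site.
    d = features.shape[1]
    logits = features @ features.T / np.sqrt(d)  # scaled dot-product scores
    # Assumed scaling form: amplify attention toward strongly vortical sites.
    logits = logits * (1.0 + vort[None, :] / (vort.max() + eps))
    # Row-wise softmax over the scaled scores.
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ features  # attended features, same shape as the input
```

Because each site attends over all others, this step supplies the long-range coupling that the purely local collision-streaming update lacks, at the cost of one dense attention pass per correction step.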