Improvement of lattice Boltzmann methods with attention and convolutional neural networks
No.: 6
Access: Participants only
Updated: 2025-04-10 21:15:45
Oral presentation
Abstract
Traditional lattice Boltzmann methods (LBM), constrained by localized collision-streaming rules, face challenges in modeling multiscale flow interactions such as vortex merging and pressure wave propagation. We propose a hybrid deep learning framework that combines convolutional neural networks (CNNs) and self-attention mechanisms to augment LBM. The CNN compresses distribution functions into low-dimensional features, and attention weights are scaled by the local vorticity magnitude. The method is validated on two-dimensional Poiseuille flow and flow around cylinders. The results show that this method achieves faster convergence to the steady state and higher accuracy in predicting vortex shedding frequencies than standard LBM. This vorticity-guided attention mechanism enables LBM to resolve long-range flow interactions without sacrificing parallel efficiency, showing promise for complex flow problems. Overall, this research aims to advance LBM for high-performance computing.
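The abstract's pipeline — recover macroscopic fields from D2Q9 distribution functions, compute local vorticity, and use its magnitude to scale self-attention over compressed features — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the CNN encoder is replaced by pre-compressed features passed in as an array, and the specific gating form `1 + |ω|` on the attention logits is a hypothetical choice for demonstration.

```python
import numpy as np

# D2Q9 lattice velocities (standard LBM convention)
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def macroscopic(f):
    """Density and velocity from distribution functions f of shape (H, W, 9)."""
    rho = f.sum(axis=-1)
    u = (f[..., :, None] * C).sum(axis=-2) / rho[..., None]
    return rho, u

def vorticity(u, dx=1.0):
    """2D vorticity via central differences: w = dv/dx - du/dy."""
    dvdx = np.gradient(u[..., 1], dx, axis=1)
    dudy = np.gradient(u[..., 0], dx, axis=0)
    return dvdx - dudy

def vorticity_scaled_attention(feat, w):
    """Self-attention over all lattice sites, with logits scaled by |vorticity|.

    feat: (H, W, d) compressed features (stand-in for the CNN output)
    w:    (H, W) vorticity field
    """
    H, W, d = feat.shape
    q = feat.reshape(H * W, d)
    scores = q @ q.T / np.sqrt(d)               # plain dot-product logits
    scale = 1.0 + np.abs(w).reshape(1, H * W)   # hypothetical vorticity gating
    scores = scores * scale
    a = np.exp(scores - scores.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)          # row-wise softmax
    return (a @ q).reshape(H, W, d)

# Quiescent equilibrium state: rho = 1, u = 0 everywhere
W9 = np.array([4/9] + [1/9]*4 + [1/36]*4)
f = np.tile(W9, (4, 4, 1))
rho, u = macroscopic(f)
omega = vorticity(u)
feat = np.random.default_rng(0).normal(size=(4, 4, 3))
out = vorticity_scaled_attention(feat, omega)
```

With zero vorticity the gating factor is uniform and the mechanism reduces to ordinary self-attention; near vortex cores the larger scale sharpens the attention distribution, which is one plausible way to bias the model toward the long-range interactions the abstract targets.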
Keywords
lattice Boltzmann method, attention mechanism, neural network, parallel computing
Authors
Wei Li
Jiangsu University
Hao Liu
Jiangsu University