Improvement of lattice Boltzmann methods with attention and convolutional neural networks
No. 6 · Access: attendees only · Updated: 2025-04-10 21:15:45 · Oral presentation


Abstract
Traditional lattice Boltzmann methods (LBM), constrained by localized collision-streaming rules, struggle to model multiscale flow interactions such as vortex merging and pressure-wave propagation. We propose a hybrid deep learning framework that combines convolutional neural networks (CNNs) with self-attention mechanisms to augment LBM. The CNN compresses distribution functions into low-dimensional features, and attention weights are scaled by the local vorticity magnitude. The method is validated on two-dimensional Poiseuille flow and flow around a cylinder. The results show that the method converges to the steady state faster and predicts vortex-shedding frequencies more accurately than standard LBM. This vorticity-guided attention mechanism enables LBM to resolve long-range flow interactions without sacrificing parallel efficiency, showing promise for complex flow problems. Overall, this work aims to advance LBM in high-performance computing.
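The vorticity-guided attention idea can be illustrated with a minimal sketch: compute the 2D vorticity from the velocity field, then scale dot-product attention scores between per-cell feature vectors (e.g., CNN-compressed distribution functions) by the vorticity magnitude of each key cell. The function names, the dot-product attention form, and the exact placement of the vorticity scaling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def vorticity_2d(u, v, dx=1.0):
    """Vorticity omega = dv/dx - du/dy via central differences."""
    dv_dx = np.gradient(v, dx, axis=1)
    du_dy = np.gradient(u, dx, axis=0)
    return dv_dx - du_dy

def vorticity_scaled_attention(features, omega):
    """Dot-product self-attention with key scores scaled by |omega|.

    features: (N, d) per-cell feature vectors (assumed CNN output)
    omega:    (N,)   vorticity sampled at the same cells
    Returns:  (N, d) attended features.
    """
    d = features.shape[1]
    scores = features @ features.T / np.sqrt(d)   # standard dot-product scores
    scores = scores * np.abs(omega)[None, :]      # emphasize high-vorticity keys
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True) # softmax over key cells
    return weights @ features
```

With zero vorticity everywhere the scaling nullifies all scores, the softmax becomes uniform, and every output row reduces to the mean feature vector; nonzero vorticity biases each cell's attention toward rotational regions.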
 
Keywords
lattice Boltzmann method, attention mechanism, neural network, parallel computing
Presenter
Hao Liu
Graduate student, Jiangsu University

Authors
Wei Li, Jiangsu University
Hao Liu, Jiangsu University
Important dates
  • Conference dates: July 3–6, 2025
  • Initial manuscript deadline: June 25, 2025

Hosted by
Harbin Engineering University, China
Organized by
Harbin Engineering University, China