Deep learning for turbulent channel flow


Abstract

Turbulence modeling is a classical approach to addressing the multiscale nature of fluid turbulence. Instead of resolving all scales of motion, which is currently mathematically and computationally intractable, reduced models that capture the large-scale behavior are derived. One of the most widely used reduced models is the Reynolds-averaged Navier-Stokes (RANS) equations, which are solved for the mean velocity and pressure fields. However, the RANS equations contain a term, the Reynolds stress tensor, that is not known in terms of the mean velocity field. Many RANS turbulence models have been proposed to express the Reynolds stress tensor in terms of the mean velocity field, but they are usually not sufficiently general for all flow fields of interest. Data-driven turbulence models have recently garnered considerable attention and have developed rapidly. In a seminal work, Ling et al. (2016) developed the tensor basis neural network (TBNN), which was used to learn a general Galilean-invariant model for the Reynolds stress tensor. The TBNN was applied to a variety of flow fields with encouraging results. In the present study, the TBNN is applied to turbulent channel flow. Its performance is compared with classical turbulence models as well as with a neural network model that does not preserve Galilean invariance. A sensitivity study of the TBNN reveals that the network attempts to adjust to the dataset but is limited by the mathematical form that guarantees Galilean invariance.
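To make the TBNN idea concrete, the sketch below shows the core structure described by Ling et al. (2016): scalar invariants of the normalized mean strain-rate tensor S and rotation-rate tensor R are fed to a network that outputs one coefficient per basis tensor, and the predicted anisotropy is the coefficient-weighted sum of those tensors. This is a minimal illustration, not the paper's implementation: the layer sizes, random weights, and the truncation to four of the ten integrity-basis tensors are all assumptions made here for brevity.

```python
import numpy as np

def tensor_basis(S, R):
    """First few of the ten integrity-basis tensors (illustrative truncation)."""
    I = np.eye(3)
    T1 = S
    T2 = S @ R - R @ S
    T3 = S @ S - np.trace(S @ S) / 3 * I
    T4 = R @ R - np.trace(R @ R) / 3 * I
    return [T1, T2, T3, T4]

def invariants(S, R):
    """Scalar invariants of S and R used as network inputs."""
    return np.array([
        np.trace(S @ S),
        np.trace(R @ R),
        np.trace(S @ S @ S),
        np.trace(R @ R @ S),
        np.trace(R @ R @ S @ S),
    ])

def tbnn_forward(S, R, W1, b1, W2, b2):
    """One-hidden-layer MLP: invariants -> coefficients g_n -> anisotropy b."""
    lam = invariants(S, R)
    h = np.tanh(W1 @ lam + b1)      # hidden layer
    g = W2 @ h + b2                 # one coefficient per basis tensor
    # A linear combination of basis tensors with invariant-valued coefficients
    # is what guarantees Galilean (and rotational) invariance of the model form.
    return sum(gn * Tn for gn, Tn in zip(g, tensor_basis(S, R)))

# Random weights, just to exercise the forward pass.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 5)), np.zeros(8)
W2, b2 = rng.normal(size=(4, 8)), np.zeros(4)

# Stand-in mean velocity gradient, made traceless (incompressible flow).
A = rng.normal(size=(3, 3))
A -= np.trace(A) / 3 * np.eye(3)
S, R = 0.5 * (A + A.T), 0.5 * (A - A.T)
b = tbnn_forward(S, R, W1, b1, W2, b2)
print(b.shape)  # (3, 3)
```

Because the inputs are invariants and the outputs are built from tensors that transform correctly under rotations, rotating the input gradient rotates the predicted anisotropy consistently; the plain network mentioned in the study, which regresses the stress components directly, offers no such guarantee.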
