
Neural Network Models for the Anisotropic Reynolds Stress Tensor in Turbulent Channel Flow

Added by Rui Fang
Publication date: 2019
Field: Physics
Language: English





Reynolds-averaged Navier-Stokes (RANS) equations are presently one of the most popular models for simulating turbulence. Performing a RANS simulation requires additional modeling for the anisotropic Reynolds stress tensor, but traditional Reynolds stress closure models yield only partially reliable predictions. Recently, data-driven turbulence models for the Reynolds anisotropy tensor based on novel machine learning techniques have garnered considerable attention and have been rapidly developed. Focusing on modeling the Reynolds stress closure for the specific case of turbulent channel flow, this paper proposes three modifications to a standard neural network to account for the no-slip boundary condition of the anisotropy tensor, the Reynolds number dependence, and spatial non-locality. The modified models are shown to provide increased predictive accuracy compared to the standard neural network when trained and tested on channel flow at different Reynolds numbers. The best performance is achieved by the model combining boundary condition enforcement and Reynolds number injection. This model also outperforms the Tensor Basis Neural Network (Ling et al., 2016) on the turbulent channel flow dataset.
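Two of the three modifications can be illustrated with a toy sketch: Reynolds-number injection (an extra network input) and exact boundary-condition enforcement (a wall damping factor applied to the network output). Everything below is a placeholder, not taken from the paper: the network sizes, the random weights, and the assumed wall-limit value `b_wall`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder weights for a tiny MLP (illustration only).
W1 = rng.normal(size=(2, 16)) * 0.1   # inputs: (y/h, log Re_tau)
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 6)) * 0.1   # outputs: 6 unique anisotropy components
b2 = np.zeros(6)

# Assumed wall-limit value of the anisotropy components (placeholder).
b_wall = np.array([-1/3, 1/6, 1/6, 0.0, 0.0, 0.0])

def predict_anisotropy(y_over_h, re_tau):
    """Toy forward pass: Reynolds-number injection via an extra input
    feature, boundary-condition enforcement via a damping factor that
    vanishes at the wall (y = 0)."""
    x = np.stack([y_over_h, np.full_like(y_over_h, np.log(re_tau))], axis=-1)
    h = np.tanh(x @ W1 + b1)
    raw = h @ W2 + b2
    # Multiplying by y/h makes the prediction equal b_wall at the wall
    # exactly, regardless of the learned weights.
    damping = y_over_h[..., None]
    return b_wall + damping * raw

y = np.linspace(0.0, 1.0, 5)
b = predict_anisotropy(y, re_tau=550.0)
```

Because the correction is multiplied by the damping factor rather than learned freely, the wall condition holds by construction instead of being only approximately fit from data.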




Related Research

The Reynolds-averaged Navier-Stokes (RANS) equations are widely used in turbulence applications. They require accurately modeling the anisotropic Reynolds stress tensor, for which traditional Reynolds stress closure models only yield reliable results in some flow configurations. In the last few years, there has been a surge of work aiming at using data-driven approaches to tackle this problem. The majority of previous work has focused on the development of fully-connected networks for modeling the anisotropic Reynolds stress tensor. In this paper, we expand upon recent work for turbulent channel flow and develop new convolutional neural network (CNN) models that are able to accurately predict the normalized anisotropic Reynolds stress tensor. We apply the new CNN models to a number of one-dimensional turbulent flows. Additionally, we present interpretability techniques that help drive the model design and provide guidance on the model behavior in relation to the underlying physics.
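The advantage of a convolutional model over a fully-connected one here is spatial non-locality: each output depends on a neighbourhood of wall-normal points rather than a single location. A minimal sketch, assuming a one-dimensional profile and a hypothetical (untrained) filter:

```python
import numpy as np

def conv1d_same(profile, kernel):
    """1-D convolution with zero padding and 'same' output length:
    each output point mixes information from neighbouring wall-normal
    locations, which is how a CNN encodes spatial non-locality."""
    k = len(kernel)
    pad = k // 2
    padded = np.pad(profile, pad)
    return np.array([padded[i:i + k] @ kernel for i in range(len(profile))])

# Illustrative input: a mean-velocity-gradient profile on 8 grid points.
dudy = np.linspace(1.0, 0.0, 8)
kernel = np.array([0.25, 0.5, 0.25])  # placeholder filter, not learned here
features = conv1d_same(dudy, kernel)
```

In an actual CNN the kernel weights are learned, and several such layers are stacked so that the receptive field grows with depth.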
Turbulence modeling is a classical approach to address the multiscale nature of fluid turbulence. Instead of resolving all scales of motion, which is currently mathematically and numerically intractable, reduced models that capture the large-scale behavior are derived. One of the most popular reduced models is the Reynolds-averaged Navier-Stokes (RANS) equations. The goal is to solve the RANS equations for the mean velocity and pressure fields. However, the RANS equations contain a term called the Reynolds stress tensor, which is not known in terms of the mean velocity field. Many RANS turbulence models have been proposed to model the Reynolds stress tensor in terms of the mean velocity field, but they are usually not suitably general for all flow fields of interest. Data-driven turbulence models have recently garnered considerable attention and have been rapidly developed. In a seminal work, Ling et al. (2016) developed the tensor basis neural network (TBNN), which was used to learn a general Galilean invariant model for the Reynolds stress tensor. The TBNN was applied to a variety of flow fields with encouraging results. In the present study, the TBNN is applied to the turbulent channel flow. Its performance is compared with classical turbulence models as well as a neural network model that does not preserve Galilean invariance. A sensitivity study on the TBNN reveals that the network attempts to adjust to the dataset but is limited by the mathematical form that guarantees Galilean invariance.
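The "mathematical form that guarantees Galilean invariance" is the tensor-basis expansion: the anisotropy is written as a linear combination of isotropic basis tensors built from the mean strain-rate and rotation-rate tensors, with the network predicting only the scalar coefficients. A sketch of the first four basis tensors (the full Pope basis has ten); the coefficients and the nondimensionalization factor below are placeholders:

```python
import numpy as np

def tensor_basis(S, R):
    """First four of the ten isotropic basis tensors used by the TBNN,
    built from the nondimensionalized mean strain-rate S and
    rotation-rate R."""
    I = np.eye(3)
    T1 = S
    T2 = S @ R - R @ S
    T3 = S @ S - np.trace(S @ S) / 3 * I
    T4 = R @ R - np.trace(R @ R) / 3 * I
    return [T1, T2, T3, T4]

def tbnn_anisotropy(g, S, R):
    """TBNN output layer: a linear combination of the basis tensors;
    the coefficients g would come from the network's hidden layers."""
    return sum(gn * Tn for gn, Tn in zip(g, tensor_basis(S, R)))

# Channel-flow-like example: pure shear dU/dy.
k = 0.5  # placeholder nondimensionalization factor
S = k * np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], dtype=float)
R = k * np.array([[0, 1, 0], [-1, 0, 0], [0, 0, 0]], dtype=float)
g = [-0.1, 0.02, 0.01, 0.01]  # placeholder coefficients
b = tbnn_anisotropy(g, S, R)
```

Whatever coefficients the network outputs, the result is symmetric and traceless by construction, which is exactly the structural constraint the sensitivity study identifies as both the strength and the limitation of the TBNN.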
We investigate the applicability of a machine-learning-based reduced order model (ML-ROM) to three-dimensional complex flows. As an example, we consider a turbulent channel flow at the friction Reynolds number of $Re_\tau = 110$ in a minimal domain which can maintain coherent structures of turbulence. The training dataset is prepared by direct numerical simulation (DNS). The present ML-ROM is constructed by combining a three-dimensional convolutional neural network autoencoder (CNN-AE) and a long short-term memory (LSTM) network. The CNN-AE works to map high-dimensional flow fields into a low-dimensional latent space. The LSTM is then utilized to predict the temporal evolution of the latent vectors obtained by the CNN-AE. The combination of CNN-AE and LSTM can represent the spatio-temporal high-dimensional dynamics of flow fields by integrating only the temporal evolution of the low-dimensional latent dynamics. The turbulent flow fields reproduced by the present ML-ROM show statistical agreement with the reference DNS data in a time-ensemble sense, which can also be found through an orbit-based analysis. Influences of the population of vortical structures contained in the domain and the time interval used for temporal prediction on the ML-ROM performance are also investigated. The potential and limitations of the present ML-ROM for turbulence analysis are discussed at the end of our presentation.
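The key structural idea of this ML-ROM is that the expensive full field is encoded once, the dynamics are integrated entirely in the latent space, and decoding happens only when a field is needed. A toy sketch with linear stand-ins (random matrices replace the trained CNN-AE, and a contraction matrix replaces the trained LSTM; all sizes are placeholders):

```python
import numpy as np

rng = np.random.default_rng(1)
N, Z = 64, 4  # toy flow-field dimension and latent dimension

# Linear stand-ins for the trained components of the ML-ROM.
E = rng.normal(size=(Z, N)) / np.sqrt(N)  # encoder stand-in (CNN-AE)
D = E.T                                    # decoder stand-in
A = 0.9 * np.eye(Z)                        # latent dynamics stand-in (LSTM)

def rollout(u0, steps):
    """ML-ROM loop: encode once, integrate only the low-dimensional
    latent state, decode whenever a full field is needed."""
    z = E @ u0
    fields = []
    for _ in range(steps):
        z = A @ z             # LSTM stand-in: advance the latent vector
        fields.append(D @ z)  # decoder stand-in: reconstruct the field
    return np.stack(fields)

u0 = rng.normal(size=N)
traj = rollout(u0, steps=10)
```

The cost of the time loop scales with the latent dimension `Z`, not the field dimension `N`, which is what makes the approach attractive for three-dimensional turbulence.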
Despite their well-known limitations, RANS models remain the most commonly employed tool for modeling turbulent flows in engineering practice. RANS models are predicated on the solution of the RANS equations, but these equations involve an unclosed term, the Reynolds stress tensor, which must be modeled. The Reynolds stress tensor is often modeled as an algebraic function of mean flow field variables and turbulence variables. This introduces a discrepancy between the Reynolds stress tensor predicted by the model and the exact Reynolds stress tensor. This discrepancy can result in inaccurate mean flow field predictions. In this paper, we introduce a data-informed approach for arriving at Reynolds stress models with improved predictive performance. Our approach relies on learning the components of the Reynolds stress discrepancy tensor associated with a given Reynolds stress model in the mean strain-rate tensor eigenframe. These components are typically smooth and hence simple to learn using state-of-the-art machine learning strategies and regression techniques. Our approach automatically yields Reynolds stress models that are symmetric, and it yields Reynolds stress models that are both Galilean and frame invariant provided the inputs are themselves Galilean and frame invariant. To arrive at computable models of the discrepancy tensor, we employ feed-forward neural networks and an input space spanning the integrity basis of the mean strain-rate tensor, the mean rotation-rate tensor, the mean pressure gradient, and the turbulent kinetic energy gradient, and we introduce a framework for dimensional reduction of the input space to further reduce computational cost. Numerical results illustrate the effectiveness of the proposed approach for data-informed Reynolds stress closure for a suite of turbulent flow problems of increasing complexity.
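The eigenframe construction described above can be sketched directly: the discrepancy tensor is rotated into the eigenvector basis of the mean strain-rate tensor, its components are learned there, and the result is rotated back. The tensors below are illustrative values, not data from the paper:

```python
import numpy as np

def to_eigenframe(S, delta):
    """Rotate a Reynolds-stress discrepancy tensor `delta` into the
    eigenframe of the mean strain-rate tensor S; components in this
    frame are the smooth regression targets described above."""
    _, V = np.linalg.eigh(S)   # columns: orthonormal eigenvectors of S
    return V.T @ delta @ V

def from_eigenframe(S, delta_ef):
    """Inverse map: rotate learned eigenframe components back to the
    lab frame."""
    _, V = np.linalg.eigh(S)
    return V @ delta_ef @ V.T

# Illustrative pure-shear strain rate and a symmetric discrepancy.
S = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 0.0]])
delta = np.diag([0.2, -0.1, -0.1])
delta_ef = to_eigenframe(S, delta)
```

Because the rotation is orthogonal, symmetry of the discrepancy tensor is preserved in both directions, consistent with the symmetry guarantee claimed for the resulting Reynolds stress models.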
A novel machine learning algorithm is presented, serving as a data-driven turbulence modeling tool for Reynolds-averaged Navier-Stokes (RANS) simulations. This machine learning algorithm, called the Tensor Basis Random Forest (TBRF), is used to predict the Reynolds-stress anisotropy tensor, while guaranteeing Galilean invariance by making use of a tensor basis. By modifying a random forest algorithm to accept such a tensor basis, a robust, easy-to-implement, and easy-to-train algorithm is created. The algorithm is trained on several flow cases using DNS/LES data, and used to predict the Reynolds stress anisotropy tensor for new, unseen flows. The resulting predictions of turbulence anisotropy are used as a turbulence model within a custom RANS solver. Stabilization of this solver is necessary, and is achieved by a continuation method and a modified $k$-equation. Results are compared to the neural network approach of Ling et al. [J. Fluid Mech., 807:155-166 (2016)]. Results show that the TBRF algorithm is able to accurately predict the anisotropy tensor for various flow cases, with realizable predictions close to the DNS/LES reference data. Corresponding mean flows for a square duct flow case and a backward-facing step flow case show good agreement with DNS and experimental datasets. Overall, these results are seen as a next step towards improved data-driven modeling of turbulence. This creates an opportunity to generate custom turbulence closures for specific classes of flows, limited only by the availability of LES/DNS data.
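The TBRF idea can be sketched in miniature: each regression tree maps scalar invariants to a vector of tensor-basis coefficients, the forest averages those coefficient vectors, and the averaged coefficients then multiply the shared (Galilean-invariant) tensor basis. The stump trees, thresholds, and coefficient values below are placeholders, not trained values:

```python
import numpy as np

def tree_predict(invariant, split, g_left, g_right):
    """One toy regression stump: threshold on a scalar invariant and
    return a vector of basis coefficients (placeholder values)."""
    return g_left if invariant < split else g_right

def tbrf_coefficients(invariant, trees):
    """Random-forest step of a TBRF-like model: average the per-tree
    coefficient vectors. The averaged coefficients then multiply the
    shared tensor basis, so Galilean invariance is inherited from the
    basis itself rather than from the regressor."""
    preds = [tree_predict(invariant, *t) for t in trees]
    return np.mean(preds, axis=0)

# Two placeholder stumps predicting coefficients for a 2-tensor basis.
trees = [
    (0.5, np.array([-0.10, 0.02]), np.array([-0.12, 0.03])),
    (0.8, np.array([-0.08, 0.01]), np.array([-0.14, 0.04])),
]
g = tbrf_coefficients(0.6, trees)
```

Averaging in coefficient space rather than tensor space is what lets an off-the-shelf ensemble method be combined with the invariant tensor-basis formulation.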
