
S-Frame Discrepancy Correction Models for Data-Informed Reynolds Stress Closure

Posted by John Evans
Publication date: 2020
Research field: Physics
Paper language: English





Despite their well-known limitations, RANS models remain the most commonly employed tool for modeling turbulent flows in engineering practice. RANS models are predicated on the solution of the RANS equations, but these equations involve an unclosed term, the Reynolds stress tensor, which must be modeled. The Reynolds stress tensor is often modeled as an algebraic function of mean flow field variables and turbulence variables. This introduces a discrepancy between the Reynolds stress tensor predicted by the model and the exact Reynolds stress tensor. This discrepancy can result in inaccurate mean flow field predictions. In this paper, we introduce a data-informed approach for arriving at Reynolds stress models with improved predictive performance. Our approach relies on learning the components of the Reynolds stress discrepancy tensor associated with a given Reynolds stress model in the mean strain-rate tensor eigenframe. These components are typically smooth and hence simple to learn using state-of-the-art machine learning strategies and regression techniques. Our approach automatically yields Reynolds stress models that are symmetric, and it yields Reynolds stress models that are both Galilean and frame invariant provided the inputs are themselves Galilean and frame invariant. To arrive at computable models of the discrepancy tensor, we employ feed-forward neural networks and an input space spanning the integrity basis of the mean strain-rate tensor, the mean rotation-rate tensor, the mean pressure gradient, and the turbulent kinetic energy gradient, and we introduce a framework for dimensional reduction of the input space to further reduce computational cost. Numerical results illustrate the effectiveness of the proposed approach for data-informed Reynolds stress closure for a suite of turbulent flow problems of increasing complexity.
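A minimal sketch of the core idea, assuming NumPy and scikit-learn with illustrative input and output shapes (this is not the authors' implementation): the mean strain-rate eigenframe is obtained from an eigendecomposition, the discrepancy tensor is rotated into that frame, and a small feed-forward network regresses the rotated components from invariant inputs.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    def strain_rate_eigenframe(S):
        """Orthonormal eigenvectors of the symmetric mean strain-rate tensor S."""
        _, V = np.linalg.eigh(S)                  # columns of V span the S-frame
        return V

    def to_s_frame(tensor, V):
        """Express a second-order tensor in the strain-rate eigenframe."""
        return V.T @ tensor @ V

    rng = np.random.default_rng(0)
    S = rng.normal(size=(3, 3)); S = 0.5 * (S + S.T)      # symmetric example tensor
    d = rng.normal(size=(3, 3)); d = 0.5 * (d + d.T)      # example discrepancy tensor
    d_sframe = to_s_frame(d, strain_rate_eigenframe(S))   # components to be learned

    # Regression of the six unique S-frame components from invariant inputs q.
    q_train = rng.normal(size=(1000, 5))                  # invariant features (placeholder data)
    d_train = rng.normal(size=(1000, 6))                  # discrepancy components (placeholder data)
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500).fit(q_train, d_train)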


Read also

Reynolds-averaged Navier-Stokes (RANS) equations are presently one of the most popular models for simulating turbulence. Performing a RANS simulation requires additional modeling for the anisotropic Reynolds stress tensor, but traditional Reynolds stress closure models lead to only partially reliable predictions. Recently, data-driven turbulence models for the Reynolds anisotropy tensor involving novel machine learning techniques have garnered considerable attention and have been rapidly developed. Focusing on modeling the Reynolds stress closure for the specific case of turbulent channel flow, this paper proposes three modifications to a standard neural network to account for the no-slip boundary condition of the anisotropy tensor, the Reynolds number dependence, and spatial non-locality. The modified models are shown to provide increased predictive accuracy compared to the standard neural network when they are trained and tested on channel flow at different Reynolds numbers. The best performance is yielded by the model combining the boundary condition enforcement and Reynolds number injection. This model also outperforms the Tensor Basis Neural Network (Ling et al., 2016) on the turbulent channel flow dataset.
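A hedged sketch of two of the modifications described above, using PyTorch; the layer widths, the damping function f(y), and the wall-limit values are illustrative assumptions, not the paper's formulation. The friction Reynolds number is injected as an extra input, and the wall value of the anisotropy is enforced by construction, b(y) = b_wall + f(y) * NN(...), with f vanishing at the wall.

    import torch
    import torch.nn as nn

    class ChannelAnisotropyNet(nn.Module):
        """b(y) = b_wall + f(y) * NN([features, log Re_tau]), with f(0) = 0."""
        def __init__(self, n_features=3, n_outputs=6, width=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(n_features + 1, width), nn.ReLU(),
                nn.Linear(width, width), nn.ReLU(),
                nn.Linear(width, n_outputs),
            )
            # Prescribed wall-limit anisotropy (placeholder values).
            self.register_buffer("b_wall", torch.zeros(n_outputs))

        def forward(self, features, y, re_tau):
            re_feat = torch.log(re_tau) * torch.ones(features.shape[0], 1)  # Re injection
            raw = self.net(torch.cat([features, re_feat], dim=1))
            return self.b_wall + y * raw          # f(y) = y vanishes at the wall

    model = ChannelAnisotropyNet()
    b = model(torch.randn(16, 3), torch.rand(16, 1), torch.tensor(550.0))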
A novel machine learning algorithm is presented, serving as a data-driven turbulence modeling tool for Reynolds-Averaged Navier-Stokes (RANS) simulations. This machine learning algorithm, called the Tensor Basis Random Forest (TBRF), is used to predict the Reynolds-stress anisotropy tensor, while guaranteeing Galilean invariance by making use of a tensor basis. By modifying a random forest algorithm to accept such a tensor basis, a robust, easy-to-implement, and easy-to-train algorithm is created. The algorithm is trained on several flow cases using DNS/LES data, and used to predict the Reynolds stress anisotropy tensor for new, unseen flows. The resulting predictions of turbulence anisotropy are used as a turbulence model within a custom RANS solver. Stabilization of this solver is necessary, and is achieved by a continuation method and a modified $k$-equation. Results are compared to the neural network approach of Ling et al. [J. Fluid Mech., 807:155-166, 2016]. Results show that the TBRF algorithm is able to accurately predict the anisotropy tensor for various flow cases, with realizable predictions close to the DNS/LES reference data. Corresponding mean flows for a square duct flow case and a backward-facing step flow case show good agreement with DNS and experimental datasets. Overall, these results are seen as a next step towards improved data-driven modelling of turbulence. This creates an opportunity to generate custom turbulence closures for specific classes of flows, limited only by the availability of LES/DNS data.
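A simplified stand-in for the tensor-basis construction, not the TBRF algorithm itself (which modifies the tree-fitting step): a standard random forest predicts scalar coefficients from invariants, and the anisotropy is assembled as a linear combination of basis tensors, truncated here to the first two tensors of Pope's basis for brevity. All data and shapes are placeholders.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def tensor_basis(S, R):
        """First two basis tensors: T1 = S, T2 = S R - R S (both symmetric)."""
        return np.stack([S, S @ R - R @ S])

    rng = np.random.default_rng(1)
    invariants = rng.normal(size=(500, 2))        # e.g. tr(S^2), tr(R^2) per point
    g_targets = rng.normal(size=(500, 2))         # coefficients fitted to DNS/LES data
    forest = RandomForestRegressor(n_estimators=100).fit(invariants, g_targets)

    S = rng.normal(size=(3, 3)); S = 0.5 * (S + S.T)      # symmetric strain-rate tensor
    R = rng.normal(size=(3, 3)); R = 0.5 * (R - R.T)      # antisymmetric rotation-rate tensor
    g = forest.predict(invariants[:1])[0]                 # coefficients at one point
    b = np.tensordot(g, tensor_basis(S, R), axes=1)       # symmetric anisotropy prediction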
Yilang Liu, Weiwei Zhang (2020)
This paper proposes a new data assimilation method for recovering the high-fidelity turbulent flow field around an airfoil at high Reynolds numbers based on experimental data, called the Proper Orthogonal Decomposition Inversion (POD-Inversion) data assimilation method. Aimed at flows involving shock-wave discontinuities or separation at high angle of attack, the proposed method can reconstruct the high-fidelity turbulent flow field by combining it with experimentally measured distributed force coefficients. We first perform a POD analysis of the turbulent eddy-viscosity fields computed by the SA model and obtain the base POD modes. The POD coefficients are then optimized by a global optimization algorithm coupled with a Navier-Stokes solver. The high-fidelity turbulent field is recovered from several dominant modes, which dramatically reduces the dimension of the system. The effectiveness of the method is verified for transonic flow around the RAE2822 airfoil at high Reynolds numbers and for separated flow at high angles of attack. The results demonstrate that the proposed assimilation method can recover a turbulent flow field that optimally matches the experimental data and significantly reduces the error in the pressure coefficients. The proposed data assimilation method can thus offer high-fidelity field data for machine-learning-based turbulence models.
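A minimal sketch of the POD step and the coefficient optimization, assuming NumPy/SciPy and synthetic data; the function solve_pressure_coefficients is a hypothetical placeholder for the RANS solve with the reconstructed eddy-viscosity field (mocked here so the snippet runs), and a local optimizer stands in for the global one used in the paper.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(2)
    # Snapshot matrix: each column is an eddy-viscosity field from an SA-model run.
    snapshots = rng.normal(size=(2000, 20))               # (grid points, snapshots)
    mean_field = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean_field, full_matrices=False)
    modes = U[:, :4]                                      # keep a few dominant POD modes

    cp_exp = rng.normal(size=50)                          # measured pressure/force data (placeholder)

    def solve_pressure_coefficients(nu_t):                # hypothetical RANS solve (mocked)
        return 1e-3 * nu_t[:50]

    def objective(a):
        nu_t = mean_field[:, 0] + modes @ a               # reconstructed viscosity field
        return np.sum((solve_pressure_coefficients(nu_t) - cp_exp) ** 2)

    # The paper uses a global optimizer; a local simplex search stands in here.
    result = minimize(objective, x0=np.zeros(4), method="Nelder-Mead")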
Near-wall blood flow and wall shear stress (WSS) regulate major forms of cardiovascular disease, yet they are challenging to quantify with high fidelity. Patient-specific computational and experimental measurement of WSS suffers from uncertainty, low resolution, and noise. Physics-informed neural networks (PINN) provide a flexible deep learning framework to integrate the mathematical equations governing blood flow with measurement data. By leveraging knowledge of the governing equations (herein, Navier-Stokes), PINN overcomes the large data requirement of deep learning. In this study, it was shown how PINN could be used to improve WSS quantification in diseased arterial flows. Specifically, blood flow problems where the inlet and outlet boundary conditions were not known were solved by assimilating very few measurement points. Uncertainty in boundary conditions is a common feature of patient-specific computational fluid dynamics models. It was shown that PINN could use sparse velocity measurements away from the wall to quantify WSS with very high accuracy, even without full knowledge of the boundary conditions. Examples in idealized stenosis and aneurysm models were considered, demonstrating how partial knowledge about the flow physics can be combined with partial measurements to obtain accurate near-wall blood flow data. The proposed hybrid data-driven and physics-based deep learning framework has high potential for transforming high-fidelity near-wall hemodynamics modeling in cardiovascular disease.
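A hedged sketch of the PINN ingredients described above for a 2D steady incompressible flow (not the authors' code): one network maps (x, y) to (u, v, p), automatic differentiation supplies the Navier-Stokes residuals, and the loss combines sparse velocity measurements with the PDE residual at collocation points. The viscosity, network size, and sample points are illustrative.

    import torch
    import torch.nn as nn

    net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(),
                        nn.Linear(64, 64), nn.Tanh(),
                        nn.Linear(64, 3))                  # outputs (u, v, p)
    nu = 0.01                                              # kinematic viscosity (assumed value)

    def ns_residuals(xy):
        xy = xy.requires_grad_(True)
        u, v, p = net(xy).unbind(dim=1)
        grad = lambda f: torch.autograd.grad(f.sum(), xy, create_graph=True)[0]
        u_x, u_y = grad(u).unbind(dim=1)
        v_x, v_y = grad(v).unbind(dim=1)
        p_x, p_y = grad(p).unbind(dim=1)
        u_xx, u_yy = grad(u_x)[:, 0], grad(u_y)[:, 1]
        v_xx, v_yy = grad(v_x)[:, 0], grad(v_y)[:, 1]
        mom_x = u * u_x + v * u_y + p_x - nu * (u_xx + u_yy)   # x-momentum residual
        mom_y = u * v_x + v * v_y + p_y - nu * (v_xx + v_yy)   # y-momentum residual
        return mom_x, mom_y, u_x + v_y                          # plus continuity residual

    xy_data, uv_data = torch.rand(20, 2), torch.rand(20, 2)     # sparse velocity measurements
    xy_pde = torch.rand(500, 2)                                 # collocation points
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(100):
        opt.zero_grad()
        data_loss = ((net(xy_data)[:, :2] - uv_data) ** 2).mean()
        pde_loss = sum((r ** 2).mean() for r in ns_residuals(xy_pde))
        (data_loss + pde_loss).backward()
        opt.step()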
Bo Liu, Huiyang Yu, Haibo Huang (2021)
A nonlocal subgrid-scale stress (SGS) model is developed based on a convolutional neural network (CNN), a powerful supervised data-driven approach. The CNN is an ideal approach for naturally incorporating nonlocal spatial information in predictions due to its wide receptive field. The CNN-based models used here take only primitive flow variables as input; the flow features are then extracted automatically without any a priori guidance. The nonlocal models, trained on direct numerical simulation (DNS) data of a turbulent channel flow at $Re_\tau = 178$, are assessed in both a priori and a posteriori tests, providing physically reasonable flow statistics (such as mean velocity and velocity fluctuations) close to the DNS results even when extrapolating to a higher Reynolds number, $Re_\tau = 600$. In our model, the backscatter is also predicted well and the numerical simulation is stable. The nonlocal models outperform local data-driven models such as artificial neural networks and some SGS models, e.g. the Smagorinsky model, in actual large eddy simulation (LES). The model is also robust, since stable solutions are obtained when the grid resolution is varied from one-half to double the spatial resolution used in training. We also investigate the influence of the receptive field and suggest using two-point correlation analysis as a quantitative method to guide the design of nonlocal physical models. To facilitate coupling machine learning (ML) algorithms with computational fluid dynamics (CFD), a novel heterogeneous ML-CFD framework is proposed. The present study provides effective data-driven nonlocal methods for SGS modelling in the LES of complex anisotropic turbulent flows.
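An illustrative sketch of a CNN mapping filtered velocity fields to SGS stress components, in PyTorch; the channel counts, kernel sizes, and grid size are assumptions, not the paper's architecture. Stacked convolutions provide the wide receptive field that carries the nonlocal spatial information.

    import torch
    import torch.nn as nn

    sgs_net = nn.Sequential(
        nn.Conv3d(3, 32, kernel_size=5, padding=2), nn.ReLU(),   # 3 filtered velocity components in
        nn.Conv3d(32, 32, kernel_size=5, padding=2), nn.ReLU(),  # depth widens the receptive field
        nn.Conv3d(32, 6, kernel_size=5, padding=2),               # 6 unique SGS stress components out
    )

    filtered_velocity = torch.randn(1, 3, 32, 32, 32)             # (batch, u_i, x, y, z)
    tau_sgs = sgs_net(filtered_velocity)                          # (1, 6, 32, 32, 32)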