The present study proposes a data-driven framework trained with high-fidelity simulation results to facilitate decision making for combustor designs. At its core is a surrogate model employing a machine-learning technique called kriging, which is combined with data-driven basis functions to extract and model the underlying coherent structures. This emulation framework encompasses sensitivity analysis of key design parameters, physics-guided classification of design parameter sets, and flow evolution modeling for efficient design surveys. To better inform the model with quantifiable physical knowledge, a sensitivity analysis using Sobol indices and a decision tree are incorporated into the framework. This information improves the surrogate model training process, which employs basis functions as regression functions over the design space for the kriging model. The novelty of the proposed approach lies in the construction of the model through Common Proper Orthogonal Decomposition, which enables data reduction and extraction of common coherent structures. The accuracy of the predicted mean flow features for new swirl injector designs is assessed, and the dynamic flowfield is captured in the form of power spectral densities. The framework also quantifies the uncertainty of its predictions, providing a metric of model fit. The significantly reduced computation time required to evaluate new design points enables efficient survey of the design space.
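A minimal sketch of the core idea, assuming a snapshot-matrix POD and a scikit-learn Gaussian process as the kriging component; the function names, array shapes, and kernel choice below are illustrative, not the authors' implementation:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def pod_basis(snapshots, n_modes):
    """snapshots: (n_points, n_cases) matrix of simulated flowfields."""
    mean = snapshots.mean(axis=1, keepdims=True)
    U, _, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    return mean, U[:, :n_modes]            # mean field and leading POD modes

def fit_emulator(design_params, snapshots, n_modes=5):
    """design_params: (n_cases, n_dim) design matrix; one kriging model per mode."""
    mean, modes = pod_basis(snapshots, n_modes)
    coeffs = modes.T @ (snapshots - mean)  # (n_modes, n_cases) modal coefficients
    models = []
    for k in range(n_modes):
        gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
        gp.fit(design_params, coeffs[k])   # kriging over the design space
        models.append(gp)
    return mean, modes, models

def predict_mean_flow(mean, modes, models, new_design):
    """Reconstruct the mean flowfield at an unseen design point."""
    c = np.array([gp.predict(np.atleast_2d(new_design))[0] for gp in models])
    return mean[:, 0] + modes @ c
```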
An extension of Proper Orthogonal Decomposition is applied to the wall layer of a turbulent channel flow ($Re_\tau = 590$), so that empirical eigenfunctions are defined in both space and time. Due to the statistical symmetries of the flow, the eigenfunctions are associated with individual wavenumbers and frequencies. Self-similarity of the dominant eigenfunctions, consistent with wall-attached structures transferring energy into the core region, is established. The most energetic modes are characterized by a fundamental time scale in the range 200-300 viscous wall units. The full spatio-temporal decomposition provides a natural measure of the convection velocity of structures, with a characteristic value of $12 u_\tau$ in the wall layer. Finally, we show that the energy budget can be split into specific contributions from each mode, which provides a closed-form expression for nonlinear effects.
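For readers unfamiliar with space-time decompositions of this kind, the sketch below illustrates one common variant, a frequency-resolved (spectral) POD computed per frequency from windowed snapshot blocks; the authors' exact formulation may differ, and the window length, overlap, and variable names are illustrative:

```python
import numpy as np

def spectral_pod(q, n_fft=256, overlap=128):
    """q: (n_time, n_points) snapshot series at a constant time step.
    Returns eigenvalues (n_freq, n_blocks) and modes (n_freq, n_points, n_blocks)."""
    n_time, n_points = q.shape
    step = n_fft - overlap
    starts = range(0, n_time - n_fft + 1, step)
    window = np.hanning(n_fft)
    # Fourier transform of each windowed block: (n_blocks, n_freq, n_points)
    blocks = np.array([np.fft.rfft(window[:, None] * q[s:s + n_fft], axis=0)
                       for s in starts])
    n_blocks = blocks.shape[0]
    eigvals, modes = [], []
    for f in range(blocks.shape[1]):
        Qf = blocks[:, f, :].T / np.sqrt(n_blocks)     # (n_points, n_blocks)
        # Method of snapshots: eigendecomposition of a small n_blocks x n_blocks matrix
        M = Qf.conj().T @ Qf
        lam, theta = np.linalg.eigh(M)
        order = np.argsort(lam)[::-1]
        lam, theta = lam[order], theta[:, order]
        phi = Qf @ theta / np.sqrt(np.maximum(lam, 1e-30))
        eigvals.append(lam)
        modes.append(phi)
    return np.array(eigvals), np.array(modes)
```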
This interdisciplinary study, which combines machine learning, statistical methodologies, high-fidelity simulations, and flow physics, demonstrates a new process for building an efficient surrogate model for predicting spatiotemporally evolving flow dynamics. In our previous work, a common-grid proper-orthogonal-decomposition (CPOD) technique was developed to establish a physics-based surrogate (emulation) model for prediction of mean flowfields and design exploration over a wide parameter space. The CPOD technique is substantially improved upon here using a kernel-smoothed POD (KSPOD) technique, which leverages kriging-based weighting functions derived from the design matrix. The resultant emulation model is then trained using a dataset obtained through high-fidelity simulations. As an example, the flow evolution in a swirl injector is considered for a wide range of design parameters and operating conditions. The KSPOD-based emulation model performs well, and can faithfully capture the spatiotemporal flow dynamics. The model enables effective design surveys utilizing high-fidelity simulation data, achieving a turnaround time for evaluating new design points that is 42,000 times faster than the original simulation.
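The weighting idea can be illustrated with a short sketch, assuming a Gaussian correlation and simple-kriging-style weights over the design matrix; the actual KSPOD construction in the paper is more elaborate, and the length scale and array layouts below are assumptions:

```python
import numpy as np

def kriging_weights(X_train, x_new, length_scale=1.0, nugget=1e-8):
    """Simple-kriging-style weights w = R^{-1} r for a Gaussian correlation."""
    d2 = ((X_train[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    R = np.exp(-d2 / (2 * length_scale ** 2)) + nugget * np.eye(len(X_train))
    r = np.exp(-((X_train - x_new) ** 2).sum(-1) / (2 * length_scale ** 2))
    w = np.linalg.solve(R, r)
    return w / w.sum()                      # normalize so the weights sum to one

def predict_flow(X_train, snapshot_sets, x_new):
    """snapshot_sets: (n_cases, n_time, n_points) time-resolved training data.
    Returns a (n_time, n_points) kernel-smoothed prediction at the new design."""
    w = kriging_weights(X_train, x_new)
    return np.tensordot(w, snapshot_sets, axes=(0, 0))
```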
Applying security as a lifecycle practice is becoming increasingly important to combat targeted attacks on safety-critical systems. Among others, there are two significant challenges in this area: (1) the need for models that can characterize a realistic system in the absence of an implementation, and (2) an automated way to associate attack vector information, that is, historical data, with such system models. We propose the cybersecurity body of knowledge (CYBOK), which takes in sufficiently characteristic models of systems and acts as a search engine for potential attack vectors. CYBOK is fundamentally an algorithmic approach to vulnerability exploration, which is a significant extension to the body of knowledge it builds upon. By using CYBOK, security analysts and system designers can work together to assess the overall security posture of systems early in their lifecycle, during major design decisions and before final product designs. This assists in applying security earlier and throughout the system lifecycle.
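As a purely hypothetical illustration of the search-engine idea, the snippet below scores system-model elements against a small, made-up table of attack-vector keywords; it is not CYBOK's algorithm, data, or scoring scheme:

```python
from collections import Counter

# Illustrative attack-vector entries (not real CYBOK data).
ATTACK_VECTORS = {
    "CAN bus message injection": {"can", "bus", "message", "injection"},
    "Firmware tampering via update channel": {"firmware", "update", "tampering"},
    "GPS spoofing": {"gps", "navigation", "spoofing"},
}

def search_attack_vectors(component_keywords):
    """Rank attack vectors by keyword overlap with a model element's tags."""
    scores = Counter()
    for name, keywords in ATTACK_VECTORS.items():
        scores[name] = len(keywords & set(component_keywords))
    return [name for name, score in scores.most_common() if score > 0]

# Example: a model element tagged with its interfaces and functions.
print(search_attack_vectors({"gps", "receiver", "navigation"}))
```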
The proper orthogonal decomposition (POD) is a powerful classical tool in fluid mechanics used, for instance, for model reduction and extraction of coherent flow features. However, its applicability to high-resolution data, as produced by three-dimensional direct numerical simulations, is limited owing to its computational complexity. Here, we propose a wavelet-based adaptive version of the POD (the wPOD) to overcome this limitation. The amount of data to be analyzed is reduced by compressing it with biorthogonal wavelets, which yields a sparse representation while conveniently providing control of the compression error. Numerical analysis shows how the distinct error contributions of wavelet compression and POD truncation can be balanced under certain assumptions, allowing us to efficiently process high-resolution data from three-dimensional simulations of flow problems. Using a synthetic academic test case, we compare our algorithm with the randomized singular value decomposition. Furthermore, we demonstrate the ability of our method by analyzing data of a 2D wake flow and a 3D flow generated by a flapping insect, both computed with direct numerical simulation.
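A minimal sketch of the compress-then-decompose idea, assuming PyWavelets for the biorthogonal transform and a hard threshold chosen by quantile; the wPOD itself works directly on the sparse, adaptive wavelet representation with explicit error control, so this is only a rough illustration:

```python
import numpy as np
import pywt

def compress_snapshot(field, wavelet="bior4.4", level=3, keep=0.05):
    """Keep only the largest `keep` fraction of wavelet coefficients of a 2D field."""
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)
    arr_sparse = pywt.threshold(arr, thresh, mode="hard")
    coeffs_sparse = pywt.array_to_coeffs(arr_sparse, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs_sparse, wavelet)

def wavelet_pod(snapshots, n_modes=10, **kw):
    """snapshots: list of same-shape 2D fields; POD of the wavelet-compressed data."""
    compressed = np.stack([compress_snapshot(s, **kw).ravel() for s in snapshots],
                          axis=1)
    centered = compressed - compressed.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(centered, full_matrices=False)
    return U[:, :n_modes], s[:n_modes]
```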
Data-driven design of mechanical metamaterials is an increasingly popular method to combat costly physical simulations and immense, often intractable, geometrical design spaces. Using a precomputed dataset of unit cells, a multiscale structure can be quickly filled via combinatorial search algorithms, and machine learning models can be trained to accelerate the process. However, the dependence on data induces a unique challenge: an imbalanced dataset containing more of certain shapes or physical properties can be detrimental to the efficacy of data-driven approaches. In response, we posit that a smaller yet diverse set of unit cells leads to scalable search and unbiased learning. To select such subsets, we propose METASET, a methodology that 1) uses similarity metrics and positive semi-definite kernels to jointly measure the closeness of unit cells in both shape and property spaces, and 2) incorporates Determinantal Point Processes for efficient subset selection. Moreover, METASET allows the trade-off between shape and property diversity to be tuned so that subsets suit various applications. Through the design of 2D metamaterials with target displacement profiles, we demonstrate that smaller, diverse subsets can indeed improve the search process as well as structural performance. By eliminating inherent overlaps in a dataset of 3D unit cells created with symmetry rules, we also illustrate that our flexible method can distill unique subsets regardless of the metric employed. Our diverse subsets are provided publicly for use by any designer.
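A compact sketch of the diversity-selection idea, assuming RBF kernels on illustrative shape and property feature vectors and a greedy maximum-a-posteriori heuristic in place of the paper's full DPP machinery; the blending weight and features are assumptions for illustration only:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Similarity kernel on feature vectors X of shape (n_items, n_features)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_dpp_selection(L, k):
    """Greedily pick k items approximately maximizing det(L[S, S])."""
    selected = []
    for _ in range(k):
        best_item, best_logdet = None, -np.inf
        for i in range(L.shape[0]):
            if i in selected:
                continue
            idx = selected + [i]
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if sign > 0 and logdet > best_logdet:
                best_item, best_logdet = i, logdet
        if best_item is None:
            break
        selected.append(best_item)
    return selected

def diverse_subset(shape_feats, prop_feats, k, weight=0.5):
    """Blend shape and property kernels to trade off the two notions of diversity."""
    L = weight * rbf_kernel(shape_feats) + (1 - weight) * rbf_kernel(prop_feats)
    return greedy_dpp_selection(L, k)
```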