
Mini-data-driven Deep Arbitrary Polynomial Chaos Expansion for Uncertainty Quantification

Added by Xiaohu Zheng
Publication date: 2021
Language: English





Surrogate model-based uncertainty quantification methods have drawn a lot of attention in recent years. Both polynomial chaos expansion (PCE) and deep learning (DL) are powerful methods for building a surrogate model. However, PCE must increase its expansion order to improve the accuracy of the surrogate model, which requires more labeled data to solve for the expansion coefficients, and DL likewise needs a large amount of labeled data to train the neural network model. This paper proposes a deep arbitrary polynomial chaos expansion (Deep aPCE) method to improve the balance between surrogate model accuracy and training data cost. On the one hand, a multilayer perceptron (MLP) model is used to solve for the adaptive expansion coefficients of the arbitrary polynomial chaos expansion, which improves the accuracy of the Deep aPCE model at a lower expansion order. On the other hand, the properties of the adaptive arbitrary polynomial chaos expansion are used to construct the MLP training cost function from only a small amount of labeled data and a large amount of unlabeled data, which significantly reduces the training data cost. Four numerical examples and a real engineering problem are used to verify the effectiveness of the Deep aPCE method.
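The "arbitrary" part of aPCE means the orthonormal polynomial basis is built directly from data moments rather than from a named distribution. The sketch below illustrates that standard basis construction in one dimension via Gram-Schmidt under the empirical inner product; the paper's MLP coefficient model and semi-supervised cost function are not reproduced here, and the function and variable names are our own.

```python
import numpy as np

def apce_basis(samples, order):
    """Orthonormal 1-D polynomials w.r.t. the empirical measure of
    `samples`, built by Gram-Schmidt on the monomials 1, x, ..., x^order.
    Returns the basis evaluated at the sample points (one column each)."""
    n = len(samples)
    V = np.vander(samples, order + 1, increasing=True).astype(float)
    inner = lambda a, b: a @ b / n            # empirical inner product
    basis = []
    for j in range(order + 1):
        v = V[:, j].copy()
        for q in basis:                       # remove projections on earlier basis vectors
            v -= inner(v, q) * q
        basis.append(v / np.sqrt(inner(v, v)))
    return np.column_stack(basis)

# sanity check: the basis is orthonormal under the empirical measure
x = np.random.default_rng(0).normal(size=2000)
Phi = apce_basis(x, 3)
G = Phi.T @ Phi / len(x)   # should be close to the identity matrix
```

In Deep aPCE, the coefficients multiplying such a basis are produced by an MLP instead of being fixed scalars; that part is omitted above.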



Related research

Zhanlin Liu, Youngjun Choe (2021)
Polynomial chaos expansions (PCEs) have been used in many real-world engineering applications to quantify how the uncertainty of an output is propagated from inputs. PCEs for models with independent inputs have been extensively explored in the literature. Recently, different approaches have been proposed for models with dependent inputs to expand the use of PCEs to more real-world applications. Typical approaches include building PCEs based on the Gram-Schmidt algorithm or transforming the dependent inputs into independent inputs. However, the two approaches have their limitations regarding computational efficiency and additional assumptions about the input distributions, respectively. In this paper, we propose a data-driven approach to build sparse PCEs for models with dependent inputs. The proposed algorithm recursively constructs orthonormal polynomials from a set of monomials based on their correlations with the output. Building sparse PCEs in this way not only reduces the minimum number of required observations but also improves numerical stability and computational efficiency. Four numerical examples are implemented to validate the proposed algorithm.
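The correlation-based selection step can be sketched as follows. This is a generic illustration of ranking candidate monomials by their absolute correlation with the output; the paper's recursive orthonormalization and stopping rules are omitted, and all names here are our own.

```python
import itertools
import numpy as np

def rank_monomials(X, y, max_degree):
    """Rank candidate multivariate monomials (given by exponent tuples)
    by the absolute Pearson correlation of their values with the output."""
    d = X.shape[1]
    exps = [e for e in itertools.product(range(max_degree + 1), repeat=d)
            if 0 < sum(e) <= max_degree]
    scores = []
    for e in exps:
        m = np.prod(X ** np.array(e), axis=1)       # monomial x1^e1 * ... * xd^ed
        c = abs(np.corrcoef(m, y)[0, 1])
        scores.append((c, e))
    scores.sort(reverse=True)                        # most correlated first
    return scores

# toy check: for y ~ x1^2, the top-ranked monomial should be x1^2, i.e. (2, 0)
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2))
y = X[:, 0] ** 2 + 0.1 * rng.normal(size=500)
ranked = rank_monomials(X, y, 3)
```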
Zhanlin Liu, Youngjun Choe (2018)
Uncertainties exist in both physics-based and data-driven models. Variance-based sensitivity analysis characterizes how the variance of a model output is propagated from the model inputs. The Sobol index is one of the most widely used sensitivity indices for models with independent inputs. For models with dependent inputs, different approaches have been explored to obtain sensitivity indices in the literature. Typical approaches are based on procedures of transforming the dependent inputs into independent inputs. However, such transformation requires additional information about the inputs, such as the dependency structure or the conditional probability density functions. In this paper, data-driven sensitivity indices are proposed for models with dependent inputs. We first construct ordered partitions of linearly independent polynomials of the inputs. The modified Gram-Schmidt algorithm is then applied to the ordered partitions to generate orthogonal polynomials with respect to the empirical measure based on observed data of model inputs and outputs. Using the polynomial chaos expansion with the orthogonal polynomials, we obtain the proposed data-driven sensitivity indices. The sensitivity indices provide intuitive interpretations of how the dependent inputs affect the variance of the output without a priori knowledge on the dependence structure of the inputs. Three numerical examples are used to validate the proposed approach.
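For contrast with the dependent-input setting this paper addresses, a minimal Monte Carlo (pick-freeze) estimator of the classical first-order Sobol index for independent inputs looks like the following. This is a textbook baseline, not the paper's data-driven indices, and the names are illustrative.

```python
import numpy as np

def sobol_first_order(f, d, i, n=200_000, seed=0):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol index
    S_i for a model f with d independent standard-normal inputs."""
    rng = np.random.default_rng(seed)
    A = rng.normal(size=(n, d))
    B = rng.normal(size=(n, d))
    C = B.copy()
    C[:, i] = A[:, i]                    # freeze coordinate i, resample the rest
    yA, yB, yC = f(A), f(B), f(C)
    num = np.mean(yA * yC) - np.mean(yA) * np.mean(yB)
    return num / np.var(yA)

# linear test model y = 2*x1 + x2: exact indices are S_1 = 4/5, S_2 = 1/5
f = lambda X: 2 * X[:, 0] + X[:, 1]
s1 = sobol_first_order(f, d=2, i=0)
```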
Equation learning aims to infer differential equation models from data. While a number of studies have shown that differential equation models can be successfully identified when the data are sufficiently detailed and corrupted with relatively small amounts of noise, the relationship between observation noise and uncertainty in the learned differential equation models remains unexplored. We demonstrate that for noisy data sets there exists great variation in both the structure of the learned differential equation models as well as the parameter values. We explore how to combine data sets to quantify uncertainty in the learned models, and at the same time draw mechanistic conclusions about the target differential equations. We generate noisy data using a stochastic agent-based model and combine equation learning methods with approximate Bayesian computation (ABC) to show that the correct differential equation model can be successfully learned from data, while a quantification of uncertainty is given by a posterior distribution in parameter space.
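The approximate Bayesian computation side of such a pipeline can be illustrated with plain rejection sampling. The toy model below (a line with an unknown slope) stands in for the agent-based simulations and the equation-learning step; all names and the acceptance threshold are illustrative.

```python
import numpy as np

def abc_rejection(data, x, simulate, prior_draw, eps, n_draws=5000, seed=0):
    """Plain ABC rejection sampling: keep parameter draws whose simulated
    output lies within eps (sup-norm) of the observed data."""
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        if np.max(np.abs(simulate(theta, x) - data)) < eps:
            accepted.append(theta)
    return np.array(accepted)

# toy "model": slope of a line, true value 1.0, uniform prior on [0, 2]
x = np.linspace(0.0, 1.0, 20)
data = 1.0 * x
post = abc_rejection(data, x,
                     simulate=lambda t, xs: t * xs,
                     prior_draw=lambda r: r.uniform(0.0, 2.0),
                     eps=0.1)
```

The accepted draws approximate a posterior concentrated near the true slope; their spread is the uncertainty quantification the abstract refers to.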
In the political decision process and control of COVID-19 (and other epidemic diseases), mathematical models play an important role. It is crucial to understand and quantify the uncertainty in models and their predictions in order to take the right decisions and trustfully communicate results and limitations. We propose to do uncertainty quantification in SIR-type models using the efficient framework of generalized polynomial chaos. Through two case studies based on Danish data for the spread of COVID-19, we demonstrate the applicability of the technique. The test cases concern peak-time estimation and superspreading, and illustrate how very few model evaluations can provide insightful statistics.
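The "very few model evaluations" point can be seen with Gaussian quadrature, the workhorse behind non-intrusive generalized polynomial chaos. Below, a ten-node Gauss-Hermite rule recovers the moments of a simple lognormal map standing in for an expensive SIR quantity of interest (the SIR solver itself is not reproduced; the function `qoi` is our own stand-in).

```python
import numpy as np

# 10 Gauss-Hermite (probabilists') nodes for an uncertain parameter xi ~ N(0, 1)
nodes, weights = np.polynomial.hermite_e.hermegauss(10)
weights = weights / weights.sum()     # normalize to the Gaussian probability measure

def qoi(xi):
    # stand-in for an expensive model output (e.g. an epidemic peak quantity);
    # chosen lognormal so the exact moments are known in closed form
    return np.exp(0.5 * xi)

vals = qoi(nodes)                                # only 10 model evaluations
mean = np.sum(weights * vals)                    # exact value: exp(1/8)
var = np.sum(weights * vals ** 2) - mean ** 2    # exact value: exp(1/2) - exp(1/4)
```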
We introduce PoCET: a free and open-source Polynomial Chaos Expansion Toolbox for Matlab, featuring the automatic generation of polynomial chaos expansions (PCEs) for linear and nonlinear dynamic systems with time-invariant stochastic parameters or initial conditions, as well as several simulation tools. It offers built-in handling of Gaussian, uniform, and beta probability density functions, projection- and collocation-based calculation of PCE coefficients, and the calculation of stochastic moments from a PCE. Efficient algorithms for the involved integrals have been designed to increase its applicability. PoCET comes with a variety of introductory and instructive examples. Throughout the paper we show how to perform a polynomial chaos expansion on a simple ordinary differential equation using PoCET, as well as how it can be used to solve the more complex task of optimal experimental design.
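PoCET itself is a Matlab toolbox, but the collocation-based coefficient calculation it automates can be sketched in a few lines of Python: evaluate the model at collocation points and fit the basis coefficients by least squares. This uses a Legendre basis for a uniform input and none of the PoCET API.

```python
import numpy as np

# collocation points for a uniform input on [-1, 1]
x = np.linspace(-1.0, 1.0, 50)
y = x ** 2                                    # model response at the points
V = np.polynomial.legendre.legvander(x, 2)    # Legendre basis P0, P1, P2 at x
c, *_ = np.linalg.lstsq(V, y, rcond=None)
# the exact expansion is x^2 = (1/3) P0 + (2/3) P2, so c ~ [1/3, 0, 2/3]
```

From such coefficients, stochastic moments follow directly: the mean is the P0 coefficient and the variance is a weighted sum of squares of the rest.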
