
Bayesian parameter estimation of misspecified models

Publication date: 2018
Field: Physics
Language: English





Fitting a simplifying model with several parameters to real data of complex objects is a highly nontrivial task, but it enables insight into the objects' physics. Here, we present a method to infer the parameters of the model, the model error, and the statistics of the model error. The method relies on the simultaneous analysis of many data sets in order to overcome the problems caused by the degeneracy between model parameters and model error. Errors in the modeling of the measurement instrument can be absorbed into the model error, allowing for applications with complex instruments.
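The simultaneous-analysis idea can be illustrated with a minimal sketch (a toy construction, not the paper's actual model): several hypothetical data sets are generated from a "complex" truth, a simplified linear model is fitted to all of them at once, and a single shared model-error scale is inferred jointly with the per-object parameters by maximum likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical toy setup: the "true" objects follow y = a*x + 0.3*sin(5x),
# but we fit the simplified model y = a*x.  The sin term plays the role of
# model error; the measurement noise level sigma_n is assumed known.
sigma_n = 0.05
x = np.linspace(0.0, 1.0, 40)
datasets = []
for a_true in [0.8, 1.0, 1.2, 1.5]:
    y = a_true * x + 0.3 * np.sin(5 * x) + rng.normal(0.0, sigma_n, x.size)
    datasets.append(y)

def neg_log_like(theta):
    """Joint negative log-likelihood over all data sets.

    theta = (a_1..a_K, log sigma_m): one slope per data set plus a single
    shared model-error scale sigma_m.  Fitting many data sets at once is
    what breaks the degeneracy between the slopes and the model error.
    """
    slopes, log_sm = theta[:-1], theta[-1]
    var = sigma_n ** 2 + np.exp(2.0 * log_sm)   # noise + model-error variance
    nll = 0.0
    for a, y in zip(slopes, datasets):
        r = y - a * x
        nll += 0.5 * np.sum(r ** 2 / var + np.log(2.0 * np.pi * var))
    return nll

theta0 = np.array([1.0] * len(datasets) + [np.log(0.1)])
res = minimize(neg_log_like, theta0, method="Nelder-Mead",
               options={"maxiter": 5000, "xatol": 1e-9, "fatol": 1e-9})
slopes_hat, sigma_m_hat = res.x[:-1], float(np.exp(res.x[-1]))
print("fitted slopes:", np.round(slopes_hat, 2))
print("inferred model-error scale:", round(sigma_m_hat, 3))
```

The inferred model-error scale comes out well above the known noise level, reflecting the unmodeled sinusoidal term that the simplified linear model cannot absorb.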



Related research

Using the latest numerical simulations of rotating stellar core collapse, we present a Bayesian framework to extract the physical information encoded in noisy gravitational wave signals. We fit Bayesian principal component regression models with known and unknown signal arrival times to reconstruct gravitational wave signals, and subsequently fit known astrophysical parameters on the posterior means of the principal component coefficients using a linear model. We predict the ratio of rotational kinetic energy to gravitational energy of the inner core at bounce by sampling from the posterior predictive distribution, and find that these predictions are generally very close to the true parameter values, with $90\%$ credible intervals $\sim 0.04$ and $\sim 0.06$ wide for the known and unknown arrival time models respectively. Two supervised machine learning methods are implemented to classify precollapse differential rotation, and we find that these methods discriminate rapidly rotating progenitors particularly well. We also introduce a constrained optimization approach to model selection to find an optimal number of principal components in the signal reconstruction step. Using this approach, we select 14 principal components as the most parsimonious model.
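The reconstruction-plus-regression pipeline can be sketched in miniature. Everything below is a stand-in: damped sinusoids replace the simulated core-collapse waveforms, a toy parameter `beta` replaces the rotational-to-gravitational energy ratio, and ordinary least squares replaces the Bayesian linear model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training catalogue: damped sinusoids whose frequency tracks
# a physical parameter beta (a stand-in for the energy ratio of the paper).
t = np.linspace(0.0, 1.0, 256)
betas = np.linspace(0.02, 0.18, 60)
X = np.array([np.exp(-3 * t) * np.sin(2 * np.pi * (20 + 30 * b) * t)
              for b in betas])

# Principal components of the centred catalogue via SVD.
mean = X.mean(axis=0)
U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
k = 14                          # number of PCs, as selected in the paper
pcs = Vt[:k]
frac = float(np.sum(s[:k] ** 2) / np.sum(s ** 2))
print(f"variance captured by {k} PCs: {frac:.4f}")

# Regress beta on the PC coefficients (ordinary least squares), then
# predict beta for a noisy test signal from its projected coefficients.
coeffs = (X - mean) @ pcs.T
design = np.column_stack([np.ones(len(betas)), coeffs])
w, *_ = np.linalg.lstsq(design, betas, rcond=None)

noisy = X[30] + rng.normal(0.0, 0.01, t.size)   # noisy "observed" signal
beta_hat = float(w[0] + ((noisy - mean) @ pcs.T) @ w[1:])
print("true beta:", round(betas[30], 3), " predicted:", round(beta_hat, 3))
```

The projection onto a small number of principal components both compresses the catalogue and denoises the test signal before the parameter regression.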
LISA is the upcoming space-based gravitational wave telescope. LISA Pathfinder, to be launched in the coming years, will prove and verify the detection principle of the fundamental Doppler link of LISA on flight hardware identical in design to that of LISA. LISA Pathfinder will collect a picture of all noise disturbances possibly affecting LISA, achieving the unprecedented pureness of geodesic motion necessary for the detection of gravitational waves. The first steps of both missions will crucially depend on a very precise calibration of the key system parameters. Moreover, robust parameter estimation is of fundamental importance for the correct assessment of the residual force noise, an essential part of the data processing for LISA. In this paper we present a maximum likelihood parameter estimation technique in the time domain devised for this calibration, show its proficiency on simulated data, and validate it through Monte Carlo realizations of independent noise runs. We discuss its robustness to non-standard scenarios possibly arising during the real-life mission, as well as its independence of the initial guess and of non-Gaussianities. Furthermore, we apply the same technique to data produced in mission-like fashion during operational exercises with a realistic simulator provided by ESA.
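A minimal sketch of time-domain maximum likelihood estimation validated over Monte Carlo noise runs, assuming a toy one-parameter linear response rather than the actual LISA Pathfinder dynamics:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(4)

# Hypothetical calibration: a single stiffness-like parameter omega scales
# the system response to a known injected signal.  We estimate it by
# time-domain maximum likelihood, then validate over Monte Carlo noise runs.
t = np.linspace(0.0, 10.0, 500)
injection = np.sin(0.5 * t)
omega_true, sigma = 1.3, 0.1

def response(omega):
    return omega * injection        # toy linear response model

estimates = []
for _ in range(200):                # independent noise realisations
    y = response(omega_true) + rng.normal(0.0, sigma, t.size)
    # For Gaussian white noise the ML estimate minimises the time-domain
    # sum of squared residuals.
    res = minimize_scalar(lambda w: np.sum((y - response(w)) ** 2),
                          bounds=(0.5, 2.5), method="bounded")
    estimates.append(res.x)

estimates = np.array(estimates)
print("mean estimate:", round(estimates.mean(), 4),
      " Monte Carlo scatter:", round(estimates.std(), 4))
```

The Monte Carlo scatter of the estimates gives an empirical check of the estimator's bias and variance, mirroring the validation strategy described in the abstract.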
Satoru Tokuda, Kenji Nagata, 2016
The heuristic identification of peaks from noisy complex spectra often leads to misunderstanding of the physical and chemical properties of matter. In this paper, we propose a framework based on Bayesian inference, which enables us to separate multipeak spectra into single peaks statistically and consists of two steps. The first step is estimating both the noise variance and the number of peaks as hyperparameters based on Bayes free energy, which generally is not analytically tractable. The second step is fitting the parameters of each peak function to the given spectrum by calculating the posterior density, which has a problem of local minima and saddles since multipeak models are nonlinear and hierarchical. Our framework enables the escape from local minima or saddles by using the exchange Monte Carlo method and calculates Bayes free energy via the multiple histogram method. We discuss a simulation demonstrating how efficient our framework is and show that estimating both the noise variance and the number of peaks prevents overfitting, overpenalizing, and misunderstanding the precision of parameter estimation.
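A drastically simplified sketch of the model-selection step: here the Bayesian Information Criterion stands in for the (generally intractable) Bayes free energy, and a deterministic least-squares fit stands in for the exchange Monte Carlo sampling; the two-peak spectrum is hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(-5, 5, 200)

def peaks(x, *p):
    """Sum of Gaussian peaks; p = (A1, mu1, s1, A2, mu2, s2, ...)."""
    y = np.zeros_like(x)
    for A, mu, s in zip(p[0::3], p[1::3], p[2::3]):
        y += A * np.exp(-0.5 * ((x - mu) / s) ** 2)
    return y

# Hypothetical two-peak spectrum with Gaussian noise of known scale.
sigma = 0.05
y = peaks(x, 1.0, -1.0, 0.6, 0.7, 1.5, 0.8) + rng.normal(0.0, sigma, x.size)

def bic(k_peaks, p0):
    """BIC as a cheap stand-in for the Bayes free energy of the paper."""
    popt, _ = curve_fit(peaks, x, y, p0=p0, maxfev=20000)
    rss = np.sum((y - peaks(x, *popt)) ** 2)
    n, k = x.size, 3 * k_peaks
    return n * np.log(rss / n) + k * np.log(n)

bic1 = bic(1, [1.0, 0.0, 1.0])
bic2 = bic(2, [1.0, -1.0, 0.5, 0.5, 1.5, 0.5])
print("BIC, 1 peak:", round(bic1, 1), " BIC, 2 peaks:", round(bic2, 1))
```

The lower BIC selects the two-peak model, illustrating how a free-energy-like criterion balances fit quality against the number of peaks; the paper's exchange Monte Carlo machinery additionally handles the local minima that a single least-squares fit can fall into.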
We present a Bayesian estimation analysis for a particular trace gas detection technique with species separation provided by differential diffusion. The proposed method collects a sample containing multiple gas species into a common volume, and then allows it to diffuse across a linear array of optical absorption detectors, using, for example, high-finesse Fabry-Perot cavities. The estimation procedure assumes that all gas parameters (e.g. diffusion constants, optical cross sections) are known except for the number population of each species, which are determined from the time-of-flight absorption profiles in each detector.
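Because the absorption signal is linear in the unknown populations, the posterior under Gaussian noise and a Gaussian prior is itself Gaussian. A minimal sketch with invented diffusion constants and cross sections (not the paper's actual values):

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical: two gas species diffuse to a detector at distance L; the
# absorption signal there is linear in the (unknown) number populations.
L = 1.0
t = np.linspace(0.1, 10.0, 100)

def profile(D):
    """1-D free-diffusion concentration at distance L, diffusion constant D."""
    return np.exp(-L ** 2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

# Known diffusion constants and optical cross sections (invented numbers);
# only the populations n are unknown.
A = np.column_stack([0.8 * profile(0.5), 1.2 * profile(2.0)])
n_true = np.array([3.0, 1.5])
sigma = 0.001
y = A @ n_true + rng.normal(0.0, sigma, t.size)

# Gaussian posterior mean with a broad Gaussian prior (scale tau).
tau = 100.0
S_inv = A.T @ A / sigma ** 2 + np.eye(2) / tau ** 2
n_hat = np.linalg.solve(S_inv, A.T @ y / sigma ** 2)
print("posterior mean populations:", np.round(n_hat, 2))
```

The differing time-of-flight profiles of the two species make the design matrix well conditioned, which is what lets a single linear array of detectors separate the populations.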
Scientists use mathematical modelling to understand and predict the properties of complex physical systems. In highly parameterised models there often exist relationships between parameters over which model predictions are identical, or nearly so. These are known as structural or practical unidentifiabilities, respectively. They are hard to diagnose and make reliable parameter estimation from data impossible. They furthermore imply the existence of an underlying model simplification. We describe a scalable method for detecting unidentifiabilities, and the functional relations defining them, for generic models. This allows for model simplification, and appreciation of which parameters (or functions thereof) cannot be estimated from data. Our algorithm can identify features such as redundant mechanisms and fast timescale subsystems, as well as the regimes in which such approximations are valid. We base our algorithm on a novel quantification of regional parametric sensitivity: multiscale sloppiness. Traditionally, the link between parametric sensitivity and the conditioning of the parameter estimation problem is made locally, through the Fisher Information Matrix. This is valid in the regime of infinitesimal measurement uncertainty. We demonstrate the duality between multiscale sloppiness and the geometry of confidence regions surrounding parameter estimates made where measurement uncertainty is non-negligible. Further theoretical relationships are provided linking multiscale sloppiness to the Likelihood-ratio test. From this, we show that a local sensitivity analysis (as typically done) is insufficient for determining the reliability of parameter estimation, even with simple (non)linear systems. Our algorithm provides a tractable alternative. We finally apply our methods to a large-scale, benchmark Systems Biology model of NF-$\kappa$B, uncovering previously unknown unidentifiabilities.
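The local (Fisher Information) end of this analysis can be sketched directly. The two-exponential model below is a classic sloppy example, used here as an illustration rather than the NF-$\kappa$B model of the paper:

```python
import numpy as np

# Hypothetical sloppy model: y(t) = exp(-k1*t) + exp(-k2*t).  For k1 near
# k2 only a combination of the rates is well constrained, so the local
# Fisher Information Matrix has one large and one tiny eigenvalue.
t = np.linspace(0.1, 5.0, 50)
k_true = np.array([1.0, 1.1])
sigma = 0.01                       # assumed measurement noise level

def model(k):
    return np.exp(-k[0] * t) + np.exp(-k[1] * t)

def jacobian(k, eps=1e-6):
    """Central finite-difference Jacobian dy/dk."""
    J = np.empty((t.size, 2))
    for i in range(2):
        dk = np.zeros(2)
        dk[i] = eps
        J[:, i] = (model(k + dk) - model(k - dk)) / (2 * eps)
    return J

J = jacobian(k_true)
fim = J.T @ J / sigma ** 2         # local Fisher Information Matrix
evals = np.linalg.eigvalsh(fim)    # ascending eigenvalues
print("FIM eigenvalues:", evals)
print("eigenvalue ratio (sloppiness):", evals[-1] / evals[0])
```

The orders-of-magnitude eigenvalue spread is the local signature of sloppiness; the paper's point is that this local picture can mislead once measurement uncertainty is non-negligible, which is what multiscale sloppiness addresses.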
