Low-dimensional offshore wave input for extreme event quantification

Submitted by: Kenan Šehić
Publication date: 2020
Research field: Physics
Paper language: English
Author: Kenan Šehić





In offshore engineering design, nonlinear wave models are often used to propagate stochastic waves from an input boundary to the location of an offshore structure. Each wave realization is typically characterized by a high-dimensional input time series, and a reliable determination of the extreme events is associated with substantial computational effort. As the sea depth decreases, extreme events become more difficult to evaluate. Here we construct a low-dimensional characterization of the candidate input time series to circumvent the search for extreme wave events in a high-dimensional input probability space. Each wave input is represented by a unique low-dimensional set of parameters for which standard surrogate approximations, such as Gaussian processes, can estimate the short-term exceedance probability efficiently and accurately. We demonstrate the advantages of the new approach with a simple shallow-water wave model based on the Korteweg-de Vries equation, for which we can provide an accurate reference solution based on the simple Monte Carlo method. We furthermore apply the method to a fully nonlinear wave model for wave propagation over a sloping seabed. The results demonstrate that the Gaussian process can accurately learn the tail of the heavy-tailed distribution of the maximum wave crest elevation based on only $1.7\%$ of the required Monte Carlo evaluations.
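
The workflow can be illustrated with a minimal sketch, not the authors' implementation: each stochastic wave realization is reduced to a few descriptive parameters, a Gaussian-process surrogate is trained on a small set of expensive wave-model runs, and the short-term exceedance probability of the maximum crest elevation is then estimated by cheap sampling of the surrogate. The feature choice, the stand-in wave model, and all numbers below are illustrative assumptions.

# Minimal sketch: Gaussian-process surrogate for short-term exceedance probability.
# The low-dimensional features and the toy wave model are illustrative stand-ins,
# not the paper's KdV or fully nonlinear solvers.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def expensive_wave_model(features):
    """Stand-in for a nonlinear wave propagation run: returns a maximum crest
    elevation given low-dimensional input features (here, a significant-wave-height
    proxy and a spectral-peakedness proxy)."""
    hs, gamma = features
    return 0.6 * hs * (1.0 + 0.3 * gamma) + 0.05 * rng.standard_normal()

def sample_features(n):
    """Draw low-dimensional wave characterizations (illustrative distributions)."""
    hs = rng.uniform(1.0, 4.0, n)        # proxy for significant wave height [m]
    gamma = rng.uniform(1.0, 5.0, n)     # proxy for spectral peakedness
    return np.column_stack([hs, gamma])

# 1) Train the surrogate on a small design of expensive model runs.
X_train = sample_features(40)
y_train = np.array([expensive_wave_model(x) for x in X_train])
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0]),
                              normalize_y=True)
gp.fit(X_train, y_train)

# 2) Estimate the short-term exceedance probability P(crest > threshold)
#    by cheap Monte Carlo sampling of the surrogate instead of the full model.
threshold = 2.8                          # crest elevation of interest [m]
X_mc = sample_features(100_000)
crest_mean = gp.predict(X_mc)
p_exceed = np.mean(crest_mean > threshold)
print(f"estimated exceedance probability: {p_exceed:.3e}")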



Read also

Linyu Lin, Nam Dinh (2020)
In nuclear engineering, modeling and simulations (M&Ss) are widely applied to support risk-informed safety analysis. Since nuclear safety analysis has important implications, a convincing validation process is needed to assess simulation adequacy, i.e., the degree to which M&S tools can adequately represent the system quantities of interest. However, due to data gaps, validation becomes a decision-making process under uncertainties. Expert knowledge and judgments are required to collect, choose, characterize, and integrate evidence toward the final adequacy decision. However, in validation frameworks such as CSAU (Code Scaling, Applicability, and Uncertainty; NUREG/CR-5249) and EMDAP (Evaluation Model Development and Assessment Process; RG 1.203), such a decision-making process is largely implicit and obscure. When scenarios are complex, knowledge biases and unreliable judgments can be overlooked, which could increase uncertainty in the simulation adequacy result and the corresponding risks. Therefore, a framework is required to formalize the decision-making process for simulation adequacy in a practical, transparent, and consistent manner. This paper suggests Predictive Capability Maturity Quantification using Bayesian networks (PCMQBN) as a quantified framework for assessing simulation adequacy based on information collected from validation activities. A case study is prepared for evaluating the adequacy of a Smoothed Particle Hydrodynamics simulation in predicting the hydrodynamic forces on static structures during an external flooding scenario. Compared to the qualitative and implicit adequacy assessment, PCMQBN is able to improve confidence in the simulation adequacy result and to reduce expected loss in the risk-informed safety analysis.
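
As a rough illustration of the quantification idea, and not of PCMQBN itself, the sketch below combines two discrete validation-evidence nodes into a posterior belief about simulation adequacy with a hand-rolled two-node Bayesian network; the node names, priors, and conditional probabilities are invented for the example.

# Minimal sketch (not PCMQBN itself): combining two validation-evidence nodes
# into a posterior belief about simulation adequacy with a tiny Bayesian network.
# Node names, priors, and conditional probabilities are illustrative assumptions.
import numpy as np

# Prior belief that the simulation is adequate for the intended application.
p_adequate = np.array([0.5, 0.5])            # [P(adequate), P(not adequate)]

# Conditional probabilities of observing favorable evidence given adequacy:
# rows index the adequacy state, columns the evidence outcome (favorable, unfavorable).
p_benchmark = np.array([[0.9, 0.1],          # code-to-data benchmark agreement
                        [0.3, 0.7]])
p_scaling = np.array([[0.8, 0.2],            # scaling/applicability assessment
                      [0.4, 0.6]])

# Observed evidence from the validation activities (0 = favorable outcome).
obs_benchmark, obs_scaling = 0, 0

# Posterior by Bayes' rule, assuming the evidence nodes are conditionally
# independent given the adequacy state.
likelihood = p_benchmark[:, obs_benchmark] * p_scaling[:, obs_scaling]
posterior = likelihood * p_adequate
posterior /= posterior.sum()
print(f"P(adequate | evidence) = {posterior[0]:.3f}")
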
Projections of extreme sea levels (ESLs) are critical for managing coastal risks, but are made complicated by deep uncertainties. One key uncertainty is the choice of model structure used to estimate coastal hazards. Differences in model structural choices contribute to uncertainty in estimated coastal hazard, so it is important to characterize how model structural choice affects estimates of ESL. Here, we present a collection of 36 ESL data sets, from tide gauge stations along the United States East and Gulf Coasts. The data are processed using both annual block maxima and peaks-over-thresholds approaches for modeling distributions of extremes. We use these data sets to fit a suite of potentially nonstationary extreme value models by covarying the ESL statistics with multiple climate variables. We demonstrate how this data set enables inquiry into deep uncertainty surrounding coastal hazards. For all of the models and sites considered here, we find that accounting for changes in the frequency of coastal extreme sea levels provides a better fit than using a stationary extreme value model.
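
A minimal sketch of the block-maxima side of such an analysis, using a stationary GEV fit to synthetic annual maxima in place of the tide-gauge records and omitting the nonstationary covariate models, might look as follows.

# Minimal sketch: fitting a stationary GEV distribution to annual block maxima
# of sea level and reading off a return level. Synthetic data stand in for the
# tide-gauge records; the nonstationary, covariate-dependent models in the
# study are not reproduced here.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)

# Synthetic annual maximum sea levels [m] (placeholder for a tide-gauge series).
annual_maxima = genextreme.rvs(c=-0.1, loc=1.2, scale=0.15, size=80, random_state=rng)

# Maximum-likelihood fit of the GEV parameters (scipy's c is the negated shape).
c_hat, loc_hat, scale_hat = genextreme.fit(annual_maxima)

# 100-year return level: the level exceeded with probability 1/100 in any year.
return_level_100 = genextreme.ppf(1 - 1 / 100, c_hat, loc=loc_hat, scale=scale_hat)
print(f"estimated 100-year return level: {return_level_100:.2f} m")
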
Signal estimation in the presence of background noise is a common problem in several scientific disciplines. An On/Off measurement is performed when the background itself is not known, being estimated from a background control sample. The frequentist and Bayesian approaches for signal estimation in On/Off measurements are reviewed and compared, focusing on the weaknesses of the former and on the advantages of the latter in correctly addressing the Poissonian nature of the problem. In this work, we devise a novel reconstruction method, dubbed BASiL (Bayesian Analysis including Single-event Likelihoods), for estimating the signal rate based on the Bayesian formalism. It uses information on event-by-event individual parameters and their distribution for the signal and background population. Events are thereby weighted according to their likelihood of being a signal or a background event, and background suppression can be achieved without performing fixed fiducial cuts. Throughout the work, we maintain a general notation that allows the method to be applied generically, and provide a performance test using real data and simulations of observations with the MAGIC telescopes, as a demonstration of the performance for Cherenkov telescopes. BASiL allows the signal to be estimated more precisely, avoiding loss of exposure due to signal-extraction cuts. We expect its applicability to be straightforward in similar cases.
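
For context, the classical Bayesian treatment of an On/Off measurement, which BASiL extends with event-by-event likelihoods, can be sketched as a simple grid posterior over the signal and background rates; the counts and exposure ratio below are illustrative.

# Minimal sketch: Bayesian signal-rate estimation for an On/Off counting
# measurement (the classical version of the problem that BASiL extends with
# event-by-event likelihoods). Counts and exposure ratio are illustrative.
import numpy as np

n_on, n_off = 130, 300      # counts in the On (signal) and Off (background) regions
alpha = 1.0 / 3.0           # On/Off exposure ratio

# Grid posterior over the expected signal counts s and background counts b
# in the On region, with flat priors and Poisson likelihoods:
#   n_on ~ Poisson(s + b),   n_off ~ Poisson(b / alpha)
s = np.linspace(0, 120, 601)
b = np.linspace(1, 200, 801)
S, B = np.meshgrid(s, b, indexing="ij")

log_post = (n_on * np.log(S + B) - (S + B)
            + n_off * np.log(B / alpha) - B / alpha)
post = np.exp(log_post - log_post.max())

# Marginalize over the background to get the posterior of the signal counts.
post_s = post.sum(axis=1)
post_s /= post_s.sum()
s_mean = float((s * post_s).sum())
print(f"posterior mean signal counts: {s_mean:.1f}")
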
Oscillation probability calculations are becoming increasingly CPU intensive in modern neutrino oscillation analyses. The independence of the reweighting of individual events in a Monte Carlo sample lends itself to parallel implementation on a Graphics Processing Unit. The library Prob3++ was ported to the GPU using the CUDA C API, allowing for large-scale parallelized calculations of neutrino oscillation probabilities through matter of constant density and decreasing the execution time by a factor of 75 compared to performance on a single CPU.
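
The embarrassingly parallel structure that makes the GPU port effective can be illustrated with a vectorized two-flavor vacuum calculation; this is a conceptual stand-in, not the three-flavor matter propagation that Prob3++ actually evaluates.

# Conceptual sketch of why per-event oscillation reweighting parallelizes well:
# every Monte Carlo event is independent, so the probabilities can be computed
# as one vectorized (or GPU) operation. Two-flavor vacuum oscillations are used
# here for brevity; Prob3++ evaluates the full three-flavor matter case.
import numpy as np

rng = np.random.default_rng(2)

# Per-event true neutrino energies [GeV] for a toy Monte Carlo sample.
n_events = 1_000_000
energy = rng.uniform(0.2, 5.0, n_events)

# Oscillation parameters (illustrative values) and a fixed baseline.
sin2_2theta = 0.95
dm2 = 2.5e-3          # eV^2
baseline = 295.0      # km

# P(numu -> numu) survival probability, evaluated for all events at once.
p_survive = 1.0 - sin2_2theta * np.sin(1.267 * dm2 * baseline / energy) ** 2

# Each event's weight is rescaled by its own probability; no event depends on
# any other, which is what makes the GPU implementation effective.
weights = np.ones(n_events) * p_survive
print(f"mean survival probability: {p_survive.mean():.3f}")
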
We provide a method to correct the observed azimuthal anisotropy in heavy-ion collisions for the event plane resolution in a wide centrality bin. This new procedure is especially useful for rare particles, such as Omega baryons and J/psi mesons, which are difficult to measure in small intervals of centrality. Based on a Monte Carlo calculation with simulated v_2 and multiplicity, we show that some of the commonly used methods have a bias of up to 15%.
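
A generic illustration of where such a bias can arise, not the paper's specific prescription, is to compare a single wide-bin correction using an averaged resolution with a sub-bin-by-sub-bin correction; all numbers below are invented.

# Generic illustration (not the paper's method): correcting an observed v2 with
# one averaged event-plane resolution over a wide centrality bin versus
# combining narrow sub-bins. All numbers are invented for the example.
import numpy as np

# Narrow centrality sub-bins: true v2, event-plane resolution, and particle yield.
v2_true = np.array([0.03, 0.06, 0.09, 0.11])
resolution = np.array([0.55, 0.75, 0.70, 0.50])
yields = np.array([4.0, 3.0, 2.0, 1.0]) * 1e4

# Observed (resolution-smeared) v2 in each sub-bin.
v2_obs = v2_true * resolution

# Wide-bin correction: yield-weighted observed v2 divided by the
# yield-weighted average resolution.
v2_wide = np.average(v2_obs, weights=yields) / np.average(resolution, weights=yields)

# Sub-bin-by-sub-bin correction, then yield-weighted combination.
v2_subbin = np.average(v2_obs / resolution, weights=yields)

print(f"wide-bin correction: {v2_wide:.4f}")
print(f"sub-bin correction:  {v2_subbin:.4f}  (reference)")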