
Pixel-Level Statistical Analyses of Prescribed Fire Spread

Submitted by: Bryan Quaife
Publication date: 2017
Research field: Physics
Paper language: English

Wildland fire dynamics is a complex, turbulent, multidimensional process. Cellular automata (CA) are an efficient tool for predicting fire dynamics, but the key parameters of the method are challenging to estimate. To overcome this challenge, we compute statistical distributions of the key parameters of a CA model using infrared images from controlled burns. We then apply this analysis at different spatial scales and compare the experimental results to a simple statistical model. This analysis and comparison reveal several capabilities and limitations of CA.
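To make the approach concrete, here is a minimal sketch of a probabilistic CA fire-spread model in the spirit of the abstract. A lattice of unburned/burning/burned cells is advanced in discrete ticks, and the cell-to-cell spread probability is drawn from a beta distribution standing in for the empirically fitted parameter distributions; the lattice size, neighborhood, and distribution parameters are illustrative assumptions, not the paper's calibrated values.

```python
# Minimal sketch of a probabilistic CA fire-spread model (illustrative only).
import numpy as np

UNBURNED, BURNING, BURNED = 0, 1, 2

def step(grid, rng, spread_a=2.0, spread_b=5.0):
    """Advance the CA one tick; spread probability is sampled per ignition event."""
    new = grid.copy()
    for i, j in np.argwhere(grid == BURNING):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # von Neumann neighbors
            ni, nj = i + di, j + dj
            if (0 <= ni < grid.shape[0] and 0 <= nj < grid.shape[1]
                    and grid[ni, nj] == UNBURNED):
                # Assumed beta distribution standing in for the fitted one.
                if rng.random() < rng.beta(spread_a, spread_b):
                    new[ni, nj] = BURNING
        new[i, j] = BURNED  # a burning cell is consumed after one tick
    return new

rng = np.random.default_rng(0)
grid = np.zeros((64, 64), dtype=int)
grid[32, 32] = BURNING  # single ignition point
for _ in range(50):
    grid = step(grid, rng)
print("burned area:", int((grid == BURNED).sum()))
```

In the paper's setting, the distribution sampled in `step` would instead be the pixel-level statistics extracted from the infrared imagery.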



Read also

Global sensitivity analysis (GSA) of numerical simulators aims at studying the global impact of input uncertainties on the output. To perform GSA, statistical tools based on input/output dependence measures are commonly used. We focus here on dependence measures based on reproducing kernel Hilbert spaces: the Hilbert-Schmidt Independence Criterion, denoted HSIC. Sometimes, the probability distributions modeling the uncertainty of the inputs may themselves be uncertain, and it is important to quantify the global impact of this uncertainty on the GSA results. We call this second-level global sensitivity analysis (GSA2). However, GSA2, when performed with a double Monte Carlo loop, requires a large number of model evaluations, which is intractable for simulators with expensive CPU time. To cope with this limitation, we propose a new statistical methodology based on a single Monte Carlo loop with a limited calculation budget. First, we build a single sample of inputs from a well-chosen probability distribution and compute the associated code outputs. From this input/output sample, we perform GSA for various assumed probability distributions of the inputs by using weighted HSIC estimators, whose statistical properties are demonstrated. We then define second-level HSIC-based measures between the probability distributions of the inputs and the GSA results, which constitute the GSA2 indices. The efficiency of our GSA2 methodology is illustrated on an analytical example, comparing several technical options. Finally, an application to a test case simulating a severe accident scenario in a nuclear reactor is provided.
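As a rough illustration of the dependence measure at the core of this approach, the sketch below implements a biased (V-statistic) HSIC estimator with Gaussian kernels, plus a simple likelihood-ratio reweighting toward an alternative input distribution. The weighting scheme and bandwidths here are generic stand-ins, not the paper's exact weighted estimators.

```python
# Sketch of a biased (V-statistic) HSIC estimator with Gaussian kernels and
# a generic importance-weighted variant (illustrative, not the paper's estimator).
import numpy as np

def gaussian_gram(x, bandwidth):
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def hsic(x, y, bw_x=1.0, bw_y=1.0, weights=None):
    n = len(x)
    w = np.full(n, 1.0 / n) if weights is None else weights / weights.sum()
    K, L = gaussian_gram(x, bw_x), gaussian_gram(y, bw_y)
    # Weighted centering: H = I - 1 w^T generalizes the usual H = I - 11^T/n.
    H = np.eye(n) - np.outer(np.ones(n), w)
    Kc, Lc = H @ K @ H.T, H @ L @ H.T
    return float(np.sum(w[:, None] * w[None, :] * Kc * Lc))

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = x ** 2 + 0.1 * rng.normal(size=200)    # output depends on the input
print(hsic(x, y))                          # noticeably above zero
# Reweight toward an alternative input law, e.g. N(0.5, 1):
w = np.exp(-(x - 0.5) ** 2 / 2) / np.exp(-x ** 2 / 2)
print(hsic(x, y, weights=w))
```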
Assimilation of data into a fire-spread model is formulated as an optimization problem. The level set equation, which relates the fire arrival time and the rate of spread, is allowed to be satisfied only approximately, and we minimize a norm of the residual. Previous methods based on modification of the fire arrival time either used an additive correction to the fire arrival time, or made a position correction. Unlike additive fire arrival time corrections, the new method respects the dependence of the fire rate of spread on diurnal changes of fuel moisture and on weather changes, and, unlike position corrections, it respects the dependence of the fire spread on fuels and terrain as well. The method is used to interpolate the fire arrival time between two perimeters by imposing the fire arrival time at the perimeters as constraints.
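A one-dimensional sketch may help fix ideas: the level set (eikonal) relation R|dT/dx| = 1 links the fire arrival time T to the rate of spread R, and the squared residual is minimized with T pinned at two "perimeters" (here, the endpoints). The grid, the assumed R(x) profile, and the solver choice are illustrative, not the paper's formulation.

```python
# 1-D sketch of residual minimization for the eikonal relation R|dT/dx| = 1,
# with the fire arrival time imposed at both ends (illustrative only).
import numpy as np
from scipy.optimize import minimize

n = 50
x = np.linspace(0.0, 1.0, n)
R = 1.0 + 0.5 * np.sin(2 * np.pi * x)      # assumed spatially varying spread rate
T0, T1 = 0.0, 1.2                          # arrival times imposed at the "perimeters"

def residual_norm(T_inner):
    T = np.concatenate(([T0], T_inner, [T1]))  # constraints enforced by substitution
    dTdx = np.gradient(T, x)
    r = R * np.abs(dTdx) - 1.0                 # eikonal residual
    return np.sum(r ** 2)

T_init = np.linspace(T0, T1, n)[1:-1]          # linear first guess
sol = minimize(residual_norm, T_init, method="L-BFGS-B")
T = np.concatenate(([T0], sol.x, [T1]))
print("residual norm:", residual_norm(sol.x))
```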
Louis Lyons (2016)
Various statistical issues relevant to searches for new physics or to parameter determination in analyses of data in neutrino experiments are briefly discussed.
The transmission of vector-borne infectious diseases, which produces complex spatiotemporal patterns, is analyzed by a periodically forced two-dimensional cellular automata model. The system, which comprises three population levels, is introduced to describe complex features of the dynamics of vector-transmitted dengue epidemics, known to be very sensitive to seasonal variables. The three coupled levels represent the human, adult vector, and immature vector populations. The dynamics includes external seasonal forcing (rainfall intensity data), human and mosquito mobility, and vector control effects. The model parameters, even if bounded to well-defined intervals obtained from reported data, can be selected to reproduce specific epidemic outbursts. In the current study, explicit results are obtained by comparison with actual data retrieved from the time series of dengue epidemics in two cities in Brazil. The results show fluctuations that are not captured by mean-field models and reveal the qualitative behavior of the spatiotemporal patterns of the epidemics. In the extreme situation of an absent external periodic drive, the model predicts a completely distinct long-time evolution. The model is robust in the sense that it is able to reproduce the time series of dengue epidemics of different cities, provided the forcing term takes into account the local rainfall modulation. Finally, the dependence between the epidemic threshold and vector control undergoes a transition from power-law to stretched-exponential behavior due to the human mobility effect.
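The sketch below conveys the flavor of such a model in a deliberately reduced form: human cells in S/I/R states on a lattice, with the local infection pressure modulated by a periodic "rainfall" factor standing in for the vector dynamics. The forcing form and all rates are illustrative assumptions; the paper's model additionally tracks the adult and immature vector populations, mobility, and control explicitly.

```python
# Toy sketch of a seasonally forced epidemic CA (illustrative only).
import numpy as np

S, I, R = 0, 1, 2
rng = np.random.default_rng(2)
grid = np.zeros((50, 50), dtype=int)
grid[25, 25] = I  # index case

def season(t, period=52, amplitude=0.6):
    return 1.0 + amplitude * np.sin(2 * np.pi * t / period)  # rainfall proxy

for t in range(104):  # two "years" of weekly steps
    infected = (grid == I)
    # Local infection pressure: number of infected von Neumann neighbors.
    pressure = (np.roll(infected, 1, 0) + np.roll(infected, -1, 0) +
                np.roll(infected, 1, 1) + np.roll(infected, -1, 1))
    p_inf = 1.0 - (1.0 - 0.05 * season(t)) ** pressure
    new_inf = (grid == S) & (rng.random(grid.shape) < p_inf)
    recover = infected & (rng.random(grid.shape) < 0.2)
    grid[new_inf] = I
    grid[recover] = R
print("attack rate:", (grid != S).mean())
```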
There has recently been considerable interest in addressing the problem of unifying distributed statistical analyses into a single coherent inference. This problem naturally arises in a number of situations, including in big-data settings, when working under privacy constraints, and in Bayesian model choice. The majority of existing approaches have relied upon convenient approximations of the distributed analyses. Although typically being computationally efficient, and readily scaling with respect to the number of analyses being unified, approximate approaches can have significant shortcomings -- the quality of the inference can degrade rapidly with the number of analyses being unified, and can be substantially biased even when unifying a small number of analyses that do not concur. In contrast, the recent Fusion approach of Dai et al. (2019) is a rejection sampling scheme which is readily parallelisable and is exact (avoiding any form of approximation other than Monte Carlo error), albeit limited in applicability to unifying a small number of low-dimensional analyses. In this paper we introduce a practical Bayesian Fusion approach. We extend the theory underpinning the Fusion methodology and, by embedding it within a sequential Monte Carlo algorithm, we are able to recover the correct target distribution. By means of extensive guidance on the implementation of the approach, we demonstrate theoretically and empirically that Bayesian Fusion is robust to increasing numbers of analyses, and coherently unifying analyses which do not concur. This is achieved while being computationally competitive with approximate schemes.
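For contrast with the exact approach, the sketch below shows the simplest kind of "convenient approximation" the paragraph alludes to: fit a Gaussian to each distributed analysis and take the product of densities, so precisions add and means are precision-weighted. This illustrates approximate unification and its potential for bias when analyses do not concur; it is not the Bayesian Fusion algorithm itself.

```python
# Gaussian product-of-densities unification of sub-posteriors (a sketch of
# the approximate schemes contrasted with exact Fusion, illustrative only).
import numpy as np

def gaussian_product(means, covs):
    precisions = [np.linalg.inv(c) for c in covs]
    prec = sum(precisions)                 # precisions add under the product
    cov = np.linalg.inv(prec)
    mean = cov @ sum(p @ m for p, m in zip(precisions, means))
    return mean, cov

# Two 2-D sub-posteriors that do not quite concur:
m1, c1 = np.array([0.0, 0.0]), np.eye(2)
m2, c2 = np.array([1.0, 0.5]), 0.5 * np.eye(2)
mean, cov = gaussian_product([m1, m2], [c1, c2])
print(mean, np.diag(cov))
```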
