
Galton-Watson process and Bayesian inference: A turnkey method for the viability study of small populations

Posted by Bertrand Cloez
Publication date: 2019
Research field: Mathematical Statistics
Paper language: English
Author: B Cloez





Sharp prediction of extinction times is needed in biodiversity monitoring and conservation management. The Galton-Watson process is a classical stochastic model for describing population dynamics. Its evolution is like that of the matrix population model, except that offspring numbers are random. Extinction probability, extinction time and abundance are well known and given by explicit formulas. In contrast with the deterministic model, it can be applied to small populations. The parameters of this model can be estimated within the Bayesian inference framework, which makes it possible to consider non-arbitrary scenarios. We show how coupling Bayesian inference with the Galton-Watson model provides several features: i) a flexible modelling approach with easily understandable parameters; ii) compatibility with the classical matrix population model (Leslie-type model); iii) an analytical, non-simulation approach that yields more information with less computing; iv) a non-arbitrary choice of scenarios, parameters, etc. It can be seen as going one step further than the classical matrix population model for the viability problem. To illustrate these features, we provide detailed analyses for two examples, one of which is a real-life example.
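As a rough illustration of how such a coupling can work (not the authors' actual model, priors or data), the sketch below assumes a Poisson offspring distribution with a conjugate Gamma prior, so the posterior of the offspring mean is available in closed form, and pushes posterior draws through the classical fixed-point equation for the Galton-Watson extinction probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: offspring counts observed for individually monitored parents.
offspring_counts = np.array([0, 2, 1, 0, 3, 1, 2, 0, 1, 1])

# Conjugate Gamma(a0, b0) prior on the Poisson offspring mean m:
# the posterior is Gamma(a0 + sum(counts), b0 + n).
a0, b0 = 1.0, 1.0
a_post = a0 + offspring_counts.sum()
b_post = b0 + len(offspring_counts)

def extinction_prob(m, tol=1e-10, max_iter=10_000):
    """Smallest root of q = f(q), with f(s) = exp(m*(s-1)) the Poisson p.g.f."""
    # This is the extinction probability for one founding individual;
    # raise it to the power of the initial population size otherwise.
    q = 0.0
    for _ in range(max_iter):
        q_new = np.exp(m * (q - 1.0))
        if abs(q_new - q) < tol:
            break
        q = q_new
    return q

# Posterior of the extinction probability: push posterior draws of m
# through the fixed-point equation, with no simulation of trajectories.
m_draws = rng.gamma(shape=a_post, scale=1.0 / b_post, size=5_000)
q_draws = np.array([extinction_prob(m) for m in m_draws])

print(f"posterior mean of m: {m_draws.mean():.2f}")
print(f"extinction probability: mean {q_draws.mean():.2f}, "
      f"95% credible interval ({np.quantile(q_draws, 0.025):.2f}, "
      f"{np.quantile(q_draws, 0.975):.2f})")
```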




Read also

The key to our investigation is an improved (and in a sense sharp) understanding of the survival time of the contact process on star graphs. Using these results, we show that for the contact process on Galton-Watson trees, when the offspring distribution (i) is subexponential the critical value for local survival $\lambda_2 = 0$ and (ii) when it is geometric($p$) we have $\lambda_2 \le C_p$, where the $C_p$ are much smaller than previous estimates. We also study the critical value $\lambda_c(n)$ for prolonged persistence on graphs with $n$ vertices generated by the configuration model. In the case of power law and stretched exponential distributions where it is known that $\lambda_c(n) \to 0$, we give estimates on the rate of convergence. Physicists tell us that $\lambda_c(n) \sim 1/\Lambda(n)$, where $\Lambda(n)$ is the maximum eigenvalue of the adjacency matrix. Our results show that this is not correct.
Xiequan Fan, Qi-Man Shao (2021)
Let $(Z_n)_{n \geq 0}$ be a supercritical Galton-Watson process. Consider the Lotka-Nagaev estimator for the offspring mean. In this paper, we establish self-normalized Cramér-type moderate deviations and Berry-Esseen bounds for the Lotka-Nagaev estimator. The results are believed to be optimal or near optimal.
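For context only: the Lotka-Nagaev estimator itself is the simple ratio $\hat{m}_n = Z_{n+1}/Z_n$. The toy sketch below simulates a supercritical process under an assumed Poisson offspring law (not part of the paper) and computes it; the paper's moderate-deviation and Berry-Esseen results are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_gw(m, n_gens, z0=10):
    """Toy Galton-Watson process with Poisson(m) offspring (choice is illustrative)."""
    z = [z0]
    for _ in range(n_gens):
        z.append(int(rng.poisson(m, size=z[-1]).sum()) if z[-1] > 0 else 0)
    return np.array(z)

m_true = 1.5                       # supercritical: offspring mean > 1
Z = simulate_gw(m_true, n_gens=10)

# Lotka-Nagaev estimator of the offspring mean at generation n: Z_{n+1} / Z_n.
n = len(Z) - 2
m_hat = Z[n + 1] / Z[n] if Z[n] > 0 else np.nan
print("population sizes:", Z)
print(f"Lotka-Nagaev estimate at generation {n}: {m_hat:.3f} (true m = {m_true})")
```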
We first introduce a novel profile-based alignment algorithm, the multiple continuous Signal Alignment algorithm with Gaussian Process Regression profiles (SA-GPR). SA-GPR addresses the limitations of currently available signal alignment methods by adopting a hybrid of the particle smoothing and Markov-chain Monte Carlo (MCMC) algorithms to align signals, and by applying Gaussian process regression to construct profiles to be aligned continuously. SA-GPR shares all the strengths of the existing alignment algorithms that depend on profiles but is more exact in the sense that profiles do not need to be discretized as sequential bins. The uncertainty of performance over the resolution of such bins is thereby eliminated. This methodology produces alignments that are consistent, that regularize extreme cases, and that properly reflect the inherent uncertainty. Then we extend SA-GPR to a specific problem in the field of paleoceanography with a method called Bayesian Inference Gaussian Process Multiproxy Alignment of Continuous Signals (BIGMACS). The goal of BIGMACS is to infer continuous ages for ocean sediment cores using two classes of age proxies: proxies that explicitly return calendar ages (e.g., radiocarbon) and those used to synchronize ages in multiple marine records (e.g., an oxygen-isotope-based marine proxy known as benthic $\delta^{18}{\rm O}$). BIGMACS integrates these two proxies by iteratively performing two steps: profile construction from benthic $\delta^{18}{\rm O}$ age models and alignment of each core to the profile, also reflecting radiocarbon dates. We use BIGMACS to construct a new Deep Northeastern Atlantic stack (i.e., a profile from particular benthic $\delta^{18}{\rm O}$ records) of five ocean sediment cores. We conclude by constructing multiproxy age models for two additional cores from the same region by aligning them to the stack.
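The core profile-construction ingredient is ordinary Gaussian process regression. The minimal numpy sketch below (a generic squared-exponential GP fit to a toy, irregularly sampled record; not the SA-GPR/BIGMACS code) shows how a continuous profile with pointwise uncertainty is obtained without discretizing into bins.

```python
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.1, **kernel_args):
    """Posterior mean and pointwise variance of a GP regression at x_test."""
    K = rbf_kernel(x_train, x_train, **kernel_args) + noise**2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test, **kernel_args)
    K_ss = rbf_kernel(x_test, x_test, **kernel_args)
    mean = K_s.T @ np.linalg.solve(K, y_train)
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, np.diag(cov)

# Toy "proxy record" observed at irregular positions, standing in for a real core.
rng = np.random.default_rng(2)
x_obs = np.sort(rng.uniform(0, 10, size=25))
y_obs = np.sin(x_obs) + 0.1 * rng.standard_normal(25)

# Continuous profile with pointwise uncertainty, evaluated on a fine grid.
x_grid = np.linspace(0, 10, 200)
profile_mean, profile_var = gp_posterior(x_obs, y_obs, x_grid, length_scale=1.0)
print(profile_mean[:3], profile_var[:3])
```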
We are concerned with exploring the probabilities of first order statements for Galton-Watson trees with $\mathrm{Poisson}(c)$ offspring distribution. Fixing a positive integer $k$, we exploit the $k$-move Ehrenfeucht game on rooted trees for this purpose. Let $\Sigma$, indexed by $1 \leq j \leq m$, denote the finite set of equivalence classes arising out of this game, and $D$ the set of all probability distributions over $\Sigma$. Let $x_{j}(c)$ denote the true probability of the class $j \in \Sigma$ under the $\mathrm{Poisson}(c)$ regime, and $\vec{x}(c)$ the true probability vector over all the equivalence classes. Then we are able to define a natural recursion function $\Gamma$, and a map $\Psi = \Psi_{c}: D \rightarrow D$ such that $\vec{x}(c)$ is a fixed point of $\Psi_{c}$, and starting with any distribution $\vec{x} \in D$, we converge to this fixed point via $\Psi$ because it is a contraction. We show this both for $c \leq 1$ and $c > 1$, though the techniques for these two ranges are quite different.
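To make the contraction-to-fixed-point idea concrete, the toy sketch below iterates an entirely hypothetical stand-in map on the probability simplex (a fixed positive stochastic matrix, unrelated to the paper's $\Gamma$ and $\Psi_c$) and reaches the same fixed point from any starting distribution.

```python
import numpy as np

# Stand-in for a map on distributions over three "equivalence classes":
# one step of a fixed positive stochastic matrix. Positivity makes the map
# a contraction in total variation, so iteration converges to a unique
# fixed point regardless of the starting distribution.
P = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])

def psi(x):
    return x @ P

x = np.array([1.0, 0.0, 0.0])      # arbitrary starting distribution
for _ in range(200):
    x_new = psi(x)
    if np.abs(x_new - x).sum() < 1e-12:
        break
    x = x_new
print("fixed point:", x)           # the same limit is reached from any start
```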
We consider Bayesian inference for stochastic differential equation mixed effects models (SDEMEMs) exemplifying tumor response to treatment and regrowth in mice. We produce an extensive study on how an SDEMEM can be fitted using both exact inference based on pseudo-marginal MCMC and approximate inference via Bayesian synthetic likelihoods (BSL). We investigate a two-compartment SDEMEM, the compartments corresponding to the fractions of tumor cells killed by a treatment and surviving it, respectively. The case study data consider a tumor xenography study with two treatment groups and one control, each containing 5-8 mice. Results from the case study and from simulations indicate that the SDEMEM is able to reproduce the observed growth patterns and that BSL is a robust tool for inference in SDEMEMs. Finally, we compare the fit of the SDEMEM to a similar ordinary differential equation model. Due to small sample sizes, strong prior information is needed to identify all model parameters in the SDEMEM, and it cannot be determined which of the two models is better at predicting tumor growth curves. In a simulation study we find that, with a sample of 17 mice per group, BSL is able to identify all model parameters and distinguish treatment groups.
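As a rough sketch of the approximate-inference side only: Bayesian synthetic likelihood replaces the intractable likelihood with a Gaussian fit to simulated summary statistics. The toy below uses an invented one-dimensional stochastic logistic growth model in place of the paper's two-compartment SDEMEM; the model, summaries and parameter values are all illustrative assumptions.

```python
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)

def simulate_growth(theta, x0=1.0, dt=0.1, n_steps=100):
    """Euler-Maruyama path of a toy stochastic logistic growth model
    dX = r*X*(1 - X/K) dt + sigma*X dW (a stand-in, not the paper's SDEMEM)."""
    r, K, sigma = theta
    x = np.empty(n_steps + 1)
    x[0] = x0
    for t in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt))
        x[t + 1] = max(x[t] + r * x[t] * (1 - x[t] / K) * dt + sigma * x[t] * dW, 1e-8)
    return x

def summaries(path):
    """Illustrative summary statistics of a growth curve."""
    return np.array([path[-1], path.mean(), np.log(path[1:] / path[:-1]).std()])

def synthetic_loglik(theta, s_obs, n_sim=200):
    """BSL: Gaussian approximation to the distribution of simulated summaries."""
    S = np.array([summaries(simulate_growth(theta)) for _ in range(n_sim)])
    mu, cov = S.mean(axis=0), np.cov(S, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov, allow_singular=True)

theta_true = (0.5, 20.0, 0.05)
s_obs = summaries(simulate_growth(theta_true))       # pretend "observed" data
print(synthetic_loglik((0.5, 20.0, 0.05), s_obs))    # near the truth
print(synthetic_loglik((0.2, 20.0, 0.05), s_obs))    # a poorer growth rate
```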