
Statistical and Computational Tradeoff in Genetic Algorithm-Based Estimation

Posted by Manuel Rizzo
Publication date: 2017
Research field: Mathematical Statistics
Paper language: English





When a Genetic Algorithm (GA), or a stochastic algorithm in general, is employed in a statistical problem, the result is affected by both variability due to sampling, arising because only a sample of the population is observed, and variability due to the stochastic elements of the algorithm. This naturally places the topic within the framework of the statistical and computational tradeoff, a question that has become crucial in modern problems, where statisticians must carefully balance the statistical and computational parts of an analysis under resource or time constraints. In the present work we analyze estimation problems tackled by GAs, in which the variability of the estimates can be decomposed into these two sources, subject to constraints expressed as cost functions related to both data acquisition and the runtime of the algorithm. Simulation studies are presented to discuss the statistical and computational tradeoff question.
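To make the decomposition concrete, here is a minimal, purely illustrative Python sketch (not the paper's experimental setup): a toy GA estimates a mean on repeatedly resampled data, and the law of total variance splits the overall variance of the estimates into a sampling component and an algorithmic component. The function names, parameter values, and data-generating model are assumptions made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def ga_estimate(sample, pop_size=30, generations=40, sigma_mut=0.5):
    """Toy GA minimising squared-error loss for the mean of `sample`.
    Purely illustrative; not the estimator analysed in the paper."""
    pop = rng.normal(np.median(sample), 1.0, size=pop_size)
    for _ in range(generations):
        loss = ((sample[None, :] - pop[:, None]) ** 2).mean(axis=1)
        parents = pop[np.argsort(loss)[: pop_size // 2]]       # selection: best half
        children = parents + rng.normal(0.0, sigma_mut, size=parents.size)  # mutation
        pop = np.concatenate([parents, children])
        sigma_mut *= 0.9                                        # shrink mutation step
    loss = ((sample[None, :] - pop[:, None]) ** 2).mean(axis=1)
    return pop[np.argmin(loss)]

n, n_datasets, n_runs = 50, 100, 10
estimates = np.empty((n_datasets, n_runs))
for i in range(n_datasets):
    sample = rng.normal(0.0, 1.0, size=n)          # sampling variability
    for j in range(n_runs):
        estimates[i, j] = ga_estimate(sample)      # algorithmic variability

# Law of total variance: total = Var(E[est | data]) + E[Var(est | data)]
sampling = estimates.mean(axis=1).var(ddof=1)      # between-dataset component
algorithmic = estimates.var(axis=1, ddof=1).mean() # within-dataset (GA) component
print(f"sampling ~ {sampling:.4f}, algorithmic ~ {algorithmic:.4f}, "
      f"total ~ {estimates.reshape(-1).var(ddof=1):.4f}")
```

In this toy setting, increasing the sample size n shrinks the sampling component, while running more generations (or averaging over more GA restarts) shrinks the algorithmic one; allocating a fixed budget between the two is exactly the kind of question the tradeoff framework addresses.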


Read also

60 - Leonid Kontorovich 2009
We investigate the existence of bounded-memory consistent estimators of various statistical functionals. This question is resolved in the negative in a rather strong sense. We propose various bounded-memory approximations, using techniques from automata theory and stochastic processes. Some questions of potential interest are raised for future work.
69 - Keki M. Burjorjee 2009
Since the inception of genetic algorithmics, the identification of computational efficiencies of the simple genetic algorithm (SGA) has been an important goal. In this paper we distinguish between a computational competency of the SGA--an efficient, but narrow computational ability--and a computational proficiency of the SGA--a computational ability that is both efficient and broad. To date, attempts to deduce a computational proficiency of the SGA have been unsuccessful. It may, however, be possible to inductively infer a computational proficiency of the SGA from a set of related computational competencies that have been deduced. With this in mind we deduce two computational competencies of the SGA. These competencies, when considered together, point toward a remarkable computational proficiency of the SGA. This proficiency is pertinent to a general problem that is closely related to a well-known statistical problem at the cutting edge of computational genetics.
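For readers unfamiliar with the SGA referred to above, the following minimal sketch shows its canonical ingredients (fitness-proportionate selection, one-point crossover, bit-flip mutation) on the toy OneMax problem. It is an assumption-laden illustration, not the setting analyzed in the paper; all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def simple_ga(fitness, n_bits=40, pop_size=60, generations=100,
              p_cross=0.7, p_mut=0.01):
    """Canonical SGA: roulette-wheel selection, one-point crossover,
    bit-flip mutation. Illustrative sketch only."""
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    for _ in range(generations):
        f = np.array([fitness(ind) for ind in pop], dtype=float)
        probs = f / f.sum() if f.sum() > 0 else np.full(pop_size, 1 / pop_size)
        parents = pop[rng.choice(pop_size, size=pop_size, p=probs)]  # selection
        children = parents.copy()
        for i in range(0, pop_size - 1, 2):                          # crossover
            if rng.random() < p_cross:
                cut = rng.integers(1, n_bits)
                children[i, cut:], children[i + 1, cut:] = (
                    parents[i + 1, cut:].copy(), parents[i, cut:].copy())
        mask = rng.random(children.shape) < p_mut                    # mutation
        children[mask] ^= 1
        pop = children
    best = max(pop, key=fitness)
    return best, fitness(best)

# OneMax: maximise the number of ones in the bit string
best, score = simple_ga(lambda bits: int(bits.sum()))
print(score, "ones out of 40")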
Recently a new algorithm for sampling posteriors of unnormalised probability densities, called ABC Shadow, was proposed in [8]. This talk introduces a global optimisation procedure based on the ABC Shadow simulation dynamics. First the general method is explained, and then results on simulated and real data are presented. The method is rather general, in the sense that it applies to probability densities that are continuously differentiable with respect to their parameters.
In Statistics, log-concave density estimation is a central problem within the field of nonparametric inference under shape constraints. Despite great progress in recent years on the statistical theory of the canonical estimator, namely the log-concave maximum likelihood estimator, adoption of this method has been hampered by the complexities of the non-smooth convex optimization problem that underpins its computation. We provide enhanced understanding of the structural properties of this optimization problem, which motivates the proposal of new algorithms, based on both randomized and Nesterov smoothing, combined with an appropriate integral discretization of increasing accuracy. We prove that these methods enjoy, both with high probability and in expectation, a convergence rate of order $1/T$ up to logarithmic factors on the objective function scale, where $T$ denotes the number of iterations. The benefits of our new computational framework are demonstrated on both synthetic and real data, and our implementation is available in a GitHub repository, LogConcComp (Log-Concave Computation).
332 - Peter Clifford 2009
We consider the problem of approximating the empirical Shannon entropy of a high-frequency data stream under the relaxed strict-turnstile model, when space limitations make exact computation infeasible. An equivalent measure of entropy is the Renyi entropy that depends on a constant alpha. This quantity can be estimated efficiently and unbiasedly from a low-dimensional synopsis called an alpha-stable data sketch via the method of compressed counting. An approximation to the Shannon entropy can be obtained from the Renyi entropy by taking alpha sufficiently close to 1. However, practical guidelines for parameter calibration with respect to alpha are lacking. We avoid this problem by showing that the random variables used in estimating the Renyi entropy can be transformed to have a proper distributional limit as alpha approaches 1: the maximally skewed, strictly stable distribution with alpha = 1 defined on the entire real line. We propose a family of asymptotically unbiased log-mean estimators of the Shannon entropy, indexed by a constant zeta > 0, that can be computed in a single-pass algorithm to provide an additive approximation. We recommend the log-mean estimator with zeta = 1 that has exponentially decreasing tail bounds on the error probability, asymptotic relative efficiency of 0.932, and near-optimal computational complexity.
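As a small, hedged illustration of the limit used above (not the paper's single-pass sketching algorithm), the Renyi entropy of a fixed distribution can be compared numerically with its Shannon entropy as alpha approaches 1; the distribution and the alpha values below are arbitrary assumptions.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Renyi entropy H_alpha(p) = log(sum p_i^alpha) / (1 - alpha), alpha != 1."""
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i."""
    return -np.sum(p * np.log(p))

# An arbitrary normalised distribution standing in for stream frequencies
p = np.array([0.4, 0.3, 0.2, 0.1])
for alpha in (1.5, 1.1, 1.01, 1.001):
    print(f"alpha={alpha:<6} Renyi={renyi_entropy(p, alpha):.6f}  "
          f"Shannon={shannon_entropy(p):.6f}")
```

The Renyi values converge to the Shannon value as alpha moves toward 1, which is the relationship the paper exploits when calibrating its estimators.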