
Stochastic Volatility Models using Hamiltonian Monte Carlo Methods and Stan

Published by: Ricardo Ehlers
Publication date: 2017
Research field: Mathematical Statistics
Paper language: English





This paper presents a study using the Bayesian approach in stochastic volatility models for modeling financial time series, using Hamiltonian Monte Carlo (HMC) methods. We propose the use of distributions other than the Gaussian for the errors in the observation equation of stochastic volatility models, to address problems such as heavy tails and asymmetry in the returns. Moreover, we use the recently developed information criteria WAIC and LOO, which approximate the cross-validation methodology, to perform model selection. Throughout this work, we study the quality of the HMC methods through examples, simulation studies and applications to real data sets.
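To make the modeling setup concrete, below is a minimal sketch of a basic stochastic volatility model in Stan, written in the non-centered parameterisation used in the Stan User's Guide, with a Student-t observation density as one possible heavy-tailed alternative to the Gaussian. It is an illustrative sketch only; the exact specifications, priors and error distributions studied in the paper may differ.

// Minimal stochastic volatility sketch (hypothetical illustration):
// log-volatility h follows an AR(1) process, returns y get
// Student-t errors to allow heavy tails.
data {
  int<lower=0> T;                  // number of time points
  vector[T] y;                     // mean-corrected returns
}
parameters {
  real mu;                         // mean log volatility
  real<lower=-1, upper=1> phi;     // persistence of log volatility
  real<lower=0> sigma;             // scale of log-volatility innovations
  real<lower=2> nu;                // Student-t degrees of freedom
  vector[T] h_std;                 // standardized log-volatility states
}
transformed parameters {
  vector[T] h = h_std * sigma;     // non-centered parameterisation
  h[1] /= sqrt(1 - phi * phi);     // stationary scaling of the first state
  h += mu;
  for (t in 2:T)
    h[t] += phi * (h[t - 1] - mu); // AR(1) recursion for log volatility
}
model {
  phi ~ uniform(-1, 1);
  sigma ~ cauchy(0, 5);
  mu ~ cauchy(0, 10);
  nu ~ gamma(2, 0.1);
  h_std ~ std_normal();
  y ~ student_t(nu, 0, exp(h / 2));  // heavy-tailed observation equation
}

For a WAIC or LOO comparison of error distributions, one would additionally store the pointwise log-likelihood in a generated quantities block and pass it to the usual cross-validation approximations.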


Read also

In this paper, we develop Bayesian Hamiltonian Monte Carlo methods for inference in asymmetric GARCH models under different distributions for the error term. We implemented Zero-Variance and Hamiltonian Monte Carlo schemes for parameter estimation in an attempt to reduce the standard errors of the estimates, thus obtaining more efficient results at the price of a small extra computational cost.
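For context, a minimal Stan sketch of one common asymmetric specification, the GJR-GARCH(1,1) with Gaussian errors, is given below. It is not the authors' exact model; the alternative error distributions and the Zero-Variance post-processing mentioned in the abstract are not shown.

// Hypothetical GJR-GARCH(1,1) sketch: negative shocks receive an extra
// leverage term gamma in the conditional variance recursion.
data {
  int<lower=1> T;
  vector[T] y;                     // returns
  real<lower=0> sigma1;            // assumed initial conditional std. dev.
}
parameters {
  real mu;
  real<lower=0> omega;
  real<lower=0, upper=1> alpha;
  real<lower=0> gamma;             // asymmetry (leverage) parameter
  real<lower=0, upper=1> beta;
}
transformed parameters {
  vector<lower=0>[T] sigma;
  sigma[1] = sigma1;
  for (t in 2:T) {
    real eps = y[t - 1] - mu;      // lagged shock
    sigma[t] = sqrt(omega
                    + (alpha + gamma * (eps < 0)) * square(eps)
                    + beta * square(sigma[t - 1]));
  }
}
model {
  // Weakly informative priors; stationarity (alpha + gamma/2 + beta < 1)
  // is not enforced in this sketch.
  mu ~ normal(0, 10);
  omega ~ normal(0, 2);
  gamma ~ normal(0, 2);
  y ~ normal(mu, sigma);
}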
Deep Gaussian Processes (DGPs) are hierarchical generalizations of Gaussian Processes that combine well-calibrated uncertainty estimates with the high flexibility of multilayer models. One of the biggest challenges with these models is that exact inference is intractable. The current state-of-the-art inference method, Variational Inference (VI), employs a Gaussian approximation to the posterior distribution. This can be a potentially poor unimodal approximation of the generally multimodal posterior. In this work, we provide evidence for the non-Gaussian nature of the posterior and we apply the Stochastic Gradient Hamiltonian Monte Carlo method to generate samples. To efficiently optimize the hyperparameters, we introduce the Moving Window MCEM algorithm. This results in significantly better predictions at a lower computational cost than its VI counterpart. Thus our method establishes a new state-of-the-art for inference in DGPs.
This paper studies a non-random-walk Markov Chain Monte Carlo method, namely the Hamiltonian Monte Carlo (HMC) method, in the context of Subset Simulation used for structural reliability analysis. The HMC method relies on a deterministic mechanism inspired by Hamiltonian dynamics to propose samples following a target probability distribution. The method alleviates the random-walk behavior to achieve a more effective and consistent exploration of the probability space compared to standard Gibbs or Metropolis-Hastings techniques. After a brief review of the basic concepts of the HMC method and its computational details, two algorithms are proposed to facilitate the application of the HMC method to Subset Simulation in structural reliability analysis. Next, the behavior of the two HMC algorithms is illustrated using simple probability distribution models. Finally, the accuracy and efficiency of Subset Simulation employing the two HMC algorithms are tested using various reliability examples. The supporting source code and data are available for download at (the URL that will become available once the paper is accepted).
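For readers unfamiliar with the deterministic mechanism mentioned above, the standard HMC proposal (stated here in its generic form, not as this paper's specific algorithm) augments the parameters $q$ with momenta $p$, sets $H(q,p) = U(q) + \tfrac{1}{2} p^\top M^{-1} p$ with $U(q) = -\log \pi(q)$, and simulates the dynamics with leapfrog steps of size $\epsilon$:

$$
p_{1/2} = p_0 - \tfrac{\epsilon}{2}\,\nabla U(q_0), \qquad
q_1 = q_0 + \epsilon\, M^{-1} p_{1/2}, \qquad
p_1 = p_{1/2} - \tfrac{\epsilon}{2}\,\nabla U(q_1).
$$

After $L$ such steps the end point is accepted with probability $\min\{1, \exp(H(q_0, p_0) - H(q_L, p_L))\}$, which corrects for the discretisation error.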
Probabilistic programming uses programs to express generative models whose posterior probability is then computed by built-in inference engines. A challenging goal is to develop general-purpose inference algorithms that work out-of-the-box for arbitrary programs in a universal probabilistic programming language (PPL). The densities defined by such programs, which may use stochastic branching and recursion, are (in general) nonparametric, in the sense that they correspond to models on an infinite-dimensional parameter space. However, standard inference algorithms, such as the Hamiltonian Monte Carlo (HMC) algorithm, target distributions with a fixed number of parameters. This paper introduces the Nonparametric Hamiltonian Monte Carlo (NP-HMC) algorithm which generalises HMC to nonparametric models. Inputs to NP-HMC are a new class of measurable functions called tree representable, which serve as a language-independent representation of the density functions of probabilistic programs in a universal PPL. We provide a correctness proof of NP-HMC, and empirically demonstrate significant performance improvements over existing approaches on several nonparametric examples.
Tore Selland Kleppe (2018)
Dynamically rescaled Hamiltonian Monte Carlo (DRHMC) is introduced as a computationally fast and easily implemented method for performing full Bayesian analysis in hierarchical statistical models. The method relies on introducing a modified parameterisation so that the re-parameterised target distribution has close to constant scaling properties, and thus is easily sampled using standard (Euclidean metric) Hamiltonian Monte Carlo. Provided that the parameterisations of the conditional distributions specifying the hierarchical model are constant information parameterisations (CIPs), the relation between the modified and original parameterisations is bijective, explicitly computed and admits exploitation of sparsity in the numerical linear algebra involved. CIPs for a large catalogue of statistical models are presented, and from the catalogue it is clear that many CIPs are currently routinely used in statistical computing. A relation between the proposed methodology and a class of explicitly integrated Riemann manifold Hamiltonian Monte Carlo methods is discussed. The methodology is illustrated on several example models, including a model for inflation rates with multiple levels of non-linearly dependent latent variables.