Non-Gaussian inference from non-linear and non-Poisson biased distributed data

Posted by: Metin Ata
Publication date: 2014
Research field: Physics
Paper language: English

We study the statistical inference of the cosmological dark matter density field from non-Gaussian, non-linear and non-Poisson biased distributed tracers. We have implemented a Bayesian posterior-sampling code that solves this problem and tested it with mock data based on $N$-body simulations.
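
In generic notation (our paraphrase, not an equation quoted from the paper), the inference amounts to sampling the posterior of the dark matter density field $\delta$ given the observed tracer counts $N$,

\[ P(\delta \mid N) \;\propto\; P\big(N \mid B(\delta)\big)\, P(\delta) , \]

where $P(\delta)$ is a (non-Gaussian) prior on the density field, $B(\delta)$ is a non-linear bias relation, and $P(N \mid \cdot)$ is a non-Poisson counting distribution for the tracers.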

Read also

We present a Bayesian reconstruction algorithm to generate unbiased samples of the underlying dark matter field from halo catalogues. Our new contribution consists of implementing a non-Poisson likelihood including a deterministic non-linear and scale-dependent bias. In particular, we present the Hamiltonian equations of motion for the negative binomial (NB) probability distribution function. This permits us to efficiently sample the posterior distribution function of density fields given a sample of galaxies using the Hamiltonian Monte Carlo technique implemented in the Argo code. We have tested our algorithm with the Bolshoi $N$-body simulation at redshift $z = 0$, inferring the underlying dark matter density field from sub-samples of the halo catalogue with biases smaller and larger than one. Our method shows that we can draw nearly unbiased samples (compatible within 1-$\sigma$) from the posterior distribution up to scales of about $k \sim 1$ h/Mpc in terms of power spectra and cell-to-cell correlations. We find that a Poisson likelihood yields reconstructions with power spectra deviating by more than 10% at $k = 0.2$ h/Mpc. Our reconstruction algorithm is especially suited for emission-line galaxy data, for which a complex non-linear stochastic biasing treatment beyond Poissonity becomes indispensable.
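
For reference, a common mean-dispersion parameterization of the negative binomial distribution (our notation; the paper's conventions may differ) is

\[ P(N \mid \lambda, \beta) \;=\; \frac{\Gamma(N+\beta)}{N!\,\Gamma(\beta)} \left(\frac{\lambda}{\lambda+\beta}\right)^{N} \left(\frac{\beta}{\lambda+\beta}\right)^{\beta}, \qquad \langle N \rangle = \lambda, \quad \operatorname{Var}(N) = \lambda + \frac{\lambda^{2}}{\beta} , \]

which recovers the Poisson likelihood in the limit $\beta \to \infty$ and otherwise allows the over-dispersion (stochastic bias beyond Poissonity) described above.
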
Benjamin D. Wandelt, 2012
We discuss two projects in non-linear cosmostatistics applicable to very large surveys of galaxies. The first is a Bayesian reconstruction of galaxy redshifts and their number density distribution from approximate, photometric redshift data. The second focuses on cosmic voids and uses them to construct cosmic spheres that allow reconstructing the expansion history of the Universe using the Alcock-Paczynski test. In both cases we find that non-linearities enable the methods or enhance the results: non-linear gravitational evolution creates voids and our photo-z reconstruction works best in the highest density (and hence most non-linear) portions of our simulations.
We investigate the use of data-driven likelihoods to bypass a key assumption made in many scientific analyses, which is that the true likelihood of the data is Gaussian. In particular, we suggest using the optimization targets of flow-based generative models, a class of models that can capture complex distributions by transforming a simple base distribution through layers of nonlinearities. We call these flow-based likelihoods (FBL). We analyze the accuracy and precision of the reconstructed likelihoods on mock Gaussian data, and show that simply gauging the quality of samples drawn from the trained model is not a sufficient indicator that the true likelihood has been learned. We nevertheless demonstrate that the likelihood can be reconstructed to a precision equal to that of the sampling error due to a finite sample size. We then apply FBLs to mock weak lensing convergence power spectra, a cosmological observable that is significantly non-Gaussian (NG). We find that the FBL captures the NG signatures in the data extremely well, while other commonly used data-driven likelihoods, such as Gaussian mixture models and independent component analysis, fail to do so. This suggests that works that have found small posterior shifts in NG data with data-driven likelihoods such as these could be underestimating the impact of non-Gaussianity on parameter constraints. By introducing a suite of tests that can capture different levels of NG in the data, we show that the success or failure of traditional data-driven likelihoods can be tied back to the structure of the NG in the data. Unlike other methods, the flexibility of the FBL makes it successful at tackling different types of NG simultaneously. Because of this, and consequently their likely applicability across datasets and domains, we encourage their use for inference when sufficient mock data are available for training.
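
The optimization target referred to here is the exact log-likelihood given by the change-of-variables formula for normalizing flows (standard result; notation ours):

\[ \log p_X(x) \;=\; \log p_Z\big(f_\theta(x)\big) \;+\; \log\left|\det \frac{\partial f_\theta(x)}{\partial x}\right| , \]

where $f_\theta$ is the learned invertible map from the data $x$ to the base variable $z$ with simple density $p_Z$. Maximizing this over training samples both fits the model and directly yields the flow-based likelihood evaluated during inference.
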
We show that in cases of marginal detections ($\sim 3\sigma$), such as that of Baryonic Acoustic Oscillations (BAO) in cosmology, the often-used Gaussian approximation to the full likelihood is very poor, especially beyond $\sim 3\sigma$. This can radically alter confidence intervals on parameters and implies that one cannot naively extrapolate $1\sigma$ error bars to $3\sigma$ and beyond. We propose a simple fitting formula which corrects for this effect in posterior probabilities arising from marginal detections. Alternatively, the full likelihood should be used for parameter estimation rather than a Gaussian approximation characterized by just a mean and an error.
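
As a toy numerical illustration of this point (our own construction, assuming a Gamma-shaped likelihood; it is not the fitting formula proposed in the paper), the sketch below compares the exact 99.7% interval of a skewed likelihood mimicking a ~3-sigma amplitude detection with the naive extrapolation obtained by tripling its 1-sigma half-widths:

import numpy as np

# Skewed, positively-bounded toy likelihood for an amplitude A:
# Gamma-shaped with mean 9 and standard deviation 3, i.e. a ~3-sigma detection of A > 0.
A = np.linspace(0.0, 40.0, 400001)
dA = A[1] - A[0]
p = A**8 * np.exp(-A)
p /= p.sum() * dA            # normalize on the grid

cdf = np.cumsum(p) * dA      # cumulative distribution on the grid

def central_interval(frac):
    """Central interval containing a fraction `frac` of the probability."""
    lo = A[np.searchsorted(cdf, 0.5 * (1.0 - frac))]
    hi = A[np.searchsorted(cdf, 0.5 * (1.0 + frac))]
    return lo, hi

lo1, hi1 = central_interval(0.683)   # exact 68.3% ("1-sigma") interval
lo3, hi3 = central_interval(0.997)   # exact 99.7% ("3-sigma") interval
mean = (A * p).sum() * dA
naive = (mean - 3.0 * (mean - lo1), mean + 3.0 * (hi1 - mean))   # 3 x 1-sigma extrapolation

print("exact 99.7% interval   :", (round(float(lo3), 2), round(float(hi3), 2)))
print("naive 3x(1-sigma) range:", (round(float(naive[0]), 2), round(float(naive[1]), 2)))

For this toy case the exact interval reaches noticeably further to the right, and less far to the left, than the naive extrapolation, which is the qualitative failure of the Gaussian approximation described in the abstract.
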
Joyce Byun, Rachel Bean, 2013
A detection of primordial non-Gaussianity could transform our understanding of the fundamental theory of inflation. The precision promised by upcoming CMB and large-scale structure surveys raises a natural question: if a detection given a particular template is made, what does this truly tell us about the underlying theory? In this paper we present a systematic way to constrain a wide range of non-Gaussian shapes, including general single and multi-field models and models with excited initial states. We present a separable, divergent basis able to recreate many shapes in the literature to high accuracy with between three and seven basis functions. The basis allows shapes to be grouped into broad template classes, satisfying theoretically-relevant priors on their divergence properties in the squeezed limit. We forecast how well a Planck-like CMB survey could not only detect a general non-Gaussian signal but discern more about its shape, using existing templates and new ones we propose. This approach offers an opportunity to tie together minimal theoretical priors with observational constraints on the shape in general, and in the squeezed limit, to gain a deeper insight into what drove inflation.
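
As one concrete example of a separable shape and its squeezed-limit behaviour (our illustration; the paper's own basis functions are not reproduced here), the local-type bispectrum of the primordial potential $\Phi$ is

\[ B^{\rm local}_{\Phi}(k_1,k_2,k_3) \;=\; 2 f_{\rm NL}^{\rm local}\Big[ P_{\Phi}(k_1) P_{\Phi}(k_2) + P_{\Phi}(k_2) P_{\Phi}(k_3) + P_{\Phi}(k_3) P_{\Phi}(k_1) \Big] , \]

which is separable in the $k_i$ and diverges in the squeezed limit $k_3 \to 0$ in proportion to $P_{\Phi}(k_3)$; the divergence properties mentioned above classify template classes precisely by this kind of squeezed-limit scaling.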