
Bayesian inference of cosmic density fields from non-linear, scale-dependent, and stochastic biased tracers

Added by Metin Ata
Publication date: 2014
Field: Physics
Language: English





We present a Bayesian reconstruction algorithm to generate unbiased samples of the underlying dark matter field from halo catalogues. Our new contribution consists of implementing a non-Poisson likelihood including a deterministic, non-linear and scale-dependent bias. In particular, we present the Hamiltonian equations of motion for the negative binomial (NB) probability distribution function. This permits us to efficiently sample the posterior distribution function of density fields given a sample of galaxies, using the Hamiltonian Monte Carlo technique implemented in the Argo code. We have tested our algorithm with the Bolshoi $N$-body simulation at redshift $z = 0$, inferring the underlying dark matter density field from sub-samples of the halo catalogue with biases smaller and larger than one. Our method shows that we can draw nearly unbiased samples (compatible within 1-$\sigma$) from the posterior distribution up to scales of about $k \sim 1$ $h$/Mpc in terms of power spectra and cell-to-cell correlations. We find that a Poisson likelihood yields reconstructions with power spectra deviating by more than 10% at $k = 0.2$ $h$/Mpc. Our reconstruction algorithm is especially suited for emission-line galaxy data, for which a complex non-linear, stochastic biasing treatment beyond Poissonity becomes indispensable.
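The key ingredient described above is the negative binomial likelihood and its gradient with respect to the predicted mean counts, which supplies the force term in the Hamiltonian equations of motion. The following is a minimal illustrative sketch, not the Argo code itself; the parametrisation (mean `lam`, dispersion `beta`, which recovers Poisson statistics as `beta` grows large) is one common convention and is assumed here.

```python
import numpy as np
from scipy.special import gammaln

def nb_loglike(N, lam, beta):
    """Negative binomial log-likelihood per cell.

    N    : observed tracer counts per cell
    lam  : predicted mean counts (from the biased density field)
    beta : dispersion parameter (beta -> infinity recovers Poisson)
    """
    return (gammaln(N + beta) - gammaln(beta) - gammaln(N + 1.0)
            + beta * np.log(beta / (beta + lam))
            + N * np.log(lam / (beta + lam)))

def nb_dloglike_dlam(N, lam, beta):
    """Gradient of the log-likelihood w.r.t. lam: the likelihood
    force entering the Hamiltonian equations of motion."""
    return N / lam - (N + beta) / (beta + lam)
```

In an HMC sampler this gradient would be chained through the bias model relating `lam` to the underlying density field.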



Related research

We study the statistical inference of the cosmological dark matter density field from non-Gaussian, non-linear and non-Poisson biased distributed tracers. We have implemented a Bayesian posterior sampling computer code that solves this problem and tested it with mock data based on N-body simulations.
Large-scale Fourier modes of the cosmic density field are of great value for learning about cosmology because of their well-understood relationship to fluctuations in the early universe. However, cosmic variance generally limits the statistical precision that can be achieved when constraining model parameters using these modes as measured in galaxy surveys, and moreover, these modes are sometimes inaccessible due to observational systematics or foregrounds. For some applications, both limitations can be circumvented by reconstructing large-scale modes using the correlations they induce between smaller-scale modes of an observed tracer (such as galaxy positions). In this paper, we further develop a formalism for this reconstruction, using a quadratic estimator similar to the one used for lensing of the cosmic microwave background. We incorporate nonlinearities from gravity, nonlinear biasing, and local-type primordial non-Gaussianity, and verify that the estimator gives the expected results when applied to N-body simulations. We then carry out forecasts for several upcoming surveys, demonstrating that, when reconstructed modes are included alongside directly observed tracer density modes, constraints on local primordial non-Gaussianity are generically tightened by tens of percent compared to standard single-tracer analyses. In certain cases, these improvements arise from cosmic variance cancellation, with reconstructed modes taking the place of modes of a separate tracer, thus enabling an effective multitracer approach with single-tracer observations.
We present a self-consistent Bayesian formalism to sample the primordial density fields compatible with a set of dark matter density tracers, after cosmic evolution, observed in redshift space. Previous works on density reconstruction did not self-consistently consider redshift-space distortions, or included an additional iterative distortion-correction step. We present here the analytic solution of coherent flows within a Hamiltonian Monte Carlo posterior sampling of the primordial density field. We test our method within the Zeldovich approximation, presenting also an analytic solution including tidal fields and spherical collapse on small scales using augmented Lagrangian perturbation theory. Our resulting reconstructed fields are isotropic and their power spectra are unbiased compared to the true ones defined by our mock observations. Novel algorithmic implementations are introduced regarding the mass-assignment kernels when defining the dark matter density field and the optimization of the time step in the Hamiltonian equations of motion. Our algorithm, dubbed barcode, promises to be especially suited for analysis of the dark matter cosmic web down to scales of a few megaparsecs. This large-scale structure is implied by the observed spatial distribution of galaxy clusters (such as obtained from X-ray, SZ or weak-lensing surveys) as well as that of the intergalactic medium sampled by the Lyman-alpha forest or perhaps even by deep hydrogen intensity mapping. In these cases, virialized motions are negligible, and the tracers cannot be modeled as point-like objects. It could be used in all of these contexts as a baryon acoustic oscillation reconstruction algorithm.
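As an illustration of the kind of forward model involved, the Zeldovich approximation maps a linear overdensity field to a particle displacement field via $\Psi(\mathbf{k}) = i\,\mathbf{k}\,\delta(\mathbf{k})/k^2$. The FFT-based sketch below is an assumption-laden toy version (periodic cubic grid, not the barcode implementation):

```python
import numpy as np

def zeldovich_displacement(delta, box_size):
    """Zeldovich displacement Psi(k) = i k delta(k) / k^2 on a periodic grid.

    delta    : (n, n, n) linear overdensity field
    box_size : side length of the periodic box (same units as output)
    Returns Psi with shape (3, n, n, n), satisfying delta = -div(Psi).
    """
    n = delta.shape[0]
    kfreq = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kx, ky, kz = np.meshgrid(kfreq, kfreq, kfreq, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0  # avoid division by zero; this mode is zeroed below
    delta_k = np.fft.fftn(delta)
    psi = np.empty((3,) + delta.shape)
    for axis, kvec in enumerate((kx, ky, kz)):
        psi_k = 1j * kvec / k2 * delta_k
        psi_k[0, 0, 0] = 0.0  # no displacement from the mean mode
        psi[axis] = np.fft.ifftn(psi_k).real
    return psi
```

For a single plane-wave overdensity the displacement is an analytic sine wave, which makes the sign and normalization conventions easy to check.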
The advent of accessible ancient DNA technology now allows the direct ascertainment of allele frequencies in ancestral populations, thereby enabling the use of allele frequency time series to detect and estimate natural selection. Such direct observations of allele frequency dynamics are expected to be more powerful than inferences made using patterns of linked neutral variation obtained from modern individuals. We develop a Bayesian method to make use of allele frequency time series data and infer the parameters of general diploid selection, along with allele age, in non-equilibrium populations. We introduce a novel path augmentation approach, in which we use Markov chain Monte Carlo to integrate over the space of allele frequency trajectories consistent with the observed data. Using simulations, we show that this approach has good power to estimate selection coefficients and allele age. Moreover, when applying our approach to data on horse coat color, we find that ignoring a relevant demographic history can significantly bias the results of inference. Our approach is made available in a C++ software package.
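The generative model underlying this kind of time-series inference can be sketched as a Wright-Fisher forward simulation under general diploid selection. This is an illustrative forward model only, not the authors' path-augmentation sampler; the genotype fitnesses $1$, $1 + sh$, $1 + s$ and the effective size parameter `n_e` are the standard textbook parametrisation assumed here.

```python
import numpy as np

def wright_fisher_trajectory(x0, s, h, n_e, n_gen, rng=None):
    """Simulate an allele frequency trajectory under general diploid selection.

    x0    : initial frequency of allele A
    s, h  : selection coefficient and dominance (fitnesses 1, 1+s*h, 1+s)
    n_e   : effective population size (2*n_e gene copies sampled per generation)
    n_gen : number of generations to simulate
    """
    rng = np.random.default_rng(rng)
    x = x0
    traj = [x]
    for _ in range(n_gen):
        w_bar = x**2 * (1 + s) + 2 * x * (1 - x) * (1 + s * h) + (1 - x)**2
        # deterministic selection update, then binomial genetic drift
        x_sel = (x**2 * (1 + s) + x * (1 - x) * (1 + s * h)) / w_bar
        x = rng.binomial(2 * n_e, x_sel) / (2 * n_e)
        traj.append(x)
    return np.array(traj)
```

With $s = 0$ the selection step is the identity and the trajectory is pure drift; with strong positive selection and a large population, the frequency climbs toward fixation almost deterministically.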
In the context of a high-dimensional linear regression model, we propose the use of an empirical correlation-adaptive prior that makes use of information in the observed predictor variable matrix to adaptively address high collinearity, determining if parameters associated with correlated predictors should be shrunk together or kept apart. Under suitable conditions, we prove that this empirical Bayes posterior concentrates around the true sparse parameter at the optimal rate asymptotically. A simplified version of a shotgun stochastic search algorithm is employed to implement the variable selection procedure, and we show, via simulation experiments across different settings and a real-data application, the favorable performance of the proposed method compared to existing methods.
