
Halo detection via large-scale Bayesian inference

Posted by: Alex Merson
Publication date: 2015
Research field: Physics
Paper language: English





We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect halos of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HADES, and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter halos. To demonstrate the capability of our approach we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of halos in the mock catalogue increases as a function of the signal-to-noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
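
As a rough illustration of the Blackwell-Rao step described above, the following Python sketch averages a simulation-calibrated conditional detection probability over an ensemble of inferred density field realisations. The calibration curve p_cal, the function names, and the array shapes are hypothetical stand-ins; HADES itself and the actual N-body calibration are not reproduced here.

    import numpy as np

    def detection_probability_map(density_samples, p_halo_given_delta):
        # Blackwell-Rao estimator: average the simulation-calibrated conditional
        # probability over posterior density realisations, which marginalises
        # over the noise and systematics captured by the sampler.
        return np.mean([p_halo_given_delta(d) for d in density_samples], axis=0)

    def p_cal(delta):
        # Hypothetical calibration curve standing in for the N-body measurement:
        # the probability of hosting a halo above the mass threshold rises
        # steeply with the local density amplitude.
        return 1.0 / (1.0 + np.exp(-3.0 * (np.log1p(np.clip(delta, -0.9, None)) - 1.0)))

    # 100 mock posterior realisations of a 32^3 density contrast field.
    samples = np.random.lognormal(0.0, 0.5, size=(100, 32, 32, 32)) - 1.0
    prob_map = detection_probability_map(samples, p_cal)  # shape (32, 32, 32)
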




Read also

Strong lensing is a sensitive probe of the small-scale density fluctuations in the Universe. We implement a novel approach to modeling strongly lensed systems using probabilistic cataloging, which is a transdimensional, hierarchical, and Bayesian fra mework to sample from a metamodel (union of models with different dimensionality) consistent with observed photon count maps. Probabilistic cataloging allows us to robustly characterize modeling covariances within and across lens models with different numbers of subhalos. Unlike traditional cataloging of subhalos, it does not require model subhalos to improve the goodness of fit above the detection threshold. Instead, it allows the exploitation of all information contained in the photon count maps, for instance, when constraining the subhalo mass function. We further show that, by not including these small subhalos in the lens model, fixed-dimensional inference methods can significantly mismodel the data. Using a simulated Hubble Space Telescope (HST) dataset, we show that the subhalo mass function can be probed even when many subhalos in the sample catalogs are individually below the detection threshold and would be absent in a traditional catalog. With the planned Wide Field Infrared Space Telescope (WFIRST), simultaneous probabilistic cataloging of dark subhalos in high-resolution, deep strong lens images has the potential to constrain the subhalo mass function at even lower masses.
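
The transdimensional idea behind probabilistic cataloging can be illustrated with a toy birth/death MCMC over catalogs with a variable number of sources in a one-dimensional Poisson count map. This is a minimal sketch under simplified assumptions (known flat background, birth proposals drawn from the prior, Poisson prior on the source count), not the authors' lensing pipeline.

    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 200)
    truth = (30 * np.exp(-0.5 * ((x - 0.3) / 0.02) ** 2)
             + 20 * np.exp(-0.5 * ((x - 0.7) / 0.02) ** 2) + 5.0)
    data = rng.poisson(truth)

    LAM = 2.0  # Poisson prior mean on the number of sources

    def log_like(sources):
        mu = np.full_like(x, 5.0)  # known flat background
        for pos, amp in sources:
            mu += amp * np.exp(-0.5 * ((x - pos) / 0.02) ** 2)
        return float(np.sum(data * np.log(mu) - mu))

    sources, n_chain = [], []
    for _ in range(20000):
        k = len(sources)
        if rng.random() < 0.5:                # birth: draw a source from its prior
            cand = sources + [(rng.random(), rng.uniform(1.0, 50.0))]
            log_alpha = log_like(cand) - log_like(sources) + np.log(LAM / (k + 1))
        elif k > 0:                           # death: remove a random source
            i = rng.integers(k)
            cand = sources[:i] + sources[i + 1:]
            log_alpha = log_like(cand) - log_like(sources) + np.log(k / LAM)
        else:
            n_chain.append(k); continue       # death from an empty catalog: reject
        if np.log(rng.random()) < log_alpha:
            sources = cand
        n_chain.append(len(sources))

    # Posterior over the catalog dimensionality: sub-threshold sources
    # contribute probabilistically rather than through a hard detection cut.
    print(np.bincount(n_chain))
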
Analyses of next-generation galaxy data require accurate treatment of systematic effects such as the bias between observed galaxies and the underlying matter density field. However, proposed models of the phenomenon are either numerically expensive or too inaccurate to achieve unbiased inferences of cosmological parameters even at mildly nonlinear scales of the data. As an alternative to constructing accurate galaxy bias models, which would require understanding galaxy formation, we propose to construct likelihood distributions for Bayesian forward modeling approaches that are insensitive to linear, scale-dependent bias and provide robustness against model misspecification. We use maximum entropy arguments to construct likelihood distributions designed to account only for correlations between data and inferred quantities. By design these correlations are insensitive to linear galaxy biasing relations, providing the desired robustness. The method is implemented and tested within a Markov Chain Monte Carlo approach. The method is assessed using a halo mock catalog based on standard full cosmological N-body simulations. We obtain unbiased and tight constraints on cosmological parameters exploiting only linear cross-correlation rates for $k \le 0.10$ Mpc/h. Tests for halos of masses ~10$^{12}$ M$_\odot$ to ~10$^{13}$ M$_\odot$ indicate that it is possible to ignore all details of the linear, scale-dependent bias function while obtaining robust constraints on cosmology. Our results provide a promising path forward to analyses of galaxy surveys without the requirement of accurately modeling the details of galaxy biasing, instead designing robust likelihoods for the inference.
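
A minimal sketch of the kind of bias-insensitive likelihood described here: if the log-likelihood depends on the data only through the normalized cross-correlation with the model field in each Fourier shell, any positive linear, scale-dependent bias multiplying the data cancels out. The Gaussian penalty form and the names below are illustrative assumptions, not the paper's maximum-entropy derivation.

    import numpy as np

    def shell_correlations(a, b, nshells=8):
        # Normalized cross-correlation r(k) of two fields, per Fourier shell.
        fa, fb = np.fft.fftn(a), np.fft.fftn(b)
        freqs = [np.fft.fftfreq(n) * n for n in a.shape]
        kmag = np.sqrt(sum(g ** 2 for g in np.meshgrid(*freqs, indexing="ij")))
        edges = np.linspace(0.0, kmag.max() + 1e-6, nshells + 1)
        r = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            m = (kmag >= lo) & (kmag < hi)
            cross = np.sum(fa[m] * np.conj(fb[m])).real
            norm = np.sqrt(np.sum(np.abs(fa[m]) ** 2) * np.sum(np.abs(fb[m]) ** 2))
            r.append(cross / norm if norm > 0 else 0.0)
        return np.array(r)

    def log_likelihood(delta_g, delta_model, sigma_r=0.05):
        # Depends on the galaxy field only through r(k), which is invariant
        # under delta_g -> b(k) * delta_g for any b(k) > 0 constant within a
        # shell, hence insensitive to linear, scale-dependent bias.
        r = shell_correlations(delta_g, delta_model)
        return -0.5 * np.sum((1.0 - r) ** 2) / sigma_r ** 2

    # A purely linearly biased field correlates perfectly shell by shell.
    rng = np.random.default_rng(1)
    delta_m = rng.normal(size=(32, 32, 32))
    assert np.allclose(shell_correlations(2.5 * delta_m, delta_m), 1.0)
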
We present a large-scale Bayesian inference framework to constrain cosmological parameters using galaxy redshift surveys, via an application of the Alcock-Paczynski (AP) test. Our physical model of the non-linearly evolved density field, as probed by galaxy surveys, employs Lagrangian perturbation theory (LPT) to connect Gaussian initial conditions to the final density field, followed by a coordinate transformation to obtain the redshift space representation for comparison with data. We generate realizations of primordial and present-day matter fluctuations given a set of observations. This hierarchical approach encodes a novel AP test, extracting several orders of magnitude more information from the cosmological expansion compared to classical approaches, to infer cosmological parameters and jointly reconstruct the underlying 3D dark matter density field. The novelty of this AP test lies in constraining the comoving-redshift transformation to infer the appropriate cosmology which yields isotropic correlations of the galaxy density field, with the underlying assumption relying purely on the cosmological principle. Such an AP test does not rely explicitly on modelling the full statistics of the field. We verify in depth via simulations that this renders our test robust to model misspecification. This leads to another crucial advantage, namely that the cosmological parameters exhibit extremely weak dependence on the currently unresolved phenomenon of galaxy bias, thereby circumventing a potentially key limitation. This is consequently among the first methods to extract a large fraction of information from statistics other than that of direct density contrast correlations, without being sensitive to the amplitude of density fluctuations. We perform several statistical efficiency and consistency tests on a mock galaxy catalogue, using the SDSS-III survey as template.
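
The coordinate transformation at the heart of this AP test can be sketched as follows for flat LCDM: a wrong trial Omega_m distorts radial relative to transverse separations, and the inference seeks the value that isotropizes the galaxy correlations. The function names are illustrative; only the standard distance-redshift relation is assumed.

    import numpy as np
    from scipy.integrate import quad

    C_KMS = 299792.458  # speed of light [km/s]

    def comoving_distance(z, omega_m, h=0.7):
        # Comoving distance in Mpc for a flat LCDM trial cosmology.
        H0 = 100.0 * h
        integrand = lambda zp: 1.0 / np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))
        return C_KMS / H0 * quad(integrand, 0.0, z)[0]

    def redshift_to_cartesian(z, ra_deg, dec_deg, omega_m):
        # Map observed (z, RA, Dec) to comoving Cartesian coordinates under a
        # trial cosmology; the AP test asks which omega_m yields isotropic
        # correlations of the resulting galaxy density field.
        r = comoving_distance(z, omega_m)
        ra, dec = np.radians(ra_deg), np.radians(dec_deg)
        return np.array([r * np.cos(dec) * np.cos(ra),
                         r * np.cos(dec) * np.sin(ra),
                         r * np.sin(dec)])
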
In the past few years, approximate Bayesian Neural Networks (BNNs) have demonstrated the ability to produce statistically consistent posteriors on a wide range of inference problems at unprecedented speed and scale. However, any disconnect between training sets and the distribution of real-world objects can introduce bias when BNNs are applied to data. This is a common challenge in astrophysics and cosmology, where the unknown distribution of objects in our Universe is often the science goal. In this work, we incorporate BNNs with flexible posterior parameterizations into a hierarchical inference framework that allows for the reconstruction of population hyperparameters and removes the bias introduced by the training distribution. We focus on the challenge of producing posterior PDFs for strong gravitational lens mass model parameters given Hubble Space Telescope (HST) quality single-filter, lens-subtracted, synthetic imaging data. We show that the posterior PDFs are sufficiently accurate (i.e., statistically consistent with the truth) across a wide variety of power-law elliptical lens mass distributions. We then apply our approach to test data sets whose lens parameters are drawn from distributions that are drastically different from the training set. We show that our hierarchical inference framework mitigates the bias introduced by an unrepresentative training set's interim prior. Simultaneously, given a sufficiently broad training set, we can precisely reconstruct the population hyperparameters governing our test distributions. Our full pipeline, from training to hierarchical inference on thousands of lenses, can be run in a day. The framework presented here will allow us to efficiently exploit the full constraining power of future ground- and space-based surveys.
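
The removal of the training-distribution (interim) prior can be sketched as a standard hierarchical importance reweighting of the per-lens BNN posterior samples. The names and the toy Gaussian population below are assumptions for illustration, not the authors' implementation.

    import numpy as np
    from scipy.stats import norm

    def population_log_likelihood(samples_per_lens, log_p_pop, log_p_interim):
        # samples_per_lens: list of (n_samples, n_params) arrays of posterior
        # samples, one array per lens, drawn under the interim (training) prior.
        def logL(hyper):
            total = 0.0
            for theta in samples_per_lens:
                # Importance weights p(theta | hyper) / p_interim(theta) divide
                # out the training prior and impose the population model.
                logw = log_p_pop(theta, hyper) - log_p_interim(theta)
                total += np.logaddexp.reduce(logw) - np.log(len(theta))
            return total
        return logL

    # Toy population: one lens parameter, normally distributed across lenses
    # with unknown mean and scatter; interim prior taken as a standard normal.
    logL = population_log_likelihood(
        samples_per_lens=[np.random.normal(size=(500, 1)) for _ in range(10)],
        log_p_pop=lambda th, hy: norm.logpdf(th[:, 0], loc=hy[0], scale=hy[1]),
        log_p_interim=lambda th: norm.logpdf(th[:, 0]),
    )
    print(logL((0.0, 1.0)))
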
The halo of the Milky Way provides a laboratory to study the properties of the shocked hot gas that is predicted by models of galaxy formation. There is observational evidence of energy injection into the halo from past activity in the nucleus of the Milky Way; however, the origin of this energy (star formation or supermassive-black-hole activity) is uncertain, and the causal connection between nuclear structures and large-scale features has not been established unequivocally. Here we report soft-X-ray-emitting bubbles that extend approximately 14 kiloparsecs above and below the Galactic centre and include a structure in the southern sky analogous to the North Polar Spur. The sharp boundaries of these bubbles trace collisionless and non-radiative shocks, and corroborate the idea that the bubbles are not a remnant of a local supernova but part of a vast Galaxy-scale structure closely related to features seen in gamma-rays. Large energy injections from the Galactic centre are the most likely cause of both the gamma-ray and X-ray bubbles. The latter have an estimated energy of around 10$^{56}$ erg, which is sufficient to perturb the structure, energy content and chemical enrichment of the circumgalactic medium of the Milky Way.