
Efficient cosmological parameter sampling using sparse grids

Posted by: Mona Frommert
Publication date: 2010
Research field: Physics
Paper language: English





We present a novel method to significantly speed up cosmological parameter sampling. The method relies on constructing an interpolation of the CMB log-likelihood based on sparse grids, which is used as a shortcut for the likelihood evaluation. We obtain excellent results over a large region in parameter space, extending over about 25 log-likelihood units around the peak, and we reproduce the one-dimensional projections of the likelihood almost perfectly. In speed and accuracy, our technique is competitive with existing approaches that accelerate parameter estimation based on polynomial interpolation or neural networks, while having some advantages over them. In our method, there is no danger of creating unphysical wiggles, as can be the case for high-degree polynomial fits. Furthermore, unlike neural networks, our method does not require a long training time: the construction of the interpolation is determined by the time it takes to evaluate the likelihood at the sampling points, which can be parallelised to an arbitrary degree. Our approach is completely general and can adaptively exploit the properties of the underlying function. We can thus apply it to any problem where an accurate interpolation of a function is needed.
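
To make the construction concrete, here is a minimal sketch of sparse-grid interpolation in two dimensions using the combination technique (one standard way to build a sparse-grid interpolant; the toy log-likelihood, the level, and the unit-square parameter ranges are our illustrative assumptions, not the paper's CMB setup):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def log_like(x, y):
    # Toy stand-in for an expensive log-likelihood: a Gaussian peak.
    return -0.5 * ((x - 0.5)**2 / 0.02 + (y - 0.5)**2 / 0.05)

def full_grid_interpolant(li, lj):
    # Bilinear interpolant on an anisotropic (2^li + 1) x (2^lj + 1) grid.
    x = np.linspace(0.0, 1.0, 2**li + 1)
    y = np.linspace(0.0, 1.0, 2**lj + 1)
    X, Y = np.meshgrid(x, y, indexing="ij")
    return RegularGridInterpolator((x, y), log_like(X, Y))

def sparse_grid_eval(points, n=6):
    # Combination technique: f_n = sum_{i+j=n} u_{i,j} - sum_{i+j=n-1} u_{i,j}.
    # Only coarse anisotropic grids are ever evaluated, so the number of
    # likelihood calls grows slowly with the level n.
    result = np.zeros(len(points))
    for i in range(n + 1):
        result += full_grid_interpolant(i, n - i)(points)
    for i in range(n):
        result -= full_grid_interpolant(i, n - 1 - i)(points)
    return result

pts = np.random.default_rng(0).random((1000, 2))
err = np.abs(sparse_grid_eval(pts) - log_like(pts[:, 0], pts[:, 1]))
print("max interpolation error:", err.max())
```

The combination formula adds bilinear interpolants on coarse anisotropic grids and subtracts the overlap, so the total number of likelihood evaluations grows far more slowly with accuracy than for a full grid, and the evaluations at the sampling points are independent and can run in parallel.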




Read also

We investigate observational constraints on cosmological parameters combining 15 measurements of the transversal BAO scale (obtained free of any fiducial cosmology) with Planck-CMB data to explore the parametric space of some cosmological models. We investigate how much Planck + transversal BAO data can constrain the minimal $\Lambda$CDM model and extensions, including the neutrino mass scale $M_\nu$ and the possibility of a dynamical dark energy (DE) scenario. Assuming the $\Lambda$CDM cosmology, we find $H_0 = 69.23 \pm 0.50$ km s$^{-1}$ Mpc$^{-1}$, $M_\nu < 0.11$ eV and $r_{\rm drag} = 147.59 \pm 0.26$ Mpc (the sound horizon at the drag epoch) from Planck + transversal BAO data. When assuming a dynamical DE cosmology, we find that the inclusion of the BAO data can indeed break the degeneracy of the DE free parameters, improving the constraints on the full parameter space significantly. We note that the model is compatible with local measurements of $H_0$ and there is no tension in the $H_0$ estimates in this dynamical DE context. Also, we discuss constraints and consequences from a joint analysis with the local $H_0$ measurement from SH0ES. Finally, we perform a model-independent analysis of the deceleration parameter, $q(z)$, using only information from transversal BAO data.
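
As a toy illustration of how adding a second dataset breaks a parameter degeneracy, the sketch below combines two hypothetical Gaussian likelihoods in a dark-energy parameter plane $(w_0, w_a)$, each flat along a different direction; all numbers are invented for illustration and are not the Planck or BAO constraints of the paper:

```python
import numpy as np

# Two toy log-likelihoods, each with a flat (degenerate) direction.
def logL_cmb(w0, wa):
    return -0.5 * ((w0 + 0.3 * wa + 1.0) / 0.05)**2

def logL_bao(w0, wa):
    return -0.5 * ((w0 - 0.5 * wa + 1.0) / 0.08)**2

w0, wa = np.meshgrid(np.linspace(-1.6, -0.4, 400),
                     np.linspace(-1.5, 1.5, 400), indexing="ij")
for name, logL in [("CMB", logL_cmb), ("BAO", logL_bao),
                   ("joint", lambda a, b: logL_cmb(a, b) + logL_bao(a, b))]:
    post = np.exp(logL(w0, wa))
    post /= post.sum()
    marg = post.sum(axis=1)                  # marginalise over wa
    mean = (w0[:, 0] * marg).sum()
    std = np.sqrt(((w0[:, 0] - mean)**2 * marg).sum())
    print(f"{name:>5}: sigma(w0) = {std:.3f}")  # joint is much tighter
```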
The ability to obtain reliable point estimates of model parameters is of crucial importance in many fields of physics. This is often a difficult task given that the observed data can have a very high number of dimensions. In order to address this problem, we propose a novel approach to construct parameter estimators with a quantifiable bias using an order expansion of highly compressed deep summary statistics of the observed data. These summary statistics are learned automatically using an information-maximising loss. Given an observation, we further show how one can use the constructed estimators to obtain approximate Bayesian computation (ABC) posterior estimates and their corresponding uncertainties, which can be used for parameter inference using Gaussian process regression even if the likelihood is not tractable. We validate our method with an application to the problem of cosmological parameter inference from weak lensing mass maps. We show in this case that the constructed estimators are unbiased and have an almost optimal variance, while the posterior distribution obtained with the Gaussian process regression is close to the true posterior and performs as well as or better than comparable methods.
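
A heavily simplified sketch of this kind of pipeline is given below: parameters are drawn from a prior, a compressed summary of simulated data is compared with the observed summary, and Gaussian process regression smooths the resulting discrepancy surface so an approximate posterior mode can be read off even without a tractable likelihood. The one-dimensional forward model, the sample-mean summary, and the kernel are our own stand-ins, not the authors' learned compression:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
theta_true = 0.7

def simulate(theta, n=200):
    # Toy forward model: Gaussian data with mean theta.
    return rng.normal(theta, 1.0, n)

def summary(data):
    # Compressed summary statistic (here simply the sample mean).
    return data.mean()

s_obs = summary(simulate(theta_true))

# ABC-style step: draw parameters from the prior and record the squared
# distance between simulated and observed summaries.
thetas = rng.uniform(0.0, 2.0, 100)
dists = np.array([(summary(simulate(t)) - s_obs)**2 for t in thetas])

# GP regression smooths the noisy discrepancy surface; its minimum
# approximates the posterior mode without ever evaluating a likelihood.
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
gp.fit(thetas[:, None], dists)
grid = np.linspace(0.0, 2.0, 500)
mean, std = gp.predict(grid[:, None], return_std=True)
print("approximate posterior mode:", grid[np.argmin(mean)])
```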
Determining magnetic field properties in different environments of the cosmic large-scale structure, as well as their evolution over redshift, is a fundamental step toward uncovering the origin of cosmic magnetic fields. Radio observations permit the study of extragalactic magnetic fields via measurements of the Faraday depth of extragalactic radio sources. Our aim is to investigate how much different extragalactic environments contribute to the Faraday depth variance of these sources. We develop a Bayesian algorithm to statistically distinguish Faraday depth variance contributions intrinsic to the source from those due to the medium between the source and the observer. In our algorithm, the Galactic foreground and the measurement noise are taken into account through the uncertainties and correlations of the Galactic model. Additionally, our algorithm allows for the investigation of possible redshift evolution of the extragalactic contribution. This work presents the derivation of the algorithm and tests performed on mock observations. With cosmic magnetism being one of the key science projects of the next generation of radio interferometers, we make predictions for the algorithm's performance on data from these instruments. Applications to real data are left for future work.
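
One simple, hypothetical version of the underlying statistical idea is sketched below: contributions to the Faraday depth that originate at the source are diluted by $(1+z)^2$ in the observer frame, so intrinsic and intervening variance terms have different redshift dependences and can be separated. Here this is done by maximum likelihood on mock data, as a crude stand-in for the paper's full Bayesian treatment with a Galactic foreground model:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
n = 2000
z = rng.uniform(0.0, 3.0, n)                  # source redshifts
sigma_int_true, sigma_env_true, sigma_noise = 8.0, 5.0, 3.0

# Mock Faraday depths [rad m^-2]: the intrinsic term is redshift-diluted.
phi = (rng.normal(0.0, sigma_int_true, n) / (1 + z)**2
       + rng.normal(0.0, sigma_env_true, n)
       + rng.normal(0.0, sigma_noise, n))

def neg_log_like(p):
    # Gaussian likelihood with redshift-dependent variance; log-parameters
    # keep the dispersions positive during the fit.
    s_int, s_env = np.exp(p)
    var = s_int**2 / (1 + z)**4 + s_env**2 + sigma_noise**2
    return 0.5 * np.sum(np.log(var) + phi**2 / var)

fit = minimize(neg_log_like, x0=np.log([5.0, 5.0]))
print("recovered (sigma_int, sigma_env):", np.exp(fit.x))
```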
Periodic nonuniform sampling is a known method to sample spectrally sparse signals below the Nyquist rate. This strategy relies on the implicit assumption that the individual samplers are exposed to the entire frequency range. This assumption becomes impractical for wideband sparse signals. The current paper proposes an alternative sampling stage that does not require a full-band front end. Instead, signals are captured with an analog front end that consists of a bank of multipliers and lowpass filters whose cutoff is much lower than the Nyquist rate. The problem of recovering the original signal from the low-rate samples can be studied within the framework of compressive sampling. An appropriate parameter selection ensures that the samples uniquely determine the analog input. Moreover, the analog input can be stably reconstructed with digital algorithms. Numerical experiments support the theoretical analysis.
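A single branch of such a mixing-plus-lowpass front end can be simulated as follows (a simplified sketch with invented parameters, not the paper's design): multiplying by a periodic +/-1 sequence folds high-frequency bands down to baseband, so a lowpass filter with a very low cutoff still captures energy from tones far above it.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 10_000                              # simulation ("Nyquist") rate [Hz]
t = np.arange(0, 1.0, 1 / fs)

# Sparse wideband input: two tones far above the 100 Hz lowpass cutoff.
x = np.cos(2 * np.pi * 3200 * t) + 0.5 * np.cos(2 * np.pi * 4100 * t)

# Branch mixer: a +/-1 chipping sequence, periodic every 50 samples, whose
# harmonics (multiples of 200 Hz) fold high bands down to baseband.
chips = rng.choice([-1.0, 1.0], 50)
p = np.tile(chips, len(t) // 50)
mixed = x * p

def lowpass_and_decimate(sig, cutoff=100.0, step=50):
    # Ideal lowpass in the frequency domain, then decimation to 200 Hz.
    S = np.fft.rfft(sig)
    f = np.fft.rfftfreq(len(sig), 1 / fs)
    S[f > cutoff] = 0.0
    return np.fft.irfft(S, len(sig))[::step]

with_mixer = lowpass_and_decimate(mixed)
without_mixer = lowpass_and_decimate(x)   # same filter, no mixing
print("low-rate energy with mixer:   ", np.sum(with_mixer**2))
print("low-rate energy without mixer:", np.sum(without_mixer**2))
```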
Survey observations of the three-dimensional locations of galaxies are a powerful approach to measure the distribution of matter in the universe, which can be used to learn about the nature of dark energy, physics of inflation, neutrino masses, etc. A competitive survey, however, requires a large volume (e.g., $V_{\rm survey}$ is roughly 10 Gpc$^3$) to be covered, and thus tends to be expensive. A sparse sampling method offers a more affordable solution to this problem: within a survey footprint covering a given survey volume, $V_{\rm survey}$, we observe only a fraction of the volume. The distribution of observed regions should be chosen such that their separation is smaller than the length scale corresponding to the wavenumber of interest. Then one can recover the power spectrum of galaxies with precision expected for a survey covering a volume of $V_{\rm survey}$ (rather than the volume of the sum of observed regions) with the number density of galaxies given by the total number of observed galaxies divided by $V_{\rm survey}$ (rather than the number density of galaxies within an observed region). We find that regularly-spaced sampling yields an unbiased power spectrum with no window function effect, and deviations from regularly-spaced sampling, which are unavoidable in realistic surveys, introduce calculable window function effects and increase the uncertainties of the recovered power spectrum. While we discuss the sparse sampling method within the context of the forthcoming Hobby-Eberly Telescope Dark Energy Experiment, the method is general and can be applied to other galaxy surveys.
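
The claim about regularly-spaced sampling can be checked with a small one-dimensional Monte Carlo (toy spectrum and mask of our own choosing, not the HETDEX configuration): a strictly periodic mask has a line spectrum, so the masked-field power spectrum, divided by the squared observed fraction, recovers the input spectrum without window-function smearing as long as the mask's harmonics fall where the power is negligible.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 4096
k = 2 * np.pi * np.fft.rfftfreq(N)            # grid wavenumbers
P_true = np.exp(-(k / 0.03)**2)               # toy band-limited spectrum
P_true[0] = 0.0

# Periodic mask: observe 8 samples out of every 32 (25% of the volume).
mask = (np.arange(N) % 32 < 8).astype(float)
fsky = mask.mean()

est = np.zeros_like(P_true)
nsim = 500
for _ in range(nsim):
    # Gaussian random field with the chosen power spectrum.
    amp = np.sqrt(P_true * N / 2)
    modes = amp * (rng.normal(size=len(k)) + 1j * rng.normal(size=len(k)))
    field = np.fft.irfft(modes, N)
    # Masked-field power, corrected by the squared observed fraction.
    est += np.abs(np.fft.rfft(field * mask))**2 / (N * fsky**2)
est /= nsim

# The mask's harmonics sit at multiples of 2*pi/32, where P_true is
# negligible, so low-k modes are recovered without smearing.
ratio = est[3:40] / P_true[3:40]
print(f"mean ratio = {ratio.mean():.3f}, scatter = {ratio.std():.3f}")
```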