
Methods for Rapidly Processing Angular Masks of Next-Generation Galaxy Surveys

Published by: Molly Swanson
Publication date: 2008
Research field: Physics
Paper language: English





As galaxy surveys become larger and more complex, keeping track of the completeness, magnitude limit, and other survey parameters as a function of direction on the sky becomes an increasingly challenging computational task. For example, typical angular masks of the Sloan Digital Sky Survey contain about N=300,000 distinct spherical polygons. Managing masks with such large numbers of polygons becomes intractably slow, particularly for tasks that run in time O(N^2) with a naive algorithm, such as finding which polygons overlap each other. Here we present a divide-and-conquer solution to this challenge: we first split the angular mask into predefined regions called pixels, such that each polygon is in only one pixel, and then perform further computations, such as checking for overlap, on the polygons within each pixel separately. This reduces O(N^2) tasks to O(N), and also reduces the important task of determining in which polygon(s) a point on the sky lies from O(N) to O(1), resulting in significant computational speedup. Additionally, we present a method to efficiently convert any angular mask to and from the popular HEALPix format. This method can be generically applied to convert to and from any desired spherical pixelization. We have implemented these techniques in a new version of the mangle software package, which is freely available at http://space.mit.edu/home/tegmark/mangle/, along with complete documentation and example applications. These new methods should prove quite useful to the astronomical community, and since mangle is a generic tool for managing angular masks on a sphere, it has the potential to benefit terrestrial mapmaking applications as well.
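To make the divide-and-conquer idea concrete, here is a minimal sketch in Python, not mangle's actual implementation: it buckets polygons into a flat (RA, Dec) grid rather than one of mangle's spherical pixelization schemes, treats small polygons as planar, and assumes each polygon fits inside a single pixel (mangle instead splits polygons along pixel boundaries so that each piece lies in exactly one pixel). The grid resolution and all names are illustrative.

```python
# A toy version of pixelized polygon lookup: assign each polygon to one
# grid cell, then test a point only against the polygons in its cell.
from collections import defaultdict

NSIDE = 64  # grid cells per axis (illustrative resolution)

def pixel_of(ra, dec):
    """Map a point (degrees) to a flat (RA, Dec) grid cell."""
    i = min(int(ra / 360.0 * NSIDE), NSIDE - 1)
    j = min(int((dec + 90.0) / 180.0 * NSIDE), NSIDE - 1)
    return i, j

def point_in_polygon(ra, dec, vertices):
    """Ray-casting test for a small, effectively planar polygon."""
    inside = False
    n = len(vertices)
    for k in range(n):
        (x1, y1), (x2, y2) = vertices[k], vertices[(k + 1) % n]
        if (y1 > dec) != (y2 > dec):
            if ra < x1 + (dec - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def build_index(polygons):
    """Bucket each polygon into the pixel holding its first vertex.
    This shortcut only works if polygons are smaller than a pixel."""
    index = defaultdict(list)
    for poly_id, verts in enumerate(polygons):
        index[pixel_of(*verts[0])].append((poly_id, verts))
    return index

def polygons_containing(ra, dec, index):
    """O(1) average lookup: test only the polygons in the point's pixel."""
    return [pid for pid, verts in index[pixel_of(ra, dec)]
            if point_in_polygon(ra, dec, verts)]

# Usage: two small overlapping square "mask polygons" near the equator.
polys = [[(10.0, 0.0), (10.2, 0.0), (10.2, 0.2), (10.0, 0.2)],
         [(10.1, 0.1), (10.3, 0.1), (10.3, 0.3), (10.1, 0.3)]]
idx = build_index(polys)
print(polygons_containing(10.15, 0.15, idx))  # -> [0, 1]
```

The same bucketing also makes overlap detection cheap: instead of testing all N^2 polygon pairs, one only tests pairs within each pixel.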


Read also

CMB surveys provide, for free, blindly selected samples of extragalactic radio sources at much higher frequencies than traditional radio surveys. Next-generation, ground-based CMB experiments with arcmin resolution at mm wavelengths will provide samples of thousands of radio sources, allowing the investigation of the evolutionary properties of blazar populations, the study of the earliest and latest stages of radio activity, and the discovery of rare phenomena and of new transient sources and events. Space-borne experiments will extend to sub-mm wavelengths the determination of the SEDs of many hundreds of blazars, in temperature and in polarization, allowing us to investigate the flow and the structure of relativistic jets close to their base, as well as the electron acceleration mechanisms. A real breakthrough will be achieved in the characterization of polarization properties: the first direct counts in polarization will be obtained, enabling a solid assessment of the extragalactic source contamination of CMB maps and allowing us to understand the structure and intensity of magnetic fields, particle densities, and the structure of emitting regions close to the base of the jet.
We address key points for an efficient implementation of likelihood codes for modern weak lensing large-scale structure surveys. Specifically, we focus on the joint weak lensing convergence power spectrum-bispectrum probe, and we tackle the numerical challenges required by a realistic analysis. Under the assumption of (multivariate) Gaussian likelihoods, we have developed a high-performance code that allows highly parallelised prediction of the binned tomographic observables and of their joint non-Gaussian covariance matrix, accounting for terms up to the 6-point correlation function and super-sample effects. This performance allows us to qualitatively address several interesting scientific questions. We find that the bispectrum provides an improvement in terms of signal-to-noise ratio (S/N) of about 10% on top of the power spectrum, making it a non-negligible source of information for future surveys. Furthermore, we are able to test the impact of theoretical uncertainties in the halo model used to build our observables; with presently allowed variations we conclude that the impact is negligible on the S/N. Finally, we consider data compression possibilities to optimise future analyses of the weak lensing bispectrum. We find that, ignoring systematics, 5 equipopulated redshift bins are enough to recover the information content of a Euclid-like survey, with negligible improvement when increasing to 10 bins. We also explore principal component analysis and dependence on the triangle shapes as ways to reduce the numerical complexity of the problem.
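The S/N comparison above rests on the multivariate-Gaussian likelihood stated in the abstract. The following toy sketch (not the authors' code; all arrays are random placeholders) shows the standard cumulative signal-to-noise, (S/N)^2 = d^T C^{-1} d, for a binned joint data vector d and its covariance C, which is how a bispectrum's contribution on top of the power spectrum is typically quantified.

```python
# Toy cumulative S/N under a Gaussian likelihood, with placeholder data.
import numpy as np

def signal_to_noise(data_vector, covariance):
    """Cumulative S/N of a binned data vector given its covariance."""
    inv_cov = np.linalg.inv(covariance)
    return np.sqrt(data_vector @ inv_cov @ data_vector)

rng = np.random.default_rng(0)
n_ps, n_bs = 20, 50                  # power-spectrum and bispectrum bins
d = rng.lognormal(size=n_ps + n_bs)  # toy joint data vector
A = rng.normal(size=(n_ps + n_bs, n_ps + n_bs))
C = A @ A.T + np.eye(n_ps + n_bs)    # toy positive-definite covariance

sn_joint = signal_to_noise(d, C)
sn_ps = signal_to_noise(d[:n_ps], C[:n_ps, :n_ps])
print(f"S/N, power spectrum alone: {sn_ps:.1f}; joint: {sn_joint:.1f}")
```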
Fabien Lacasa, 2019
As galaxy surveys become more precise and push to smaller scales, the need for accurate covariances beyond the classical Gaussian formula becomes more acute. Here, I investigate the analytical implementation and impact of non-Gaussian covariance terms that I previously derived for galaxy clustering. Braiding covariance is one such class of terms, and it gets contributions both from in-survey and super-survey modes. I present an approximation for braiding covariance which speeds up the numerical computation. I show that including braiding covariance is a necessary condition for including other non-Gaussian terms: the in-survey 2-, 3- and 4-halo covariance, which yield covariance matrices with negative eigenvalues if considered on their own. I then quantify the impact on parameter constraints, with forecasts for a Euclid-like survey. Compared to the Gaussian case, braiding and in-survey covariances significantly increase the error bars on cosmological parameters, in particular by 50% for w. The Halo Occupation Distribution (HOD) error bars are also affected, by between 12% and 39%. Accounting for super-sample covariance (SSC) also increases parameter errors, by 90% for w and between 7% and 64% for HOD. In total, non-Gaussianity increases the error bar on w by 120% (and between 15% and 80% for the other cosmological parameters), and the error bars on HOD parameters by between 17% and 85%. Accounting for the 1-halo trispectrum term on top of SSC is not sufficient for capturing the full non-Gaussian impact: braiding and the rest of the in-survey covariance have to be accounted for. Finally, I discuss why the inclusion of non-Gaussianity generally eases parameter degeneracies, making cosmological constraints more robust to astrophysical uncertainties. The data and a Python notebook reproducing the results and plots of the article are available at https://github.com/fabienlacasa/BraidingArticle. [Abridged]
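As a side illustration of the negative-eigenvalue problem mentioned above: a matrix assembled from an incomplete set of covariance terms can fail to be positive definite, which makes it unusable as a likelihood covariance. The sketch below (not from the article; names and numbers are illustrative) shows the standard eigenvalue check one would run before inverting such a matrix.

```python
# Sanity check: a valid covariance is symmetric positive (semi)definite.
import numpy as np

def is_valid_covariance(cov, tol=0.0):
    """True if cov is symmetric with all eigenvalues above tol."""
    if not np.allclose(cov, cov.T):
        return False
    return np.linalg.eigvalsh(cov).min() > tol

# Toy "partial" covariance with eigenvalues 3 and -1: not usable.
partial = np.array([[1.0, 2.0],
                    [2.0, 1.0]])
print(is_valid_covariance(partial))  # False: not a valid covariance
```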
The counting of pairs of galaxies or stars according to their distance is at the core of all real-space correlation analyses performed in astrophysics and cosmology. The next-stage upcoming ground-based (LSST) and space-based (Euclid) surveys will measure properties of billions of galaxies, and tomographic shells will contain hundreds of millions of objects. The combinatorics of the pair count challenges our ability to perform such counting on the minute timescale that is useful for optimizing analyses through the intensive use of simulations. The problem is not CPU intensive and is only limited by efficient access to the data, hence it belongs to the big data category. We use the popular Apache Spark framework to address it and design an efficient high-throughput algorithm to deal with hundreds of millions to billions of input data points. To optimize it, we revisit the question of nonhierarchical sphere pixelization based on cube symmetries and develop a new one that we call the Similar Radius Sphere Pixelization (SARSPix), with square-like pixels. It provides the most adapted sphere packing for all distance-related computations. Using LSST-like fast simulations, we compute autocorrelation functions on tomographic bins containing between a hundred million and one billion data points. In all cases we achieve the full construction of a classical pair-distance histogram in about 2 minutes, using a moderate number of worker nodes (16 to 64). This is typically two orders of magnitude faster than what is achieved today and shows the potential of using these new techniques in the field of astronomy on ever-growing datasets. The method presented here is flexible enough to be adapted to any medium-size cluster, and the software is publicly available from https://github.com/LSSTDESC/SparkCorr.
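For illustration, here is a minimal brute-force sketch of the core operation, the pair-distance histogram on the sphere, for a small sample. The paper's contribution is distributing exactly this computation over Spark workers by grouping points into SARSPix pixels so that only nearby pixels are paired; neither Spark nor SARSPix is reproduced here, and all names and parameters are illustrative.

```python
# Toy pair-distance histogram: all pairs of a small sample, binned by
# angular separation. Real surveys need the pixelized, distributed version.
import numpy as np

def angular_separations(ra, dec):
    """Pairwise angular separations (radians), each pair counted once."""
    ra, dec = np.radians(ra), np.radians(dec)
    xyz = np.column_stack([np.cos(dec) * np.cos(ra),
                           np.cos(dec) * np.sin(ra),
                           np.sin(dec)])
    cos_sep = np.clip(xyz @ xyz.T, -1.0, 1.0)
    iu = np.triu_indices(len(ra), k=1)  # upper triangle: unique pairs
    return np.arccos(cos_sep[iu])

rng = np.random.default_rng(1)
n = 2000
ra = rng.uniform(0.0, 360.0, n)
dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))  # uniform on sphere

seps = angular_separations(ra, dec)
hist, edges = np.histogram(np.degrees(seps), bins=90, range=(0.0, 180.0))
print(hist[:5], "pairs in the first five 2-degree bins")
```

The brute-force version scales as O(N^2) in both time and memory; bucketing points into sphere pixels first, as SARSPix does, is what makes billion-object samples tractable.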
The coming decade will be an exciting period for dark energy research, during which astronomers will address the question of what drives the accelerated cosmic expansion, as first revealed by type Ia supernova (SN) distances and confirmed by later observations. The mystery of dark energy poses a challenge of such magnitude that, as stated by the Dark Energy Task Force (DETF), nothing short of a revolution in our understanding of fundamental physics will be required to achieve a full understanding of the cosmic acceleration. The lack of multiple complementary precision observations is a major obstacle in developing lines of attack for dark energy theory. This lack is precisely what next-generation surveys will address via the powerful techniques of weak lensing (WL) and baryon acoustic oscillations (BAO) -- galaxy correlations more generally -- in addition to SNe, cluster counts, and other probes of geometry and growth of structure. Because of their unprecedented statistical power, these surveys demand an accurate understanding of the observables and tight control of systematics. This white paper highlights the opportunities, approaches, prospects, and challenges relevant to dark energy studies with wide-deep multiwavelength photometric redshift surveys. Quantitative predictions are presented for a 20000 sq. deg. ground-based 6-band (ugrizy) survey with 5-sigma depth of r~27.5, i.e., a Stage 4 survey as defined by the DETF.