The counting of pairs of galaxies or stars according to their separation is at the core of all real-space correlation analyses performed in astrophysics and cosmology. The next-generation ground-based (LSST) and space-based (Euclid) surveys will measure the properties of billions of galaxies, and tomographic shells will contain hundreds of millions of objects. The combinatorics of the pair count challenges our ability to perform such counting on a minute timescale, which is the order of magnitude useful for optimizing analyses through the intensive use of simulations. The problem is not CPU intensive and is limited only by efficient access to the data; hence it belongs to the big data category. We use the popular Apache Spark framework to address it and design an efficient high-throughput algorithm that handles hundreds of millions to billions of input data points. To optimize it, we revisit the question of nonhierarchical sphere pixelization based on cube symmetries and develop a new one, the Similar Radius Sphere Pixelization (SARSPix), with square-like pixels. It provides the sphere packing best suited to all distance-related computations. Using LSST-like fast simulations, we compute autocorrelation functions on tomographic bins containing between a hundred million and one billion data points. In all cases we achieve the full construction of a classical pair-distance histogram in about 2 minutes, using a moderate number of worker nodes (16 to 64). This is typically two orders of magnitude faster than what is achieved today and shows the potential of these new techniques in the field of astronomy on ever-growing datasets. The method presented here is flexible enough to be adapted to any medium-sized cluster, and the software is publicly available from https://github.com/LSSTDESC/SparkCorr.
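The pair-distance histogram at the heart of this analysis can be illustrated with a minimal brute-force sketch. This is not the SparkCorr implementation, which distributes the count across SARSPix pixels and only pairs objects in neighboring pixels to avoid the O(N²) cost; the function name and the NumPy approach below are illustrative assumptions.

```python
import numpy as np

def pair_distance_histogram(points, bin_edges):
    """Brute-force pair-distance histogram for points in 3D.

    points: (N, 3) array of Cartesian coordinates.
    bin_edges: increasing array of distance bin edges.
    Returns the count of unordered pairs falling in each bin.
    """
    n = len(points)
    # Enumerate all unordered pairs (i < j). This is the O(N^2) step
    # that a pixelized, distributed approach restructures: pairs are
    # only formed between objects in the same or adjacent pixels.
    i, j = np.triu_indices(n, k=1)
    distances = np.linalg.norm(points[i] - points[j], axis=1)
    counts, _ = np.histogram(distances, bins=bin_edges)
    return counts

# Example: four collinear points at unit spacing.
pts = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0], [3.0, 0, 0]])
hist = pair_distance_histogram(pts, np.array([0.5, 1.5, 2.5, 3.5]))
# Three pairs at distance 1, two at distance 2, one at distance 3.
```

The real-space two-point correlation function is then estimated by comparing such histograms for the data catalog and a random catalog of the same geometry.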
In the next decade, new ground-based Cosmic Microwave Background (CMB) experiments such as Simons Observatory (SO), CCAT-prime, and CMB-S4 will increase the number of detectors observing the CMB by an order of magnitude or more, dramatically improving our understanding of cosmology and astrophysics. These projects will deploy receivers with as many as hundreds of thousands of transition-edge sensor (TES) bolometers coupled to Superconducting Quantum Interference Device (SQUID)-based readout systems. It is well known that superconducting devices such as TESes and SQUIDs are sensitive to magnetic fields. However, the effects of magnetic fields on TESes are not easily predicted due to the complex behavior of the superconducting transition, which motivates direct measurements of the magnetic sensitivity of these devices. We present comparative four-lead measurements of the critical temperature versus applied magnetic field of AlMn TESes varying in geometry, doping, and leg length, including Advanced ACT (AdvACT) and POLARBEAR-2/Simons Array bolometers. Molybdenum-copper bilayer ACTPol TESes are also tested and are found to be more sensitive to magnetic fields than the AlMn devices. We present an observation of weak-link-like behavior in AlMn TESes at low critical currents. We also compare measurements of magnetic sensitivity for time-division multiplexing SQUIDs and frequency-division multiplexing microwave rf-SQUIDs. We discuss the implications of our measurements for the magnetic shielding required by future experiments that aim to map the CMB to near-fundamental limits.
Various observational techniques have been used to survey galaxies and AGN, from X-rays to radio frequencies, both photometric and spectroscopic. I will review these techniques as applied to the study of galaxy evolution and of the roles of AGN and star formation as the two main energy-production mechanisms. I will then present, as a new observational approach, the far-IR spectroscopic surveys that could be carried out with astronomical facilities planned for the near future, such as SPICA from space and CCAT from the ground.
Wide-angle surveys have been an engine for new discoveries throughout the modern history of astronomy, and have been among the most highly cited and scientifically productive observing facilities in recent years. This trend is likely to continue over the next decade, as many of the most important questions in astrophysics are best tackled with massive surveys, often in synergy with each other and in tandem with the more traditional observatories. We argue that these surveys are most productive and have the greatest impact when their data are made public in a timely manner. The rise of the survey astronomer is a substantial change in the demographics of our field; one of the most important challenges of the next decade is to find ways to recognize the intellectual contributions of those who work on the infrastructure of surveys (hardware, software, survey planning and operations, and databases/data distribution), and to create career paths that allow them to thrive.
For nearly a century, imaging and spectroscopic surveys of galaxies have given us information about the contents of the universe. We attempt to define the logical endpoint of such surveys by describing not the next galaxy survey, but the final galaxy survey at NIR wavelengths: the galaxy survey that exhausts the information content useful for addressing extant questions. Such a survey would require incredible advances in a number of technologies, and its details will depend on the as-yet poorly constrained properties of the earliest galaxies. Using an exposure-time calculator, we define nominal surveys for extracting the useful information for three science cases: dark energy cosmology, galaxy evolution, and supernovae. We derive scaling relations that trade off sky background, telescope aperture, and focal-plane size to allow for a survey of a given depth over a given area. Under optimistic assumptions, a 280 m telescope with a marginally resolved focal plane of 20 deg$^2$ operating at L2 could potentially exhaust the cosmological information content of galaxies in a 10-year survey. For galaxy evolution (making use of gravitational lensing to magnify the earliest galaxies) and supernovae, the same telescope would suffice. We discuss the technological advances needed to complete the final galaxy survey. While it remains well outside our technical reach today, we present scaling relations that show how we can progress toward the goal of exhausting the information content encoded in the shapes, positions, and colors of galaxies.
The nature of dark matter, dark energy, and large-scale gravity poses some of the most pressing questions in cosmology today. These fundamental questions require highly precise measurements, and a number of wide-field spectroscopic survey instruments are being designed to meet this requirement. A key component in these experiments is a simulation tool to forecast science performance, define requirement flow-downs, optimize implementation, demonstrate feasibility, and prepare for exploitation. We present SPOKES (SPectrOscopic KEn Simulation), an end-to-end simulation facility for spectroscopic cosmological surveys designed to address this challenge. SPOKES is built on an integrated infrastructure, modular function organization, coherent data handling, and fast data access. These key features make pipeline runs reproducible, ease use, and provide the flexibility to update functions within the pipeline. The cyclic nature of the pipeline makes the science output an efficient metric for design optimization and feasibility testing. We present the architecture, first science results, and computational performance of the simulation pipeline. The framework is general, but for the benchmark tests we use the Dark Energy Spectrometer (DESpec), one of the early concepts for the upcoming Dark Energy Spectroscopic Instrument (DESI) project. We discuss how the SPOKES framework enables a rigorous process to optimize and exploit spectroscopic survey experiments in order to derive high-precision cosmological measurements.