We report on the MIT Epoch of Reionization (MITEoR) experiment, a pathfinder low-frequency radio interferometer whose goal is to test technologies that improve the calibration precision and reduce the cost of the high-sensitivity 3D mapping required for 21 cm cosmology. MITEoR accomplishes this by using massive baseline redundancy, which enables both automated precision calibration and correlator cost reduction. We demonstrate and quantify the power and robustness of redundancy for scalability and precision. We find that the calibration parameters precisely describe the effect of the instrument upon our measurements, allowing us to form a model that is consistent with $\chi^2$ per degree of freedom < 1.2 for as much as 80% of the observations. We use these results to develop an optimal estimator of calibration parameters using Wiener filtering, and explore the question of how often and how finely in frequency visibilities must be reliably measured to solve for calibration coefficients. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious Hydrogen Epoch of Reionization Array (HERA) project and other next-generation instruments, which would incorporate many identical or similar technologies.
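For orientation, here is a minimal illustrative sketch of the generic idea behind redundant-baseline calibration and the chi-squared per degree of freedom figure of merit quoted above. It is not MITEoR's actual pipeline (which also handles phases, uses complex visibilities, and applies Wiener filtering); only the log-amplitude ("logcal") half is shown, and all array sizes, noise levels, and variable names are made up for illustration.

    # Sketch of log-amplitude redundant calibration: every baseline of the same
    # separation should measure the same "true" visibility y_u, so the model
    # |v_ij| ~ |g_i| |g_j| |y_u| becomes linear after taking logarithms.
    import numpy as np

    rng = np.random.default_rng(0)
    n_ant = 8                                     # regularly spaced 1D array -> many redundant baselines
    pairs = [(i, j) for i in range(n_ant) for j in range(i + 1, n_ant)]

    g_true = 1.0 + 0.1 * rng.standard_normal(n_ant)      # per-antenna gains (real, for simplicity)
    y_true = rng.uniform(1.0, 2.0, size=n_ant - 1)       # one "true sky" visibility per separation
    sigma = 0.01                                          # per-visibility noise level

    v_meas = np.array([g_true[i] * g_true[j] * y_true[j - i - 1] for i, j in pairs])
    v_meas += sigma * rng.standard_normal(len(pairs))

    # Linear system: ln|v_ij| = ln|g_i| + ln|g_j| + ln|y_(j-i)|
    A = np.zeros((len(pairs), n_ant + n_ant - 1))
    for row, (i, j) in enumerate(pairs):
        A[row, i] = 1.0
        A[row, j] = 1.0
        A[row, n_ant + (j - i - 1)] = 1.0
    b = np.log(np.abs(v_meas))

    x, *_ = np.linalg.lstsq(A, b, rcond=None)     # minimum-norm solution (one overall-scale degeneracy)

    # Reduced chi^2 of the fit: at high SNR the noise on ln|v| is roughly sigma/|v|.
    resid = b - A @ x
    chi2 = np.sum((resid * np.abs(v_meas) / sigma) ** 2)
    dof = len(pairs) - (A.shape[1] - 1)           # measurements minus independent parameters
    print("chi^2 per degree of freedom:", chi2 / dof)   # should come out near 1 when the model fits

When the redundant model describes the data well, chi^2 per degree of freedom lands near 1, which is the sense in which the paper quotes < 1.2 for most observations.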
Mapping our universe in 3D by imaging the redshifted 21 cm line from neutral hydrogen has the potential to overtake the cosmic microwave background as our most powerful cosmological probe, because it can map a much larger volume of our Universe, shedding new light on the epoch of reionization, inflation, dark matter, dark energy, and neutrino masses. We report on MITEoR, a pathfinder low-frequency radio interferometer whose goal is to test technologies that greatly reduce the cost of such 3D mapping for a given sensitivity. MITEoR accomplishes this by using massive baseline redundancy both to enable automated precision calibration and to cut the correlator cost scaling from N^2 to N log N, where N is the number of antennas. The success of MITEoR with its 64 dual-polarization elements bodes well for the more ambitious HERA project, which would incorporate many identical or similar technologies using an order of magnitude more antennas, each with dramatically larger collecting area.
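The N^2 to N log N claim can be illustrated with a toy numerical check (not taken from the paper): for antennas on a regular 1D grid, summing the correlations of all antenna pairs at each separation is just the autocorrelation of the voltage vector, which the correlation theorem lets FFTs compute in O(N log N) instead of forming all N(N-1)/2 pairwise products.

    # Toy check: redundancy-summed visibilities of a regular 1D array equal the
    # autocorrelation of the voltage vector, computable via FFTs in O(N log N).
    import numpy as np

    rng = np.random.default_rng(1)
    N = 64
    v = rng.standard_normal(N) + 1j * rng.standard_normal(N)   # one time sample of antenna voltages

    # Brute force: sum conj(v_n) * v_{n+m} over all antenna pairs at separation m  -> O(N^2)
    brute = np.array([np.sum(np.conj(v[:N - m]) * v[m:]) for m in range(N)])

    # FFT route: zero-pad to avoid circular wrap-around, then apply the correlation theorem -> O(N log N)
    V = np.fft.fft(v, 2 * N)
    fft_based = np.fft.ifft(np.abs(V) ** 2)[:N]

    print(np.allclose(brute, fft_based))   # True: both correlators agree to machine precision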
We survey observational constraints on the parameter space of inflation and axions and map out two allowed windows: the classic window and the inflationary anthropic window. The cosmology of the latter is particularly interesting; inflationary axion cosmology predicts the existence of isocurvature fluctuations in the CMB, with an amplitude that grows with both the energy scale of inflation and the fraction of dark matter in axions. Statistical arguments favor a substantial value for the latter, and so current bounds on isocurvature fluctuations imply tight constraints on inflation. For example, an axion Peccei-Quinn scale of 10^16 GeV excludes any inflation model with energy scale > 3.8*10^14 GeV (r > 2*10^(-9)) at 95% confidence, and so implies negligible gravitational waves from inflation, but suggests appreciable isocurvature fluctuations.
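For context, the statement that the isocurvature amplitude grows with both the inflation energy scale and the axion dark-matter fraction follows from a standard back-of-the-envelope relation, sketched here for orientation (not quoted from the abstract): a light axion field a = f_a θ acquires quantum fluctuations of order H_I/2π during inflation, and since the axion relic density scales roughly as θ_i^2, the resulting cold-dark-matter isocurvature amplitude is

    \delta a \simeq \frac{H_I}{2\pi}, \qquad
    S \;\simeq\; \frac{\Omega_a}{\Omega_{\rm CDM}}\,\frac{\delta\Omega_a}{\Omega_a}
      \;\simeq\; \frac{\Omega_a}{\Omega_{\rm CDM}}\,\frac{2\,\delta\theta_i}{\theta_i}
      \;\simeq\; \frac{\Omega_a}{\Omega_{\rm CDM}}\,\frac{H_I}{\pi f_a \theta_i},

where H_I is the Hubble rate during inflation (set by the inflation energy scale), f_a is the Peccei-Quinn scale, and θ_i is the initial misalignment angle. Raising either H_I or the axion fraction Ω_a/Ω_CDM raises S, which is what the CMB isocurvature bounds constrain.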
37 - Flora Lopis 2009
Was Einstein wrong? This paper provides a detailed technical review of Einstein's special and general relativity from an astrophysical perspective, including the historical development of the theories, experimental tests, modern applications to black holes, cosmology and parallel universes, and last but not least, novel ways of expressing their seven most important equations.
122 - Max Tegmark 2009
We propose an all-digital telescope for 21 cm tomography, which combines key advantages of both single dishes and interferometers. The electric field is digitized by antennas on a rectangular grid, after which a series of Fast Fourier Transforms recovers simultaneous multifrequency images of up to half the sky. Thanks to Moore's law, the bandwidth up to which this is feasible has now reached about 1 GHz, and will likely continue doubling every couple of years. The main advantages over a single dish telescope are cost and orders of magnitude larger field-of-view, translating into dramatically better sensitivity for large-area surveys. The key advantages over traditional interferometers are cost (the correlator computational cost for an N-element array scales as N log N rather than N^2) and a compact synthesized beam. We argue that 21 cm tomography could be an ideal first application of a very large Fast Fourier Transform Telescope, which would provide both massive sensitivity improvements per dollar and mitigate the off-beam point source foreground problem with its clean beam. Another potentially interesting application is cosmic microwave background polarization.
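The imaging chain described above can be sketched in a few lines (an illustrative toy, not the instrument's actual pipeline; grid sizes and names are made up): per frequency channel, a 2D spatial FFT over the regular antenna grid forms beams toward a grid of sky directions all at once, and time-averaging the squared beam outputs yields a snapshot intensity map.

    # Toy sketch of FFT-telescope snapshot imaging for one frequency channel.
    import numpy as np

    rng = np.random.default_rng(2)
    nx = ny = 16            # antennas on a 16 x 16 rectangular grid (illustrative)
    n_samples = 1000        # time samples averaged per snapshot
    sky_map = np.zeros((nx, ny))

    for _ in range(n_samples):
        # Stand-in for one digitized, frequency-channelized time sample of the E-field
        # at every antenna (pure noise here; a real sky would add correlated signal).
        e_field = rng.standard_normal((nx, ny)) + 1j * rng.standard_normal((nx, ny))
        beams = np.fft.fft2(e_field)            # all sky directions at once, O(N log N)
        sky_map += np.abs(beams) ** 2           # accumulate power per direction

    sky_map /= n_samples                        # time-averaged snapshot image for this channel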
47 - Adrian Liu 2009
21 cm tomography is emerging as a promising probe of the cosmological dark ages and the epoch of reionization, as well as a tool for observational cosmology in general. However, serious sources of foreground contamination must be subtracted for experimental efforts to be viable. In this paper, we focus on the removal of unresolved extragalactic point sources with smooth spectra, and evaluate how the residual foreground contamination after cleaning depends on instrumental and algorithmic parameters. A crucial but often ignored complication is that the synthesized beam of an interferometer array shrinks towards higher frequency, causing complicated frequency structure in each sky pixel as "frizz" far from the beam center contracts across unresolved radio sources. We find that current-generation experiments should nonetheless be able to clean out this point source contamination adequately, and quantify the instrumental and algorithmic design specifications required to meet this foreground challenge.
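The generic cleaning step for spectrally smooth foregrounds can be sketched as follows (an illustrative stand-in, not necessarily the paper's exact algorithm; the paper's focus is on how the frequency-dependent beam complicates this picture): fit a low-order polynomial in log-frequency to each sky pixel's spectrum and subtract it, leaving the spectrally rough residuals in which the 21 cm signal is sought. Frequencies, amplitudes, and the polynomial order below are made up.

    # Per-pixel removal of a smooth (power-law-like) foreground spectrum.
    import numpy as np

    def clean_pixel_spectrum(freqs_mhz, spectrum, order=3):
        """Fit a degree-`order` polynomial to log(spectrum) vs log(freq) and subtract the smooth model."""
        log_f = np.log(freqs_mhz)
        log_f -= log_f.mean()                               # centre for numerical stability
        coeffs = np.polyfit(log_f, np.log(spectrum), order)
        smooth_model = np.exp(np.polyval(coeffs, log_f))
        return spectrum - smooth_model

    # Example: a synchrotron-like power law plus a small spectrally rough component.
    freqs = np.linspace(120.0, 180.0, 150)                  # MHz
    rng = np.random.default_rng(3)
    foreground = 300.0 * (freqs / 150.0) ** (-2.5)          # smooth foreground spectrum
    rough = 0.01 * rng.standard_normal(freqs.size)          # stand-in for signal + noise
    residual = clean_pixel_spectrum(freqs + 0.0, foreground + rough)
    print(residual.std())   # ~0.01: the smooth foreground is removed, the rough component survives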
As galaxy surveys become larger and more complex, keeping track of the completeness, magnitude limit, and other survey parameters as a function of direction on the sky becomes an increasingly challenging computational task. For example, typical angular masks of the Sloan Digital Sky Survey contain about N=300,000 distinct spherical polygons. Managing masks with such large numbers of polygons becomes intractably slow, particularly for tasks that run in time O(N^2) with a naive algorithm, such as finding which polygons overlap each other. Here we present a divide-and-conquer solution to this challenge: we first split the angular mask into predefined regions called pixels, such that each polygon is in only one pixel, and then perform further computations, such as checking for overlap, on the polygons within each pixel separately. This reduces O(N^2) tasks to O(N), and also reduces the important task of determining in which polygon(s) a point on the sky lies from O(N) to O(1), resulting in significant computational speedup. Additionally, we present a method to efficiently convert any angular mask to and from the popular HEALPix format. This method can be generically applied to convert to and from any desired spherical pixelization. We have implemented these techniques in a new version of the mangle software package, which is freely available at http://space.mit.edu/home/tegmark/mangle/, along with complete documentation and example applications. These new methods should prove quite useful to the astronomical community, and since mangle is a generic tool for managing angular masks on a sphere, it has the potential to benefit terrestrial mapmaking applications as well.
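A minimal sketch of the pixelization idea (not the mangle implementation; flat axis-aligned boxes stand in for spherical polygons, and the grid pixelization is made up): bucket the polygons by the single pixel each one lives in, and a point query then only tests the few polygons sharing the point's pixel, turning an O(N) scan into an O(1) lookup; overlap checks likewise only compare polygons within the same pixel.

    # Divide-and-conquer lookup of which polygon(s) contain a point.
    from collections import defaultdict

    PIXEL_SIZE = 1.0   # degrees; a simple square grid stands in for the survey pixelization

    def pixel_id(x, y):
        """Map a position to the ID of the grid pixel containing it."""
        return (int(x // PIXEL_SIZE), int(y // PIXEL_SIZE))

    class Box:
        """A toy 'polygon': an axis-aligned box assumed to fit inside a single pixel."""
        def __init__(self, name, xmin, xmax, ymin, ymax):
            self.name, self.xmin, self.xmax, self.ymin, self.ymax = name, xmin, xmax, ymin, ymax
        def contains(self, x, y):
            return self.xmin <= x <= self.xmax and self.ymin <= y <= self.ymax

    # Bucket polygons by the pixel they live in (each polygon is in only one pixel).
    polygons = [Box("A", 0.1, 0.4, 0.1, 0.4), Box("B", 0.5, 0.9, 0.5, 0.9), Box("C", 1.2, 1.8, 0.2, 0.8)]
    buckets = defaultdict(list)
    for poly in polygons:
        buckets[pixel_id(poly.xmin, poly.ymin)].append(poly)

    def polygons_containing(x, y):
        """O(1) on average: only the polygons sharing the point's pixel are tested."""
        return [p.name for p in buckets[pixel_id(x, y)] if p.contains(x, y)]

    print(polygons_containing(0.2, 0.3))   # ['A']  -- found without scanning every polygon
    print(polygons_containing(1.5, 0.5))   # ['C']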