
deep21: a Deep Learning Method for 21cm Foreground Removal

Posted by: T. Lucas Makinen
Publication date: 2020
Paper language: English





We seek to remove foreground contaminants from 21cm intensity mapping observations. We demonstrate that a deep convolutional neural network (CNN) with a UNet architecture and three-dimensional convolutions, trained on simulated observations, can effectively separate frequency and spatial patterns of the cosmic neutral hydrogen (HI) signal from foregrounds in the presence of noise. Cleaned maps recover cosmological clustering statistics to within 10% at all relevant angular scales and frequencies. This amounts to a reduction in prediction variance of over an order of magnitude on small angular scales ($\ell > 300$), and improved accuracy on small radial scales ($k_{\parallel} > 0.17\,h\,\mathrm{Mpc}^{-1}$) compared to standard Principal Component Analysis (PCA) methods. We estimate posterior confidence intervals for the network's prediction by training an ensemble of UNets. Our approach demonstrates the feasibility of analyzing 21cm intensity maps directly, as opposed to derived summary statistics, for upcoming radio experiments, as long as the simulated foreground model is sufficiently realistic. We provide the code used for this analysis on GitHub (https://github.com/tlmakinen/deep21), as well as a browser-based tutorial for the experiment and UNet model via the accompanying Colab notebook (http://bit.ly/deep21-colab).
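As a rough illustration of the approach described above, the sketch below builds a generic UNet with three-dimensional convolutions in Keras. The patch size, layer widths, and depth are illustrative assumptions; the actual deep21 architecture and training setup are in the linked repository.

# A generic 3D-convolutional UNet sketch (not the deep21 architecture): it maps a
# contaminated (angle, angle, frequency) patch to an estimate of the clean HI signal.
from tensorflow.keras import layers, Model

def conv_block(x, filters):
    x = layers.Conv3D(filters, 3, padding="same", activation="relu")(x)
    return layers.Conv3D(filters, 3, padding="same", activation="relu")(x)

def build_unet3d(input_shape=(32, 32, 32, 1)):
    inp = layers.Input(shape=input_shape)
    c1 = conv_block(inp, 16)
    p1 = layers.MaxPooling3D(2)(c1)
    c2 = conv_block(p1, 32)
    p2 = layers.MaxPooling3D(2)(c2)
    b = conv_block(p2, 64)                                   # bottleneck
    u2 = layers.Conv3DTranspose(32, 2, strides=2, padding="same")(b)
    c3 = conv_block(layers.Concatenate()([u2, c2]), 32)      # skip connection
    u1 = layers.Conv3DTranspose(16, 2, strides=2, padding="same")(c3)
    c4 = conv_block(layers.Concatenate()([u1, c1]), 16)      # skip connection
    out = layers.Conv3D(1, 1, activation="linear")(c4)       # cleaned-map estimate
    return Model(inp, out)

model = build_unet3d()
model.compile(optimizer="adam", loss="mse")                  # regress the HI signal

Training several such networks from different random initialisations and taking the spread of their predictions is one simple way to realise the ensemble-based confidence intervals mentioned in the abstract.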




Read also

We compare various foreground removal techniques that are being utilised to remove bright foregrounds in various experiments aiming to detect the redshifted 21cm signal of neutral hydrogen from the Epoch of Reionization. In this work, we test the performance of removal techniques (FastICA, GMCA, and GPR) on 10 nights of LOFAR data and investigate the possibility of recovering the latest upper limit on the 21cm signal. Interestingly, we find that GMCA and FastICA reproduce the most recent 2$\sigma$ upper limit of $\Delta^2_{21} < (73)^2~\mathrm{mK}^2$ at $k = 0.075~h\,\mathrm{cMpc}^{-1}$, which resulted from the application of GPR. We also find that FastICA and GMCA begin to deviate from the noise limit at $k$-scales larger than $\sim 0.1~h\,\mathrm{cMpc}^{-1}$. We then replicate the data via simulations to locate the source of FastICA's and GMCA's limitations, by testing them against various instrumental effects. We find that no single instrumental effect, such as primary-beam effects or mode-mixing, can explain the poorer recovery by FastICA and GMCA at larger $k$-scales. We then test the scale-independence of FastICA and GMCA, and find that lower $k$-scales can be modelled by a smaller number of independent components. For larger scales ($k \gtrsim 0.1~h\,\mathrm{cMpc}^{-1}$), more independent components are needed to fit the foregrounds. We conclude that the current usage of GPR by the LOFAR collaboration is the appropriate removal technique: it is both robust and less prone to overfitting, and future improvements to GPR's fitting optimisation should yield deeper limits.
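For readers who want to experiment with the blind component-separation techniques compared here, the sketch below applies FastICA (via scikit-learn) to subtract smooth-spectrum foregrounds from a toy frequency-by-pixel cube. The cube shape and number of independent components are illustrative assumptions, not the settings used on the LOFAR data.

# FastICA foreground subtraction along the frequency direction (toy example).
import numpy as np
from sklearn.decomposition import FastICA

def fastica_clean(cube, n_ic=4):
    # cube: (n_freq, n_pix). Model each pixel's spectrum as a mixture of n_ic
    # independent components, reconstruct that mixture (the smooth foregrounds),
    # and return the residual as the signal + noise estimate.
    X = cube.T                                    # samples = pixels, features = frequencies
    ica = FastICA(n_components=n_ic, random_state=0)
    sources = ica.fit_transform(X)                # (n_pix, n_ic)
    foregrounds = ica.inverse_transform(sources)  # reconstructed smooth spectra
    return (X - foregrounds).T                    # residual cube, (n_freq, n_pix)

rng = np.random.default_rng(1)
residual = fastica_clean(rng.normal(size=(64, 32 * 32)), n_ic=4)

The number of components controls how aggressively the foregrounds are fitted, which is the scale-dependent trade-off discussed in the abstract above.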
Geraint Harker (2009)
An obstacle to the detection of redshifted 21cm emission from the epoch of reionization (EoR) is the presence of foregrounds which exceed the cosmological signal in intensity by orders of magnitude. We argue that in principle it would be better to fit the foregrounds non-parametrically - allowing the data to determine their shape - rather than selecting some functional form in advance and then fitting its parameters. Non-parametric fits often suffer from other problems, however. We discuss these before suggesting a non-parametric method, Wp smoothing, which seems to avoid some of them. After outlining the principles of Wp smoothing we describe an algorithm used to implement it. We then apply Wp smoothing to a synthetic data cube for the LOFAR EoR experiment. The performance of Wp smoothing, measured by the extent to which it is able to recover the variance of the cosmological signal and to which it avoids leakage of power from the foregrounds, is compared to that of a parametric fit, and to another non-parametric method (smoothing splines). We find that Wp smoothing is superior to smoothing splines for our application, and is competitive with parametric methods even though in the latter case we may choose the functional form of the fit with advance knowledge of the simulated foregrounds. Finally, we discuss how the quality of the fit is affected by the frequency resolution and range, by the characteristics of the cosmological signal and by edge effects.
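As a quick point of comparison with the smoothing-spline baseline mentioned above, the sketch below fits a smoothing spline in frequency to each line of sight and subtracts it. The frequency grid and smoothing factor are illustrative assumptions.

# Smoothing-spline foreground fit per line of sight (toy example).
import numpy as np
from scipy.interpolate import UnivariateSpline

def spline_clean(cube, freqs, s=1.0):
    # cube: (n_freq, n_pix). Subtract a smoothing-spline fit in frequency from
    # every pixel's spectrum; the residual approximates signal + noise.
    residual = np.empty_like(cube)
    for i in range(cube.shape[1]):
        fit = UnivariateSpline(freqs, cube[:, i], s=s)
        residual[:, i] = cube[:, i] - fit(freqs)
    return residual

freqs = np.linspace(115.0, 180.0, 64)     # MHz, an EoR-like band (assumed)
rng = np.random.default_rng(2)
res = spline_clean(rng.normal(size=(64, 16)), freqs, s=5.0)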
Next-generation 21cm observations will enable imaging of reionization on very large scales. These images will contain more astrophysical and cosmological information than the power spectrum, and hence provide an alternative way to constrain the contribution of different reionizing source populations to cosmic reionization. Using Convolutional Neural Networks, we present a simple network architecture that is sufficient to discriminate between Galaxy-dominated and AGN-dominated models, even in the presence of simulated noise from different experiments such as HERA and the SKA.
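A small CNN of the kind described is easy to prototype. The sketch below is a generic binary classifier for 21cm images (Galaxy- versus AGN-dominated); the layer sizes and input shape are illustrative assumptions, not the architecture used in that paper.

# Generic CNN binary classifier for 21cm images (toy sketch).
from tensorflow.keras import layers, models

def build_classifier(input_shape=(128, 128, 1)):
    return models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(1, activation="sigmoid"),    # P(AGN-dominated)
    ])

clf = build_classifier()
clf.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])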
We test for foreground residuals in the foreground-cleaned Planck Cosmic Microwave Background (CMB) maps outside and inside the U73 mask commonly used for cosmological analysis. The aim of this paper is to introduce a new method to validate masks by looking at the differences in cleaned maps obtained by different component-separation methods. By analyzing the power spectrum as well as the mean, variance and skewness of needlet coefficients on bands outside and inside the U73 mask, we first confirm that the pixels already masked by U73 are highly contaminated and cannot be used for cosmological analysis. We further find that the U73 mask needs extension in order to reduce large-scale foreground residuals to less than $20\%$ of the standard deviation of CMB fluctuations within the bands closest to the galactic equator. We also find 276 point sources in the foreground-cleaned maps which are currently not masked by the U73 mask. Our final publicly available extended mask leaves $65.9\%$ of the sky for cosmological analysis. Note that this extended mask may be important for analyses on local sky patches; in full-sky analyses the additional residuals near the galactic equator may average out.
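The sketch below computes the mean, variance and skewness of unmasked pixels in galactic-latitude bands with healpy. It is a simplified pixel-space stand-in for the needlet-coefficient statistics actually used in the paper; the band edges, resolution, and mask convention are illustrative assumptions.

# Pixel-space band statistics inside a mask (simplified stand-in, not needlets).
import numpy as np
import healpy as hp
from scipy.stats import skew

def band_stats(cmb_map, mask, nside, lat_edges_deg=(0, 10, 20, 30, 90)):
    # Mean/variance/skewness of unmasked pixels in |galactic latitude| bands.
    theta, _ = hp.pix2ang(nside, np.arange(hp.nside2npix(nside)))
    lat = np.degrees(np.pi / 2.0 - theta)
    results = []
    for lo, hi in zip(lat_edges_deg[:-1], lat_edges_deg[1:]):
        sel = (np.abs(lat) >= lo) & (np.abs(lat) < hi) & (mask > 0.5)
        vals = cmb_map[sel]
        results.append((lo, hi, vals.mean(), vals.var(), skew(vals)))
    return results

# Toy example at nside = 64 with a trivial all-ones mask.
nside = 64
rng = np.random.default_rng(4)
stats = band_stats(rng.normal(size=hp.nside2npix(nside)),
                   np.ones(hp.nside2npix(nside)), nside)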
We present new observations with the Precision Array for Probing the Epoch of Reionization (PAPER) with the aim of measuring the properties of foreground emission for 21cm Epoch of Reionization experiments at 150 MHz. We focus on the footprint of the foregrounds in cosmological Fourier space to understand which modes of the 21cm power spectrum will most likely be compromised by foreground emission. These observations confirm predictions that foregrounds can be isolated to a wedge-like region of 2D $(k_\perp, k_\parallel)$-space, creating a window for cosmological studies at higher $k_\parallel$ values. We also find that the emission extends past the nominal edge of this wedge due to spectral structure in the foregrounds, with this feature most prominent on the shortest baselines. Finally, we filter the data to retain only this unsmooth emission and image specific $k_\parallel$ modes of it. The resultant images show an excess of power at the lowest modes, but no emission can be clearly localized to any one region of the sky. This strongly suggests that the most problematic foregrounds for 21cm EoR studies will not be easily identifiable bright sources, but rather an aggregate of fainter emission.
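The wedge footprint above is usually diagnosed with a per-baseline delay transform: a Fourier transform of each visibility spectrum over frequency, which maps spectrally smooth foregrounds to low delays (low $k_\parallel$). The sketch below is a minimal version, with the channel width, taper, and array sizes as illustrative assumptions.

# Per-baseline delay spectrum (toy sketch).
import numpy as np

def delay_spectrum(vis, df_hz):
    # vis: complex visibilities of shape (n_bl, n_freq); df_hz: channel width in Hz.
    # Returns |V(tau)|^2 and the delay axis tau in seconds.
    n_freq = vis.shape[-1]
    window = np.blackman(n_freq)                   # taper to limit spectral leakage
    vtau = np.fft.fftshift(np.fft.fft(vis * window, axis=-1), axes=-1)
    tau = np.fft.fftshift(np.fft.fftfreq(n_freq, d=df_hz))
    return np.abs(vtau) ** 2, tau

rng = np.random.default_rng(3)
vis = rng.normal(size=(10, 203)) + 1j * rng.normal(size=(10, 203))
power, tau = delay_spectrum(vis, df_hz=100e3)      # ~100 kHz channels (assumed)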
