
What Can We Learn by Combining the Skew Spectrum and the Power Spectrum?

Posted by: Ji-Ping Dai
Publication date: 2020
Research field: Physics
Language: English





Clustering of the large scale structure provides information complementary to measurements of the cosmic microwave background anisotropies through the power spectrum and bispectrum of density perturbations. Extracting the bispectrum information, however, is more challenging than extracting it from the power spectrum, due to the complexity of the models and the computational cost of measuring the signal and its covariance. To overcome these problems, we adopt a proxy statistic, the skew spectrum, which is the cross-spectrum of the density field and its quadratic field. By applying a large smoothing filter to the density field, we show that the theory fits the simulations very well. With the spectra and their full covariance estimated from $N$-body simulations as our mock Universe, we perform a global fit for the cosmological parameters. The results show that adding the skew spectrum to the power spectrum reduces the $1\sigma$ marginalized errors on the parameters $b_1^2 A_s$, $n_s$, and $f_{\rm NL}^{\rm loc}$ by $31\%$, $22\%$, and $44\%$, respectively. This is the answer to the question posed in the title, and it indicates that the skew spectrum will be a fast and effective method to access information complementary to that enclosed in power spectrum measurements, especially for the forthcoming generation of wide-field galaxy surveys.
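The statistic described above can be illustrated in a few lines: Fourier-transform the (optionally smoothed) overdensity field, square it in configuration space, and cross-correlate the two in spherical $k$-bins. The sketch below is a minimal illustration on a periodic grid, not the paper's pipeline; all names, the Gaussian smoothing kernel, and the FFT normalization convention are assumptions for demonstration.

```python
import numpy as np

def skew_spectrum(delta, box_size, n_bins=16, smooth_scale=None):
    """Cross-spectrum of a density field with its square: <delta^2(k) delta*(k)>.

    Minimal sketch: `delta` is a cubic overdensity grid, `box_size` the side
    length (e.g. Mpc/h). Returns bin-center wavenumbers and the binned spectrum.
    """
    n = delta.shape[0]
    kf = 2 * np.pi / box_size                       # fundamental mode
    k1d = np.fft.fftfreq(n, d=1.0 / n) * kf         # integer modes * kf
    kx, ky, kz = np.meshgrid(k1d, k1d, k1d, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2 + kz**2)

    dk = np.fft.fftn(delta)
    if smooth_scale is not None:                    # Gaussian filter W(kR)
        dk = dk * np.exp(-0.5 * (kmag * smooth_scale) ** 2)
        delta = np.fft.ifftn(dk).real
    d2k = np.fft.fftn(delta**2)                     # the quadratic field

    cross = (d2k * np.conj(dk)).real                # unbinned cross-power
    bins = np.linspace(kf, kmag.max(), n_bins + 1)
    which = np.digitize(kmag.ravel(), bins)
    flat = cross.ravel()
    spec = np.array([flat[which == i].mean() if np.any(which == i) else 0.0
                     for i in range(1, n_bins + 1)])
    # illustrative FFT normalization: P ~ L^3 |delta_k|^2 / N^6
    return 0.5 * (bins[:-1] + bins[1:]), spec * box_size**3 / n**6
```

For a purely Gaussian field the binned skew spectrum scatters around zero; a non-zero signal reflects the non-Gaussianity (gravitational or primordial) that the bispectrum would otherwise capture.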




Read also

Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in contexts where aggregate information is released about a database containing sensitive information about individuals. We demonstrate that, ignoring computational constraints, it is possible to privately agnostically learn any concept class using a sample size approximately logarithmic in the cardinality of the concept class. Therefore, almost anything learnable is learnable privately: specifically, if a concept class is learnable by a (non-private) algorithm with polynomial sample complexity and output size, then it can be learned privately using a polynomial number of samples. We also present a computationally efficient private PAC learner for the class of parity functions. Local (or randomized response) algorithms are a practical class of private algorithms that have received extensive investigation. We provide a precise characterization of local private learning algorithms. We show that a concept class is learnable by a local algorithm if and only if it is learnable in the statistical query (SQ) model. Finally, we present a separation between the power of interactive and noninteractive local learning algorithms.
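The local (randomized response) algorithms characterized in the abstract above have a textbook one-bit instance: each user reports their true bit with probability $e^\varepsilon/(1+e^\varepsilon)$ and flips it otherwise, and the aggregator debiases the noisy mean. This is a minimal sketch of that standard construction, not code from the paper; function names are illustrative.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Report the true bit with prob e^eps / (1 + e^eps); otherwise flip it.
    The standard eps-locally-differentially-private one-bit mechanism."""
    p = math.exp(epsilon) / (1 + math.exp(epsilon))
    return bit if random.random() < p else 1 - bit

def debias(reports, epsilon):
    """Unbiased estimate of the true mean from the noisy reports:
    E[report] = p*mean + (1-p)*(1-mean), solved for mean."""
    p = math.exp(epsilon) / (1 + math.exp(epsilon))
    return (sum(reports) / len(reports) - (1 - p)) / (2 * p - 1)
```

Each report individually reveals almost nothing (the likelihood ratio between the two possible inputs is at most $e^\varepsilon$), yet the population mean is still recoverable, which is exactly the aggregate-versus-individual trade-off local algorithms formalize.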
We investigate the potential of the higher multipole power spectra of the galaxy distribution in redshift space as a cosmological probe on halo scales. Based on the fact that a halo model explains well the multipole power spectra of the luminous red galaxy (LRG) sample in the Sloan Digital Sky Survey (SDSS), we focus our investigation on the random motions of the satellite LRGs that determine the higher multipole spectra at large wavenumbers. We show that our theoretical model fits the higher multipole spectra at large wavenumbers from N-body numerical simulations, and we apply these results to testing the gravity theory and the velocity structure of galaxies on halo scales. In this analysis, we use the multipole spectra $P_4(k)$ and $P_6(k)$ on small scales, in the wavenumber range $0.3 < k/[h\,{\rm Mpc}^{-1}] < 0.6$, in contrast to the usual method of testing gravity by targeting the linear growth rate on very large scales. We demonstrate that our method could be useful for testing gravity on halo scales.
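The multipole spectra used above are Legendre projections of the anisotropic redshift-space power spectrum, $P_\ell(k) = \frac{2\ell+1}{2}\int_{-1}^{1} P(k,\mu)\,\mathcal{L}_\ell(\mu)\,d\mu$. A minimal sketch of that projection by Gauss-Legendre quadrature (the model callable and names are illustrative, not from the paper):

```python
import numpy as np
from numpy.polynomial.legendre import leggauss, legval

def multipole(pk_mu, k, ell, n_mu=32):
    """P_ell(k) = (2*ell+1)/2 * int_{-1}^{1} P(k, mu) L_ell(mu) dmu.

    `pk_mu(k_i, mu)` is any callable anisotropic power spectrum model;
    the integral is done by Gauss-Legendre quadrature over mu.
    """
    mu, w = leggauss(n_mu)                  # nodes and weights on [-1, 1]
    coeffs = np.zeros(ell + 1)
    coeffs[ell] = 1.0
    L_ell = legval(mu, coeffs)              # Legendre polynomial L_ell(mu)
    integral = np.array([np.sum(w * pk_mu(ki, mu) * L_ell) for ki in k])
    return 0.5 * (2 * ell + 1) * integral
```

As a check, for the linear Kaiser model $P(k,\mu) = (1+\beta\mu^2)^2 P_{\rm lin}(k)$ the projection reproduces the classic closed forms, e.g. $P_4 = \frac{8\beta^2}{35} P_{\rm lin}$; higher multipoles such as $P_6$ only become non-zero once nonlinear velocity effects like those studied above enter.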
We investigate the effects of multi-task learning using the recently introduced task of semantic tagging. We employ semantic tagging as an auxiliary task for three different NLP tasks: part-of-speech tagging, Universal Dependency parsing, and Natural Language Inference. We compare full neural network sharing, partial neural network sharing, and what we term the learning what to share setting where negative transfer between tasks is less likely. Our findings show considerable improvements for all tasks, particularly in the learning what to share setting, which shows consistent gains across all tasks.
We discuss the features of instabilities in binary systems, in particular for asymmetric nuclear matter. We show its relevance for the interpretation of results obtained in experiments and in ab initio simulations of the reaction $^{124}$Sn$+^{124}$Sn at 50 AMeV.
A number of experiments are currently working towards a measurement of the 21 cm signal from the Epoch of Reionization. Whether or not these experiments deliver a detection of cosmological emission, their limited sensitivity will prevent them from providing detailed information about the astrophysics of reionization. In this work, we consider what types of measurements will be enabled by a next generation of larger 21 cm EoR telescopes. To calculate the type of constraints that will be possible with such arrays, we use simple models for the instrument, foreground emission, and the reionization history. We focus primarily on an instrument modeled after the $\sim 0.1~{\rm km}^2$ collecting area Hydrogen Epoch of Reionization Array (HERA) concept design, and parameterize the uncertainties with regard to foreground emission by considering different limits to the recently described wedge footprint in k-space. Uncertainties in the reionization history are accounted for using a series of simulations which vary the ionizing efficiency and minimum virial temperature of the galaxies responsible for reionization, as well as the mean free path of ionizing photons through the IGM. Given various combinations of models, we consider the significance of the possible power spectrum detections, the ability to trace the power spectrum evolution versus redshift, the detectability of salient power spectrum features, and the achievable level of quantitative constraints on astrophysical parameters. Ultimately, we find that $0.1~{\rm km}^2$ of collecting area is enough to ensure a very high significance ($\gtrsim 30\sigma$) detection of the reionization power spectrum in even the most pessimistic scenarios. This sensitivity should allow for meaningful constraints on the reionization history and astrophysical parameters, especially if foreground subtraction techniques can be improved and successfully implemented.