
Supernova classes have been defined phenomenologically, based on spectral features and time series data, since the specific details of the physics of the different explosions remain unrevealed. However, the number of these classes is increasing as objects with new features are observed, and the next generation of large surveys will only bring more variety to our attention. We apply the machine learning technique of multi-label classification to the spectra of supernovae. By measuring the probabilities of specific features or 'tags' in the supernova spectra, we can compress the information from a specific object down to a form suitable for a human or database scan, without the need to directly assign it to a reductive 'class'. We use logistic regression to assign tag probabilities, and then a feed-forward neural network to filter the objects into the standard set of classes, based solely on the tag probabilities. We present STag, a software package that can compute these tag probabilities and make spectral classifications.
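The two-stage pipeline described above can be sketched as follows. This is a minimal illustration of the technique, not STag's actual API; all function names, shapes, and weights are hypothetical.

```python
import numpy as np

def tag_probabilities(features, weights, biases):
    """Multi-label stage: one independent logistic regression per spectral tag.
    Probabilities are per-tag and need not sum to 1."""
    z = features @ weights + biases
    return 1.0 / (1.0 + np.exp(-z))  # element-wise sigmoid

def classify(tag_probs, W1, b1, W2, b2):
    """Classification stage: a feed-forward network that maps the tag
    probability vector to a normalised distribution over classes."""
    h = np.maximum(0.0, tag_probs @ W1 + b1)  # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())
    return e / e.sum()                         # softmax over classes
```

The key design point is that the second network sees only the tag probabilities, not the raw spectrum, so the compressed tag representation carries all the information used for classification.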
A violation of the distance-duality relation is directly linked with a temporal variation of the electromagnetic fine-structure constant. We consider a number of well-studied $f(T)$ gravity models and we revise the theoretical prediction of their corresponding induced violation of the distance-duality relation. We further extract constraints on the involved model parameters through fine-structure constant variation data, alongside supernovae data and Hubble parameter measurements. Moreover, we constrain the evolution of the effective $f(T)$ gravitational constant. Finally, we compare with revised constraints on the phenomenological parametrisations of the violation of the equivalence principle in the electromagnetic sector.
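For context, the distance-duality (Etherington) relation being tested can be written in standard notation (this is the general relation, not the paper's specific $f(T)$ parametrisation):

```latex
% Etherington distance-duality relation: in any metric theory of gravity
% with photon-number conservation, the luminosity distance d_L and the
% angular-diameter distance d_A satisfy
\eta(z) \equiv \frac{d_L(z)}{(1+z)^2\, d_A(z)} = 1 .
```

Mechanisms that vary the fine-structure constant generically break photon-number conservation, driving $\eta(z) \neq 1$; this is the link between the two observables that the abstract exploits.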
The distribution of cosmological neutral hydrogen will provide a new window into the large-scale structure of the Universe with the next generation of radio telescopes and surveys. The observation of this material, through 21cm line emission, will be confused by foreground emission in the same frequencies. Even after these foregrounds are removed, the reconstructed map may not exactly match the original cosmological signal, which will introduce systematic errors and offsets into the measured correlations. In this paper, we simulate future surveys of neutral hydrogen using the Horizon Run 4 (HR4) cosmological N-body simulation. We generate HI intensity maps from the HR4 halo catalogue, and combine them with foreground radio emission maps from the Global Sky Model, to create accurate simulations over the entire sky. We simulate the HI sky for the frequency range 700-800 MHz, matching the sensitivity of the Tianlai pathfinder. We test the accuracy of the fastICA, PCA and log-polynomial fitting foreground removal methods in recovering the input cosmological angular power spectrum and measuring the parameters. We show the effect of survey noise levels and beam sizes on the recovered cosmological constraints. We find that while the reconstruction removes power from the cosmological 21cm distribution on large scales, we can correct for this and recover the input parameters in the noise-free case. However, the effect of noise and beam size of the Tianlai pathfinder prevents accurate recovery of the cosmological parameters when using only intensity mapping information.
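Of the foreground removal methods named above, PCA is the simplest to sketch: because foregrounds are spectrally smooth, they dominate the most correlated frequency-frequency eigenmodes, which are projected out. A minimal sketch follows; the function name and the choice of mode count are illustrative, not the paper's implementation.

```python
import numpy as np

def pca_foreground_clean(maps, n_modes):
    """PCA foreground removal for an (n_freq, n_pix) intensity-mapping cube:
    estimate the frequency-frequency covariance and subtract the n_modes
    largest eigenmodes, which smooth foregrounds dominate."""
    x = maps - maps.mean(axis=1, keepdims=True)
    cov = x @ x.T / x.shape[1]               # frequency-frequency covariance
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    fg_modes = eigvecs[:, -n_modes:]         # dominant (foreground) modes
    return x - fg_modes @ (fg_modes.T @ x)   # project foregrounds out
```

The trade-off the abstract alludes to is visible here: removing eigenmodes also removes any cosmological 21cm power that happens to be smooth in frequency, which is why large-scale power must be corrected for after cleaning.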
The volume of data that will be produced by the next generation of astrophysical instruments represents a significant opportunity for making unplanned and unexpected discoveries. Conversely, finding unexpected objects or phenomena within such large volumes of data presents a challenge that may best be solved using computational and statistical approaches. We present the application of a coarse-grained complexity measure for identifying interesting observations in large astronomical data sets. This measure, which has been termed apparent complexity, has been shown to model human intuition and perceptions of complexity. Apparent complexity is computationally efficient to derive and can be used to segment and identify interesting observations in very large data sets based on their morphological complexity. We show, using data from the Australia Telescope Large Area Survey, that apparent complexity can be combined with clustering methods to provide an automated process for distinguishing between images of galaxies which have been classified as having simple and complex morphologies. The approach generalizes well when applied to new data after being calibrated on a smaller data set, where it performs better than tested classification methods using pixel data. This generalizability positions apparent complexity as a suitable machine-learning feature for identifying complex observations with unanticipated features.
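A common way to operationalise a coarse-grained complexity measure of this kind is to coarse-grain the image (block-average and quantise) and take the compressed size of the result as the score. The sketch below follows that recipe under stated assumptions (block size, quantisation levels, and the use of zlib are illustrative choices, not the authors' exact implementation).

```python
import zlib
import numpy as np

def apparent_complexity(image, block=4, levels=8):
    """Coarse-grained complexity sketch: block-average the image, quantise
    to a few intensity levels, and score by compressed size in bytes.
    Smooth/simple morphologies compress well; complex ones do not."""
    h, w = (s - s % block for s in image.shape)       # trim to block multiple
    x = image[:h, :w].reshape(h // block, block, w // block, block)
    x = x.mean(axis=(1, 3))                           # coarse-grain
    x = x - x.min()
    q = np.floor(levels * x / (x.max() + 1e-12)).astype(np.uint8)
    return len(zlib.compress(q.tobytes(), 9))
```

Because the score is a single cheap-to-compute number per image, it can be fed directly into the clustering step the abstract describes.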
We present the strongest current cosmological upper limit on the sum of neutrino masses of < 0.18 eV (95% confidence). It is obtained by adding observations of the large-scale matter power spectrum from the WiggleZ Dark Energy Survey to observations of the cosmic microwave background data from the Planck surveyor, and measurements of the baryon acoustic oscillation scale. The limit is highly sensitive to the priors and assumptions about the neutrino scenario. We explore scenarios with neutrino masses close to the upper limit (degenerate masses), neutrino masses close to the lower limit where the hierarchy plays a role, and the addition of massive or massless sterile species.
Neutrinos are one of the major puzzles in modern physics. Despite measurements of mass differences, the Standard Model of particle physics describes them as exactly massless. Additionally, recent measurements from both particle physics experiments and cosmology indicate the existence of more than the three Standard Model species. Here we review the cosmological evidence and its possible interpretations.
Dark energy may be the first sign of new fundamental physics in the Universe, taking either a physical form or revealing a correction to Einsteinian gravity. Weak gravitational lensing and galaxy peculiar velocities provide complementary probes of General Relativity, and in combination allow us to test modified theories of gravity in a unique way. We perform such an analysis by combining measurements of cosmic shear tomography from the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS) with the growth of structure from the WiggleZ Dark Energy Survey and the Six-degree-Field Galaxy Survey (6dFGS), producing the strongest existing joint constraints on the metric potentials that describe general theories of gravity. For scale-independent modifications to the metric potentials which evolve linearly with the effective dark energy density, we find present-day cosmological deviations in the Newtonian potential and curvature potential from the prediction of General Relativity to be $\Delta\Psi/\Psi = 0.05 \pm 0.25$ and $\Delta\Phi/\Phi = -0.05 \pm 0.3$ respectively (68 per cent CL).
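The metric potentials referred to are those of the perturbed FRW metric in the conformal Newtonian gauge, written in standard notation:

```latex
% Perturbed FRW line element in the Newtonian gauge:
% \Psi is the Newtonian potential (felt by non-relativistic tracers,
% i.e. peculiar velocities), \Phi is the curvature potential;
% photons in weak lensing respond to the sum \Phi + \Psi.
ds^2 = -\left(1 + 2\Psi\right) dt^2
       + a^2(t)\left(1 - 2\Phi\right) d\vec{x}^{\,2},
\qquad \Phi = \Psi \ \text{in GR (no anisotropic stress)}.
```

This is why the combination of probes is powerful: lensing constrains $\Phi + \Psi$ while galaxy velocities constrain $\Psi$ alone, so together they separate the two potentials and test the GR prediction $\Phi = \Psi$.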
We place the most robust constraint to date on the scale of the turnover in the cosmological matter power spectrum using data from the WiggleZ Dark Energy Survey. We find this feature to lie at a scale of $k_0=0.0160^{+0.0041}_{-0.0035}$ [h/Mpc] (68% confidence) for an effective redshift of 0.62 and obtain from this the first-ever turnover-derived distance and cosmology constraints: a measure of the cosmic distance-redshift relation in units of the horizon scale at the redshift of radiation-matter equality ($r_H$) of $D_V(z=0.62)/r_H=18.3^{+6.3}_{-3.3}$ and, assuming a prior on the number of extra relativistic degrees of freedom $N_{eff}=3$, constraints on the matter density parameter $\Omega_M h^2=0.136^{+0.026}_{-0.052}$ and on the redshift of matter-radiation equality $z_{eq}=3274^{+631}_{-1260}$. All results are in excellent agreement with the predictions of standard LCDM models. Our constraint on the logarithmic slope of the power spectrum on scales larger than the turnover is bounded only from below, with values as low as -1 allowed, and the prediction of standard LCDM models is easily accommodated by our results. Lastly, we generate forecasts for the achievable precision of future surveys at constraining $k_0$, $\Omega_M h^2$, $z_{eq}$ and $N_{eff}$. We find that BOSS should substantially improve upon the WiggleZ turnover constraint, reaching a precision on $k_0$ of $\pm$9% (68% confidence), translating to precisions on $\Omega_M h^2$ and $z_{eq}$ of $\pm$10% (assuming a prior $N_{eff}=3$) and on $N_{eff}$ of (+78/-56)% (assuming a prior $\Omega_M h^2=0.135$). This is sufficient precision to sharpen the constraints on $N_{eff}$ from WMAP, particularly in its upper limit. For Euclid, we find corresponding attainable precisions on $(k_0, \Omega_M h^2, N_{eff})$ of (3,4,+17/-21)%. This represents a precision approaching our forecasts for the Planck Surveyor.
This paper presents cosmological results from the final data release of the WiggleZ Dark Energy Survey. We perform full analyses of different cosmological models using the WiggleZ power spectra measured at z=0.22, 0.41, 0.60, and 0.78, combined with other cosmological datasets. The limiting factor in this analysis is the theoretical modelling of the galaxy power spectrum, including non-linearities, galaxy bias, and redshift-space distortions. In this paper we assess several different methods for modelling the theoretical power spectrum, testing them against the Gigaparsec WiggleZ simulations (GiggleZ). We fit for a base set of 6 cosmological parameters, {Omega_b h^2, Omega_CDM h^2, H_0, tau, A_s, n_s}, and 5 supplementary parameters {n_run, r, w, Omega_k, sum m_nu}. In combination with the Cosmic Microwave Background (CMB), our results are consistent with the LambdaCDM concordance cosmology, with a measurement of the matter density of Omega_m = 0.29 +/- 0.016 and amplitude of fluctuations sigma_8 = 0.825 +/- 0.017. Using WiggleZ data with CMB and other distance and matter power spectra data, we find no evidence for any of the extension parameters being inconsistent with their LambdaCDM model values. The power spectra data and theoretical modelling tools are available for use as a module for CosmoMC, which we make publicly available at http://smp.uq.edu.au/wigglez-data . We also release the data and random catalogues used to construct the baryon acoustic oscillation correlation function.
The absolute neutrino mass scale is currently unknown, but can be constrained from cosmology. The WiggleZ high redshift star-forming blue galaxy sample is less sensitive to systematics from non-linear structure formation, redshift-space distortions and galaxy bias than previous surveys. We obtain an upper limit on the sum of neutrino masses of 0.60eV (95% confidence) for WiggleZ+Wilkinson Microwave Anisotropy Probe. Combining with priors on the Hubble parameter and the baryon acoustic oscillation scale gives an upper limit of 0.29eV, which is the strongest neutrino mass constraint derived from spectroscopic galaxy redshift surveys.
