
Large-scale three-dimensional Gaussian process extinction mapping

Added by Stuart Sale
Publication date: 2017
Field: Physics
Language: English





Gaussian processes are the ideal tool for modelling the Galactic ISM, combining statistical flexibility with a good match to the underlying physics. In an earlier paper we outlined how they can be employed to construct three-dimensional maps of dust extinction from stellar surveys. However, Gaussian processes scale poorly to large datasets, which puts the analysis of realistic catalogues out of reach. Here we show how a novel combination of the Expectation Propagation method and certain sparse matrix approximations can be used to accelerate the dust mapping problem. We demonstrate, using simulated Gaia data, that the resultant algorithm is fast, accurate and precise. Critically, it can be scaled up to map the Gaia catalogue.
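
To make the scaling idea concrete, below is a minimal illustrative sketch in Python of one generic way sparse matrices can accelerate Gaussian process regression: a compactly supported (Wendland) covariance makes the Gram matrix sparse, so the posterior solve becomes tractable. This is a toy under stated assumptions, not the paper's Expectation Propagation algorithm; the kernel choice, mock data and hyperparameters are all invented for illustration.

# Illustrative toy, NOT the paper's algorithm: a compactly supported
# covariance makes the Gram matrix sparse, so GP regression scales.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def wendland_c2(r, scale):
    # Wendland C2 kernel: exactly zero for r >= scale, hence sparse matrices
    u = np.clip(r / scale, 0.0, 1.0)
    return (1.0 - u) ** 4 * (4.0 * u + 1.0)

rng = np.random.default_rng(0)
# hypothetical sightline: stars at distances d with noisy extinction estimates
d = np.sort(rng.uniform(0.0, 10.0, 2000))
a_true = np.cumsum(rng.exponential(0.01, d.size))   # mock monotone extinction
a_obs = a_true + rng.normal(0.0, 0.1, d.size)

scale, sigma_n = 1.5, 0.1                           # assumed hyperparameters
r = np.abs(d[:, None] - d[None, :])
K = sp.csc_matrix(np.where(r < scale, wendland_c2(r, scale), 0.0))
A = K + sigma_n ** 2 * sp.identity(d.size, format="csc")

# GP posterior mean at the observed stars: K (K + sigma_n^2 I)^(-1) a_obs
alpha = spla.spsolve(A, a_obs)
post_mean = K @ alpha

In a real extinction map the sparsity pattern would come from the survey geometry and the chosen covariance, and Expectation Propagation would replace this single direct solve with iterative local updates.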




Related research

S. E. Sale, J. Magorrian (2014)
We present a scheme for using stellar catalogues to map the three-dimensional distributions of extinction and dust within our Galaxy. Extinction is modelled as a Gaussian random field, whose covariance function is set by a simple physical model of the ISM that assumes a Kolmogorov-like power spectrum of turbulent fluctuations. As extinction is modelled as a random field, the spatial resolution of the resulting maps is set naturally by the data available; there is no need to impose any spatial binning. We verify the validity of our scheme by testing it on simulated extinction fields and show that its precision is significantly improved over previous dust-mapping efforts. The approach we describe here can make use of any photometric, spectroscopic or astrometric data; it is not limited to any particular survey. Consequently, it can be applied to a wide range of data from both existing and future surveys.
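
As a rough illustration of the covariance model described above, the sketch below draws a one-dimensional Gaussian random field whose power spectrum follows a Kolmogorov-like power law, using an FFT. The exponent, normalisation and grid are illustrative assumptions, not the paper's actual three-dimensional ISM model.

# Minimal sketch (not the paper's code): draw a 1D Gaussian random field
# with a Kolmogorov-like power-law spectrum via the FFT. The exponent and
# normalisation below are illustrative assumptions.
import numpy as np

def power_law_grf(n, dx=1.0, exponent=-5.0 / 3.0, seed=0):
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=dx)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (exponent / 2.0)        # amplitude = sqrt(P(k))
    noise = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
    field = np.fft.irfft(amp * noise, n=n)
    return field / field.std()                 # unit-variance fluctuations

fluctuations = power_law_grf(4096)             # mock turbulent density field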
S. E. Sale (2015)
Selection effects can bedevil the inference of the properties of a population of astronomical objects, unavoidably biasing the observed catalogue. This is particularly true when mapping interstellar extinction in three dimensions: more extinguished stars are fainter and so generally less likely to appear in any magnitude limited catalogue of observations. This paper demonstrates how to account for this selection effect when mapping extinction, so that accurate and unbiased estimates of the true extinction are obtained. We advocate couching the description of the problem explicitly as a Poisson point process, which allows the likelihoods employed to be easily and correctly normalised in such a way that accounts for the selection functions applied to construct the catalogue of observations.
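
The following is a hedged sketch of the central idea: in a Poisson point process likelihood the selection function enters the normalising integral, so a magnitude-limited catalogue can be modelled without biasing the inference. The rate model lambda_rate, the selection function S and the magnitude grid are hypothetical placeholders, not the paper's model.

# Hedged sketch of the key idea: the selection function S(m) enters the
# normalising integral of a Poisson point process likelihood, so the
# magnitude limit is accounted for rather than biasing the inference.
# lambda_rate, S and m_grid are hypothetical placeholders.
import numpy as np
from scipy.integrate import trapezoid

def log_poisson_pp_likelihood(obs_mags, lambda_rate, S, m_grid):
    # log L = sum_i log[lambda(m_i) S(m_i)] - integral of lambda(m) S(m) dm
    lam_obs = lambda_rate(obs_mags) * S(obs_mags)
    norm = trapezoid(lambda_rate(m_grid) * S(m_grid), m_grid)
    return np.sum(np.log(lam_obs)) - norm

S = lambda m: 1.0 / (1.0 + np.exp((m - 19.0) / 0.2))   # soft magnitude limit
lambda_rate = lambda m: 10.0 ** (0.3 * (m - 10.0))     # rising number counts
m_grid = np.linspace(5.0, 25.0, 2001)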
We demonstrate applications of the Gaussian process-based landmarking algorithm proposed in [T. Gao, S.Z. Kovalsky, and I. Daubechies, SIAM Journal on Mathematics of Data Science (2019)] to geometric morphometrics, a branch of evolutionary biology centered on the analysis and comparison of anatomical shapes, and compare the automatically sampled landmarks with ground-truth landmarks manually placed by evolutionary anthropologists. The results suggest that Gaussian process landmarks perform equally well or better, in terms of both spatial coverage and downstream statistical analysis. We also provide a detailed exposition of numerical procedures and feature-filtering algorithms for computing high-quality and semantically meaningful diffeomorphisms between disk-type anatomical surfaces.
Accurate photometric redshifts are a lynchpin for many future experiments to pin down the cosmological model and for studies of galaxy evolution. In this study, a novel sparse regression framework for photometric redshift estimation is presented. Simulated and real data from SDSS DR12 were used to train and test the proposed models. We show that approaches which include careful data preparation and model design offer a significant improvement in comparison with several competing machine learning algorithms. Standard implementations of most regression algorithms have as their objective the minimization of the sum of squared errors. For redshift inference, however, this induces a bias in the posterior mean of the output distribution, which can be problematic. In this paper we directly target minimizing $\Delta z = (z_\textrm{s} - z_\textrm{p})/(1+z_\textrm{s})$ and address the bias problem via a distribution-based weighting scheme, incorporated as part of the optimization objective. The results are compared with other machine learning algorithms in the field, such as Artificial Neural Networks (ANN), Gaussian Processes (GPs) and sparse GPs. The proposed framework reaches a mean absolute $\Delta z = 0.0026(1+z_\textrm{s})$ over the redshift range $0 \le z_\textrm{s} \le 2$ on the simulated data, and $\Delta z = 0.0178(1+z_\textrm{s})$ over the entire redshift range on the SDSS DR12 survey, outperforming the standard ANNz used in the literature. We also investigate how the relative size of the training set affects the photometric redshift accuracy. We find that a training set of more than 30 per cent of the total sample size provides little additional constraint on the photometric redshifts, and note that our GP formalism strongly outperforms ANNz in the sparse data regime for the simulated data set.
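
A minimal sketch of the kind of objective described above follows: the loss targets $|\Delta z| = |z_\textrm{s} - z_\textrm{p}|/(1+z_\textrm{s})$ directly, with an optional distribution-based weighting. The inverse-density weighting shown is a placeholder assumption, not the paper's scheme.

# Illustrative sketch of the objective described above: minimise
# |Delta z| = |z_s - z_p| / (1 + z_s) instead of the raw squared error.
# The inverse-density weighting is a placeholder, not the paper's scheme.
import numpy as np

def delta_z(z_spec, z_phot):
    return (z_spec - z_phot) / (1.0 + z_spec)

def weighted_delta_z_loss(z_spec, z_phot, weights=None):
    dz = np.abs(delta_z(z_spec, z_phot))
    weights = np.ones_like(dz) if weights is None else weights
    return np.mean(weights * dz)

def inverse_density_weights(z_spec, bins=50):
    # up-weight redshifts that are under-represented in the training set
    hist, edges = np.histogram(z_spec, bins=bins, density=True)
    idx = np.clip(np.digitize(z_spec, edges) - 1, 0, bins - 1)
    return 1.0 / np.maximum(hist[idx], 1e-6)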
This work presents a new physically motivated supervised machine learning method, Hydro-BAM, to reproduce the three-dimensional Lyman-$\alpha$ forest field in real and in redshift space, learning from a reference hydrodynamic simulation and thereby saving about 7 orders of magnitude in computing time. We show that our method is accurate up to $k \sim 1\,h\,\mathrm{Mpc}^{-1}$ in the one- (PDF), two- (power spectrum) and three-point (bispectrum) statistics of the reconstructed fields. When compared to the reference simulation including redshift-space distortions, our method achieves deviations of $\lesssim 2\%$ up to $k = 0.6\,h\,\mathrm{Mpc}^{-1}$ in the monopole and $\lesssim 5\%$ up to $k = 0.9\,h\,\mathrm{Mpc}^{-1}$ in the quadrupole. The bispectrum is well reproduced for triangle configurations with sides up to $k = 0.8\,h\,\mathrm{Mpc}^{-1}$. In contrast, the commonly adopted Fluctuating Gunn-Peterson approximation shows significant deviations already when neglecting peculiar motions, at configurations with sides of $k = 0.2$-$0.4\,h\,\mathrm{Mpc}^{-1}$ in the bispectrum, and is also significantly less accurate in the power spectrum (within $5\%$ up to $k = 0.7\,h\,\mathrm{Mpc}^{-1}$). We conclude that an accurate analysis of the Lyman-$\alpha$ forest requires considering the complex baryonic thermodynamical large-scale structure relations. Our hierarchical domain-specific machine learning method can efficiently exploit this and is ready to generate accurate Lyman-$\alpha$ forest mock catalogues covering the large volumes required by surveys such as DESI and WEAVE.
