
Bayesian Estimation Applied to Multiple Species: Towards cosmology with a million supernovae

Added by Renée Hlozek
Publication date: 2006
Fields: Physics
Language: English
Authors: Martin Kunz





Observed data is often contaminated by undiscovered interlopers, leading to biased parameter estimation. Here we present BEAMS (Bayesian Estimation Applied to Multiple Species), which significantly improves on the standard maximum likelihood approach in the case where the probability of each data point being `pure' is known. We discuss the application of BEAMS to future Type Ia supernovae (SNIa) surveys, such as LSST, which are projected to deliver over a million supernovae lightcurves without spectra. The multi-band lightcurves for each candidate will provide a probability of being Ia (pure), but the full sample will be significantly contaminated with other types of supernovae and transients. Given a sample of N supernovae with mean probability, P, of being Ia, BEAMS delivers parameter constraints equivalent to those from NP spectroscopically-confirmed SNIa. In addition, BEAMS can be used simultaneously to tease apart different families of data and to recover properties of the underlying distributions of those families (e.g. the Type Ibc and II distributions). Hence BEAMS provides a unified classification and parameter estimation methodology which may be useful in a diverse range of problems, such as photometric redshift estimation or, indeed, any parameter estimation problem where contamination is an issue.
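The core of the approach described in the abstract is a mixture likelihood in which each data point contributes a probability-weighted sum of the "pure" (Ia) likelihood and the contaminant likelihood. The toy sketch below illustrates this idea on simulated one-dimensional data; the Gaussian populations, the offset of 1.5 for contaminants, the classifier probabilities, and the grid search are all illustrative assumptions, not details taken from the paper.

```python
import math
import random

random.seed(0)

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Simulate a contaminated sample: 70% "Ia-like" points scattered around the
# true parameter value, 30% contaminants from a broader, offset population.
# The per-point probabilities mimic an imperfect photometric classifier.
theta_true = 0.0
data, probs = [], []
for _ in range(2000):
    if random.random() < 0.7:
        data.append(random.gauss(theta_true, 0.1))
        probs.append(0.9)   # classifier: "probably Ia"
    else:
        data.append(random.gauss(1.5, 0.5))
        probs.append(0.1)   # classifier: "probably not Ia"

def beams_loglike(theta):
    # BEAMS-style mixture: each point contributes
    # p_i * L_Ia(d_i | theta) + (1 - p_i) * L_other(d_i).
    return sum(math.log(p * gauss(d, theta, 0.1) + (1 - p) * gauss(d, 1.5, 0.5))
               for d, p in zip(data, probs))

def naive_loglike(theta):
    # Standard approach: treat every point as pure Ia (biased by contaminants).
    return sum(math.log(gauss(d, theta, 0.1) + 1e-300) for d in data)

grid = [i / 100 for i in range(-100, 160)]
beams_hat = max(grid, key=beams_loglike)   # close to theta_true
naive_hat = max(grid, key=naive_loglike)   # pulled towards the contaminants
```

Because the mixture down-weights likely contaminants, the BEAMS estimate recovers the true parameter, while the naive maximum likelihood estimate is dragged towards the contaminant population.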

Related research

We predict cosmological constraints for forthcoming surveys using Superluminous Supernovae (SLSNe) as standardisable candles. Due to their high peak luminosity, these events can be observed to high redshift (z~3), opening up new possibilities to probe the Universe in the deceleration epoch. We describe our methodology for creating mock Hubble diagrams for the Dark Energy Survey (DES), the Search Using DECam for Superluminous Supernovae (SUDSS) and a sample of SLSNe possible from the Large Synoptic Survey Telescope (LSST), exploring a range of standardisation values for SLSNe. We include uncertainties due to gravitational lensing and marginalise over possible uncertainties in the magnitude scale of the observations (e.g. uncertain absolute peak magnitude, calibration errors). We find that the addition of only ~100 SLSNe from SUDSS to 3800 Type Ia Supernovae (SNe Ia) from DES can improve the constraints on w and Omega_m by at least 20% (assuming a flat wCDM universe). Moreover, the combination of DES SNe Ia and 10,000 LSST-like SLSNe can measure Omega_m and w to 2% and 4% respectively. The real power of SLSNe becomes evident when we consider possible temporal variations in w(a), giving possible uncertainties of only 2%, 5% and 14% on Omega_m, w_0 and w_a respectively, from the combination of DES SNe Ia, LSST-like SLSNe and Planck. These errors are competitive with predicted Euclid constraints, indicating a future role for SLSNe for probing the high redshift Universe.
Achieving ultimate bounds in estimation processes is the main objective of quantum metrology. In this context, several problems require measurement of multiple parameters by employing only a limited amount of resources. To this end, adaptive protocols, exploiting additional control parameters, provide a tool to optimize the performance of a quantum sensor to work in such a limited data regime. Finding the optimal strategies to tune the control parameters during the estimation process is a non-trivial problem, and machine learning techniques are a natural solution to address this task. Here, we investigate and implement experimentally for the first time an adaptive Bayesian multiparameter estimation technique tailored to reach optimal performance with very limited data. We employ a compact and flexible integrated photonic circuit, fabricated by femtosecond laser writing, which allows different strategies to be implemented with a high degree of control. The obtained results show that adaptive strategies can become a viable approach for realistic sensors working with a limited amount of resources.
A Bayesian nonparametric estimator of entropy is proposed. The derivation of the new estimator relies on using the Dirichlet process and adapting the well-known frequentist estimators of Vasicek (1976) and Ebrahimi, Pflughoeft and Soofi (1994). Several theoretical properties, such as consistency, of the proposed estimator are obtained. The quality of the proposed estimator has been investigated through several examples, in which it exhibits excellent performance.
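The frequentist spacing estimator of Vasicek (1976), which the abstract above says the Bayesian estimator adapts, is simple enough to sketch. The window size and the test sample below are illustrative choices, not values from the paper.

```python
import math
import random

random.seed(1)

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing estimator of differential entropy.

    H = (1/n) * sum_i log( n * (x_(i+m) - x_(i-m)) / (2m) ),
    with order statistics clamped at the sample boundaries.
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(n):
        lo = x[max(i - m, 0)]
        hi = x[min(i + m, n - 1)]
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n

# Sanity check on a standard normal sample, whose true differential
# entropy is 0.5 * ln(2 * pi * e) ~ 1.4189.
sample = [random.gauss(0.0, 1.0) for _ in range(5000)]
h_hat = vasicek_entropy(sample, m=30)
```

The estimator is consistent but slightly negatively biased for finite samples, which is one motivation for the refinements cited in the abstract.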
Filtration of feed containing multiple species of particles is a common process in the industrial setting. In this work we propose a model for filtration of a suspension containing an arbitrary number of particle species, each with different affinities for the filter membrane. We formulate a number of optimization problems pertaining to effective separation of desired and undesired particles in the special case of two particle species and we present results showing how properties such as feed composition affect the optimal filter design (internal pore structure). In addition, we propose a novel multi-stage filtration strategy, which provides a significant mass yield improvement for the desired particles, and, surprisingly, higher purity of the product as well.
Generative reconstruction methods compute the 3D configuration (such as pose and/or geometry) of a shape by optimizing the overlap of the projected 3D shape model with images. Proper handling of occlusions is a big challenge, since the visibility function that indicates if a surface point is seen from a camera can often not be formulated in closed form, and is in general discrete and non-differentiable at occlusion boundaries. We present a new scene representation that enables an analytically differentiable closed-form formulation of surface visibility. In contrast to previous methods, this yields smooth, analytically differentiable, and efficient to optimize pose similarity energies with rigorous occlusion handling, fewer local minima, and experimentally verified improved convergence of numerical optimization. The underlying idea is a new image formation model that represents opaque objects by a translucent medium with a smooth Gaussian density distribution which turns visibility into a smooth phenomenon. We demonstrate the advantages of our versatile scene model in several generative pose estimation problems, namely marker-less multi-object pose estimation, marker-less human motion capture with few cameras, and image-based 3D geometry estimation.
