
The Multi-Mission Maximum Likelihood framework (3ML)

Added by Giacomo Vianello
Publication date: 2015
Field: Physics
Language: English





Astrophysical sources are now observed by many different instruments at different wavelengths, from radio to high-energy gamma-rays, with unprecedented quality. Putting all these data together to form a coherent view, however, is a very difficult task. Each instrument has its own data format, software and analysis procedure, which are difficult to combine. It is, for example, very challenging to perform a broadband fit of the energy spectrum of a source. The Multi-Mission Maximum Likelihood framework (3ML) aims to solve this issue by providing a common framework which allows for a coherent modeling of sources using all the available data, independent of their origin. At the same time, thanks to its plug-in-based architecture, 3ML uses the existing official software of each instrument for the corresponding data in a way which is transparent to the user. 3ML is based on the likelihood formalism, in which a model summarizing our knowledge about a particular region of the sky is convolved with the instrument response and compared to the corresponding data. The user can choose between a frequentist analysis and a Bayesian analysis. In the former, the parameters of the model are optimized in order to obtain the best match to the data (i.e., the maximum of the likelihood). In the latter, the priors specified by the user are used to build the posterior distribution, which is then sampled with Markov Chain Monte Carlo or MultiNest. Our implementation of this idea is very flexible, allowing the study of point sources as well as extended sources with arbitrary spectra. We will review the problem we aim to solve, the 3ML concepts and its innovative potential.
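To make the plug-in idea concrete, below is a minimal conceptual sketch in Python of how a plugin-based joint likelihood can work. This is not the actual threeML API: `PoissonPlugin`, `powerlaw`, the trivial exposure-times-area "response" and all numbers are invented for illustration. Each plugin computes a log-likelihood for its own instrument's data given a shared spectral model, and the frequentist fit simply maximizes the sum of the per-instrument log-likelihoods.

```python
# Conceptual sketch of a plugin-based joint likelihood (hypothetical names,
# not the threeML API). Each plugin wraps one instrument's data and returns
# a log-likelihood for the shared source model; the joint fit maximizes the
# sum over instruments.
import numpy as np
from scipy.optimize import minimize

def powerlaw(energy, norm, index):
    """Shared source model: a simple power-law photon spectrum."""
    return norm * energy ** index

class PoissonPlugin:
    """Toy stand-in for an instrument plugin (hypothetical)."""
    def __init__(self, energies, counts, exposure, eff_area):
        self.energies, self.counts = energies, counts
        self.exposure, self.eff_area = exposure, eff_area

    def log_like(self, norm, index):
        # Expected counts: model flux times a trivial "response"
        # (exposure x effective area); real plugins convolve the model
        # with the instrument's full response.
        mu = powerlaw(self.energies, norm, index) * self.exposure * self.eff_area
        return np.sum(self.counts * np.log(mu) - mu)  # Poisson log-likelihood

def joint_neg_log_like(params, plugins):
    log_norm, index = params
    norm = np.exp(log_norm)  # log-parametrization keeps the norm positive
    return -sum(p.log_like(norm, index) for p in plugins)

# Two hypothetical instruments observing the same source in different bands.
rng = np.random.default_rng(0)
e1, e2 = np.geomspace(1.0, 10.0, 20), np.geomspace(50.0, 500.0, 20)
inst1 = PoissonPlugin(e1, rng.poisson(powerlaw(e1, 100.0, -2.0) * 10 * 0.5), 10, 0.5)
inst2 = PoissonPlugin(e2, rng.poisson(powerlaw(e2, 100.0, -2.0) * 50 * 0.1), 50, 0.1)

# Frequentist joint fit: maximize the summed likelihood over both datasets.
res = minimize(joint_neg_log_like, x0=[np.log(50.0), -1.5],
               args=([inst1, inst2],), method="Nelder-Mead")
print("Best-fit norm and index:", np.exp(res.x[0]), res.x[1])
```

In 3ML the same pattern is realized with each instrument's official software wrapped behind a plugin, and the Bayesian option replaces the optimizer with a sampler over the user-specified priors.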



Related research

Alisha Chromey (2021)
Gamma-ray observations ranging from hundreds of MeV to tens of TeV are a valuable tool for studying particle acceleration and diffusion within our galaxy. Supernova remnants, pulsar wind nebulae, and star-forming regions are the main particle accelerators in our local Galaxy. Constructing a coherent physical picture of these astrophysical objects requires the ability to distinguish extended regions of gamma-ray emission, the ability to analyze small-scale spatial variation within these regions, and methods to synthesize data from multiple observatories across multiple wavebands. Imaging Atmospheric Cherenkov Telescopes (IACTs) provide fine angular resolution (<0.1 degree) for gamma-rays above 100 GeV. Typical data reduction methods rely on source-free regions in the field of view to estimate the cosmic-ray background. This presents difficulties for sources with unknown extent or those which encompass a large portion of the IACT field of view (3.5 degrees for VERITAS). Maximum-likelihood-based techniques are well suited for analyzing fields with multiple overlapping sources and diffuse background components, and for combining data from multiple observatories. Such methods also offer an alternative approach to estimating the IACT cosmic-ray background and consequently an enhanced sensitivity to largely extended sources. In this proceeding, we report on the current status and performance of a maximum likelihood technique for the IACT VERITAS. In particular, we focus on how our framework incorporates gamma-hadron separation parameters as an additional likelihood dimension in order to improve sensitivity to extended sources.
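As a rough illustration of the extra-dimension idea in the abstract above (a hypothetical sketch, not the VERITAS pipeline), the fragment below fits a binned Poisson likelihood over a gamma-hadron separation parameter. Because gamma-ray and cosmic-ray events have different template shapes along this axis, the signal and background normalizations can be fitted simultaneously, without source-free regions. The templates and all numbers are invented.

```python
# Hypothetical sketch: a binned Poisson likelihood over a gamma-hadron
# separation parameter (e.g. a mean-scaled-width-like variable).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
sep = np.linspace(0.0, 2.0, 20)  # separation-parameter bin centres (invented)

# Invented normalized template shapes: gammas peak low, hadrons are broad.
gamma_t = np.exp(-0.5 * ((sep - 0.3) / 0.15) ** 2)
hadron_t = np.exp(-0.5 * ((sep - 1.0) / 0.60) ** 2)
gamma_t, hadron_t = gamma_t / gamma_t.sum(), hadron_t / hadron_t.sum()

def expected(n_gamma, n_hadron):
    # Expected counts per bin: sum of the two scaled templates.
    return n_gamma * gamma_t + n_hadron * hadron_t

data = rng.poisson(expected(300.0, 3000.0))  # simulated observed counts

def neg_log_like(log_params):
    mu = expected(*np.exp(log_params))  # exp() keeps both rates positive
    return np.sum(mu - data * np.log(mu))  # Poisson, dropping constants

res = minimize(neg_log_like, x0=np.log([100.0, 1000.0]), method="Nelder-Mead")
print("Fitted gamma and hadron counts:", np.exp(res.x))
```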
We consider a sample of $82$ non-repeating FRBs detected at Parkes, ASKAP, CHIME and UTMOST, each of which operates over a different frequency range and has different detection criteria. Using simulations, we perform a maximum likelihood analysis to determine the FRB population model which best fits these data. Our analysis shows that models where the pulse scatter broadening increases moderately with redshift ($z$) are preferred over those where it increases very sharply or where scattering is absent. Further, models where the comoving event rate density is constant over $z$ are preferred over those where it follows the cosmological star formation rate. Two models for the host dispersion measure ($DM_{\rm host}$) distribution (a fixed and a random $DM_{\rm host}$) are found to predict comparable results. We obtain the best-fit parameter values $\alpha=-1.53^{+0.29}_{-0.19}$, $\overline{E}_{33}=1.55^{+0.26}_{-0.22}$ and $\gamma=0.77\pm 0.24$. Here $\alpha$ is the spectral index, $\gamma$ is the exponent of the Schechter luminosity function and $\overline{E}_{33}$ is the mean FRB energy in units of $10^{33}\,{\rm J}$ across $2128-2848\,{\rm MHz}$ in the FRB rest frame.
Matwey V. Kornilov (2019)
We present a novel technique for estimating disk parameters (the centre and the radius) from a 2D image. It is based on the maximum likelihood approach, utilising both edge-pixel coordinates and the image intensity gradients. We emphasise the following advantages of our likelihood model. It has closed-form formulae for parameter estimation, and therefore requires fewer computational resources than iterative algorithms. The likelihood model naturally distinguishes the outer and inner annulus edges. The proposed technique was evaluated on both synthetic and real data.
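To illustrate why closed-form estimators need no iteration, here is a sketch of the classic Kåsa least-squares circle fit from edge pixels alone (not the authors' gradient-augmented likelihood): rewriting $(x-a)^2 + (y-b)^2 = r^2$ as $2ax + 2by + c = x^2 + y^2$, with $c = r^2 - a^2 - b^2$, makes the problem linear in the unknowns.

```python
# Minimal closed-form circle fit from edge-pixel coordinates (Kasa fit),
# shown only to illustrate the closed-form idea; the paper's estimator
# additionally uses image intensity gradients.
import numpy as np

def fit_circle(x, y):
    """Return (centre_x, centre_y, radius) via linear least squares."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x**2 + y**2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return a, b, np.sqrt(c + a**2 + b**2)

# Noisy edge pixels sampled from a circle of radius 40 centred at (100, 120).
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 200)
x = 100 + 40 * np.cos(theta) + rng.normal(0, 0.5, 200)
y = 120 + 40 * np.sin(theta) + rng.normal(0, 0.5, 200)
print(fit_circle(x, y))  # approximately (100, 120, 40)
```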
We report on the results of a search for a Weakly Interacting Massive Particle (WIMP) signal in low-energy data of the Cryogenic Dark Matter Search (CDMS II) experiment using a maximum likelihood analysis. A background model is constructed using GEANT4 to simulate the surface-event background from $^{210}$Pb decay-chain events, while independent calibration data are used to model the gamma background. Fitting this background model to the data results in no statistically significant WIMP component. In addition, we perform fits using an analytic ad hoc background model proposed by Collar and Fields, who claimed to find a large excess of signal-like events in our data. We confirm the strong preference for a signal hypothesis in their analysis under these assumptions, but excesses are observed in both single- and multiple-scatter events, which implies the signal is not caused by WIMPs but rather reflects the inadequacy of their background model.
We investigate the impact of instrumental systematic errors on interferometric measurements of the cosmic microwave background (CMB) temperature and polarization power spectra. We simulate interferometric CMB observations to generate mock visibilities and estimate power spectra using the statistically optimal maximum likelihood technique. We define a quadratic error measure to determine allowable levels of systematic error that do not induce power spectrum errors beyond a given tolerance. As an example, in this study we focus on differential pointing errors; the effects of other systematics can be simulated by this pipeline in a straightforward manner. We find that, in order to accurately recover the underlying B-modes for $r = 0.01$ at $28 < l < 384$, Gaussian-distributed pointing errors must be controlled to $0.7^\circ$ rms for an interferometer with an antenna configuration similar to QUBIC, in agreement with analytical estimates. Only the statistical uncertainty for $28 < l < 88$ would be changed at the ~10% level. With the same instrumental configuration, we find the pointing errors would slightly bias the $2\sigma$ upper limit of the tensor-to-scalar ratio $r$ by ~10%. We also show that the impact of pointing errors on the TB and EB measurements is negligibly small.
