
Discrete Chi-square Method for Detecting Many Signals

Added by: Lauri Jetsu
Publication date: 2020
Language: English
Authors: Lauri Jetsu





Unambiguous detection of signals superimposed on unknown trends is difficult for unevenly spaced data. Here, we formulate the Discrete Chi-square Method (DCM), which can determine the best model for many signals superimposed on arbitrary polynomial trends. DCM minimizes the Chi-square for the data in the multi-dimensional space of tested frequencies. The required number of tested frequency combinations remains manageable because the test statistic of the method is symmetric in this frequency space. Because the tested frequencies are known constant grid values, the otherwise non-linear DCM model becomes linear, and all results become unambiguous. We test DCM with simulated data containing different mixtures of signals and trends. DCM gives unambiguous results if the signal frequencies are not too close to each other and none of the signals is too weak. It relies on brute computational force, because all possible free-parameter combinations for all reasonable linear models are tested: DCM works like winning a lottery by buying all of the tickets. Anyone can reproduce all our results with the DCM computer code. All files, variables and other program-code-related items are printed in magenta colour. Our Appendix gives detailed instructions for using dcm.py. We also present one preliminary real use case, where DCM is applied to the observed (O) minus computed (C) eclipse epochs of the binary star XZ And. This DCM analysis reveals evidence for the possible presence of a third and a fourth body in this system. One recent study of a very large sample of binary stars indicated that the probability of detecting a fourth body from the O-C data of eclipsing binaries is only about 0.00005.
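The core idea in the abstract can be sketched in a few lines: once the trial frequencies are fixed to grid constants, the model (polynomial trend plus sinusoids) is linear in its coefficients, so each frequency combination is fit by weighted least squares, and the combination minimizing the Chi-square wins. This is a minimal illustrative sketch under assumed toy data (two sinusoids on a linear trend), not the authors' dcm.py; all variable names and grid choices here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy unevenly spaced data: two sinusoids on a linear trend (assumed values).
t = np.sort(rng.uniform(0.0, 10.0, 200))
f1_true, f2_true = 0.31, 0.52
y = (0.4 * t - 1.0
     + np.sin(2 * np.pi * f1_true * t)
     + 0.6 * np.sin(2 * np.pi * f2_true * t)
     + rng.normal(0.0, 0.1, t.size))
sigma = np.full_like(t, 0.1)

def chi2_at(f1, f2):
    """For fixed trial frequencies the model is linear in its coefficients,
    so it is fit directly by weighted least squares."""
    X = np.column_stack([
        np.ones_like(t), t,                              # polynomial trend (order 1)
        np.cos(2*np.pi*f1*t), np.sin(2*np.pi*f1*t),      # first signal
        np.cos(2*np.pi*f2*t), np.sin(2*np.pi*f2*t),      # second signal
    ])
    coef, *_ = np.linalg.lstsq(X / sigma[:, None], y / sigma, rcond=None)
    r = (y - X @ coef) / sigma
    return np.sum(r ** 2)

# Brute-force grid scan; symmetry in (f1, f2) means only f1 < f2 need testing,
# which is what keeps the number of tested combinations manageable.
grid = np.arange(0.05, 1.0, 0.01)
best = min(((chi2_at(a, b), a, b)
            for i, a in enumerate(grid) for b in grid[i + 1:]),
           key=lambda v: v[0])
print(best)  # (minimum chi-square, best f1, best f2)
```

With strong, well-separated signals the minimum lands on the grid points nearest the true frequencies, mirroring the abstract's "winning the lottery by buying all tickets" exhaustive search.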




In this article we derive an unbiased expression for the expected mean-squared error associated with continuously differentiable estimators of the noncentrality parameter of a chi-square random variable. We then consider the task of denoising squared-magnitude magnetic resonance image data, which are well modeled as independent noncentral chi-square random variables on two degrees of freedom. We consider two broad classes of linearly parameterized shrinkage estimators that can be optimized using our risk estimate, one in the general context of undecimated filterbank transforms, and another in the specific case of the unnormalized Haar wavelet transform. The resultant algorithms are computationally tractable and improve upon state-of-the-art methods for both simulated and actual magnetic resonance image data.
We apply the holonomic gradient method to compute the distribution function of a weighted sum of independent noncentral chi-square random variables. It is the distribution function of the squared length of a multivariate normal random vector. We treat this distribution as an integral of the normalizing constant of the Fisher-Bingham distribution on the unit sphere and make use of the partial differential equations for the Fisher-Bingham distribution.
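The holonomic gradient method in the abstract above is specialized machinery, but the target quantity itself is easy to state and to estimate by simpler means. As a hedged point of comparison, a Monte Carlo sketch (not the paper's method; the function name and sample size are assumptions) of the distribution function of a weighted sum of independent noncentral chi-square variables:

```python
import numpy as np

rng = np.random.default_rng(0)

def wsum_ncchi2_cdf(x, weights, dofs, noncs, n=200_000):
    """Monte Carlo estimate of P(sum_i w_i * Q_i <= x), where the Q_i are
    independent noncentral chi-square(df_i, lambda_i) variables."""
    total = np.zeros(n)
    for w, k, lam in zip(weights, dofs, noncs):
        total += w * rng.noncentral_chisquare(k, lam, n)
    return np.mean(total <= x)

# Example: evaluate the CDF at the mean of the sum,
# E[w1*Q1 + w2*Q2] = 1*(2+1) + 2*(3+1) = 11.
print(wsum_ncchi2_cdf(11.0, [1.0, 2.0], [2, 3], [1.0, 1.0]))
```

The squared length of a multivariate normal vector reduces to exactly such a sum after diagonalizing the covariance, which is why this distribution appears in the abstract.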
54 - Alisha Chromey 2019
Gamma ray observations from a few hundred MeV up to tens of TeV are a valuable tool for studying particle acceleration and diffusion within our galaxy. Constructing a coherent physical picture of particle accelerators such as supernova remnants, pulsar wind nebulae, and star-forming regions requires the ability to detect extended regions of gamma ray emission, to analyze small-scale spatial variation within these regions, and to synthesize data from multiple observatories across multiple wavebands. Imaging atmospheric Cherenkov telescopes (IACTs) provide fine angular resolution (<0.1$^\circ$) for gamma rays above 100 GeV. However, their limited fields of view typically make detection of extended sources challenging. Maximum likelihood methods are well-suited to simultaneous analysis of multiple fields with overlapping sources and to combining data from multiple gamma ray observatories. Such methods also offer an alternative approach to estimating the IACT cosmic ray background and consequently an enhanced sensitivity to sources that may be as large as the telescope field of view. We report here on the current status and performance of a maximum likelihood technique for the IACT VERITAS.
248 - Ramin Zahedi, Ali Pezeshki, 2011
We consider the problem of testing for the presence (or detection) of an unknown sparse signal in additive white noise. Given a fixed measurement budget, much smaller than the dimension of the signal, we consider the general problem of designing compressive measurements to maximize the measurement signal-to-noise ratio (SNR), as increasing SNR improves the detection performance in a large class of detectors. We use a lexicographic optimization approach, where the optimal measurement design for sparsity level $k$ is sought only among the set of measurement matrices that satisfy the optimality conditions for sparsity level k-1. We consider optimizing two different SNR criteria, namely a worst-case SNR measure, over all possible realizations of a k-sparse signal, and an average SNR measure with respect to a uniform distribution on the locations of the up to k nonzero entries in the signal. We establish connections between these two criteria and certain classes of tight frames. We constrain our measurement matrices to the class of tight frames to avoid coloring the noise covariance matrix. For the worst-case problem, we show that the optimal measurement matrix is a Grassmannian line packing for most---and a uniform tight frame for all---sparse signals. For the average SNR problem, we prove that the optimal measurement matrix is a uniform tight frame with minimum sum-coherence for most---and a tight frame for all---sparse signals.
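The tight frames central to the abstract above have a concrete, checkable definition: a measurement matrix whose rows satisfy $FF^T = (n/m)I$, which is what keeps the noise covariance white. A minimal sketch with the classic Mercedes-Benz frame (three unit vectors at 120$^\circ$ in $\mathbb{R}^2$, chosen here purely for illustration, not taken from the paper):

```python
import numpy as np

# Mercedes-Benz frame: n = 3 unit-norm vectors at 120 degrees in m = 2 dims.
angles = np.pi / 2 + np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
F = np.stack([np.cos(angles), np.sin(angles)])   # 2 x 3 measurement matrix

# Tightness: F F^T = (n/m) I = 1.5 I, so measuring with F does not
# color white noise.
print(F @ F.T)

# Uniform (equal-norm) columns with equal pairwise coherence 1/2, which
# meets the Welch bound sqrt((n-m)/(m(n-1))) = 1/2 -- an optimal
# Grassmannian line packing, the worst-case-SNR optimum in the abstract.
G = np.abs(F.T @ F - np.eye(3))
print(G.max())
```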
356 - A. Barreira, P. P. Avelino, 2011
In this paper we compare the performance of the chi-square and median likelihood analyses in the determination of cosmological constraints using type Ia supernovae data. We perform a statistical analysis using the 307 supernovae of the Union 2 compilation of the Supernova Cosmology Project and find that the chi-square statistical analysis yields tighter cosmological constraints than the median statistic if only supernova data are taken into account. We also show that when additional measurements from the Cosmic Microwave Background and Baryonic Acoustic Oscillations are considered, the combined cosmological constraints are not strongly dependent on whether one applies the chi-square statistic or the median statistic to the supernova data. This indicates that, when complementary information from other cosmological probes is taken into account, the performances of the chi-square and median statistics are very similar, demonstrating the robustness of the statistical analysis.
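The two statistics compared in the abstract above can be contrasted on a toy one-parameter fit. This is a hedged sketch on synthetic data (not the Union 2 sample or the paper's pipeline): chi-square weights each residual by its error bar, while the median statistic only assumes each point is equally likely to fall above or below the true model, so the count above is Binomial(N, 1/2).

```python
import numpy as np
from math import comb

rng = np.random.default_rng(3)

# Toy data: constant model y = mu with mu_true = 5 and heteroscedastic noise.
mu_true = 5.0
sigma = rng.uniform(0.5, 2.0, 40)
y = mu_true + sigma * rng.standard_normal(40)

def chi2(mu):
    """Standard chi-square: error-weighted squared residuals."""
    return np.sum(((y - mu) / sigma) ** 2)

def median_like(mu):
    """Median-statistic likelihood: number of points above the model
    is Binomial(N, 1/2) under the true model; error bars are unused."""
    k = int(np.sum(y > mu))
    return comb(y.size, k) * 0.5 ** y.size

grid = np.linspace(3.0, 7.0, 401)
mu_chi2 = grid[np.argmin([chi2(m) for m in grid])]
mu_med = grid[np.argmax([median_like(m) for m in grid])]
print(mu_chi2, mu_med)  # both near 5; chi-square typically tighter
```

Because chi-square uses the error bars it normally constrains the parameter more tightly, matching the supernova-only result in the abstract; the median statistic trades that precision for robustness against misestimated errors.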