
SigSpec - I. Frequency- and Phase-Resolved Significance in Fourier Space

Added by Piet Reegen
Publication date: 2007
Field: Physics
Language: English
Authors: P. Reegen





Identifying frequencies with low signal-to-noise ratios in time series of stellar photometry and spectroscopy, and measuring their amplitude ratios and peak widths accurately, are critical goals for asteroseismology. These tasks remain challenging for time series with gaps or non-constant sampling rates, even with modern Discrete Fourier Transform (DFT) software. Moreover, the False-Alarm Probability introduced by Lomb and Scargle is an approximation which becomes less reliable in time series with longer data gaps. A rigorous statistical treatment of how to determine the significance of a peak in a DFT, called SigSpec, is presented here. SigSpec is based on an analytical solution for the probability that a DFT peak of a given amplitude does not arise from white noise in a non-equally spaced data set. The underlying Probability Density Function (PDF) of the amplitude spectrum generated by white noise can be derived explicitly if both frequency and phase are incorporated into the solution. In this paper, I define and evaluate an unbiased statistical estimator, the spectral significance, which depends on frequency, amplitude, and phase in the DFT, and which takes into account the time-domain sampling. I also compare this estimator to results from other well-established techniques and demonstrate the effectiveness of SigSpec with a few examples of ground- and space-based photometric data, illustrating how SigSpec deals with the effects of noise and time-domain sampling in determining significant frequencies.



Related Research

146 - Louis Lyons 2013
We discuss the traditional criterion for discovery in Particle Physics of requiring a significance corresponding to at least 5 sigma, and whether a more nuanced approach might be better.
154 - G.Vianello 2017
Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (on measurement) is contrasted with a background-only observation free of the effect (off measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely-used formula from [LiMa], which assumes that both measurements are Poisson random variables. In this paper we study three other cases: i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, ii) the case where the background estimate $b$ in the off measurement has an additional systematic uncertainty, and iii) the case where $b$ is a Gaussian random variable instead of a Poisson random variable. The latter case applies when $b$ comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. In this case practitioners typically use a formula which is only valid when $b$ is large and its uncertainty very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how sensitive an estimate of significance is to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short Gamma-Ray Bursts and of new X-ray or $\gamma$-ray sources.
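The widely-used on/off counting formula the abstract refers to is Eq. 17 of Li & Ma (1983), which can be written down directly. Below is a small sketch, where `n_on` and `n_off` are the observed counts and `alpha` is the ratio of on-source to off-source exposure; the sign convention (negative significance for a deficit) is a common practical choice, not part of the original formula.

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983, Eq. 17) significance of an on/off counting
    measurement; alpha is the ratio of on- to off-source exposure."""
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    # The bracketed sum is a log-likelihood ratio and is non-negative;
    # abs() guards against tiny negative rounding, and the sign of the
    # excess n_on - alpha*n_off distinguishes excess from deficit.
    return math.copysign(math.sqrt(2 * abs(term_on + term_off)),
                         n_on - alpha * n_off)
```

With equal exposures (`alpha = 1`) and equal counts the significance is zero, as expected for no excess.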
102 - Gioacchino Ranucci 2005
The experimental issue of the search for new particles of unknown mass poses the challenge of exploring a wide interval to look for the usual signatures represented by an excess of events above the background. A side effect of such a broad-range quest is that the traditional significance calculations valid for signals of known location are no longer applicable when such information is missing. In this note the specific signal-search approach via observation windows sliding over the range of interest is considered; under the assumptions of known background and of fixed width of the exploring windows, the statistical implications of such a search scheme are described, with special emphasis on the correct significance assessment for a claimed discovery.
87 - I. Narsky 2005
An algorithm for optimization of signal significance or any other classification figure of merit suited for analysis of high energy physics (HEP) data is described. This algorithm trains decision trees on many bootstrap replicas of training data with each tree required to optimize the signal significance or any other chosen figure of merit. New data are then classified by a simple majority vote of the built trees. The performance of this algorithm has been studied using a search for the radiative leptonic decay B->gamma l nu at BaBar and shown to be superior to that of all other attempted classifiers including such powerful methods as boosted decision trees. In the B->gamma e nu channel, the described algorithm increases the expected signal significance from 2.4 sigma obtained by an original method designed for the B->gamma l nu analysis to 3.0 sigma.
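The abstract's algorithm trains full decision trees on bootstrap replicas and classifies by majority vote; the toy one-dimensional sketch below shows only that bootstrap-and-vote structure, with each replica optimizing the common significance-like figure of merit S/sqrt(S+B) via a single threshold cut. All function names are illustrative, and a single cut stands in for a real decision tree.

```python
import math
import random

def best_cut(signal, background):
    """Pick the 1-D threshold maximizing S / sqrt(S + B),
    counting events above the cut as signal-like."""
    candidates = sorted(set(signal) | set(background))
    def fom(cut):
        s = sum(1 for v in signal if v > cut)
        b = sum(1 for v in background if v > cut)
        return s / math.sqrt(s + b) if s + b > 0 else 0.0
    return max(candidates, key=fom)

def bagged_cuts(signal, background, n_replicas, rng):
    """Train one cut per bootstrap replica of the training data."""
    cuts = []
    for _ in range(n_replicas):
        s = [rng.choice(signal) for _ in signal]
        b = [rng.choice(background) for _ in background]
        cuts.append(best_cut(s, b))
    return cuts

def classify(value, cuts):
    """Majority vote of the bootstrap-trained cuts: True = signal-like."""
    votes = sum(1 for c in cuts if value > c)
    return votes * 2 > len(cuts)
```

The bootstrap resampling decorrelates the individual classifiers, so the majority vote is more stable than any single optimized cut.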
The MOST (Microvariability & Oscillations of STars) satellite obtains ultraprecise photometry from space with high sampling rates and duty cycles. Astronomical photometry or imaging missions in low Earth orbits, like MOST, are especially sensitive to scattered light from Earthshine, and all these missions have a common need to extract target information from voluminous data cubes. They consist of upwards of hundreds of thousands of two-dimensional CCD frames (or sub-rasters) containing from hundreds to millions of pixels each, where the target information, superposed on background and instrumental effects, is contained only in a subset of pixels (Fabry Images, defocussed images, mini-spectra). We describe a novel reduction technique for such data cubes: resolving linear correlations of target and background pixel intensities. This stepwise multiple linear regression removes only those target variations which are also detected in the background. The advantage of regression analysis versus background subtraction is the appropriate scaling, taking into account that the amount of contamination may differ from pixel to pixel. The multivariate solution for all pairs of target/background pixels is minimally invasive of the raw photometry while being very effective in reducing contamination due to, e.g., stray light. The technique is tested and demonstrated with both simulated oscillation signals and real MOST photometry.