
Improved signal detection algorithms for unevenly sampled data. Six signals in the radial velocity data for GJ876

Added by Dr James Jenkins
Publication date: 2014
Field: Physics
Language: English





The hunt for Earth analogue planets orbiting Sun-like stars has forced the introduction of novel methods to detect signals at, or below, the level of the intrinsic noise of the observations. We present a new global periodogram method that returns more information than the classic Lomb-Scargle periodogram method for radial velocity signal detection. Our method uses the Minimum Mean Squared Error as a framework to determine the optimal number of genuine signals present in a radial velocity time series using a global search algorithm, meaning we can discard noise spikes from the data before follow-up analysis. The method also recovers the phase and amplitude of the signals we detect, so we can track these quantities as a function of time to test whether the signals are stationary or non-stationary. We apply our method to the radial velocity data for GJ876 as a test system to highlight how the phase information can be used to select against non-stationary sources of detected signals in radial velocity data, such as rotational modulation of star spots. Analysis of this system yields two new statistically significant signals in the combined Keck and HARPS velocities, with periods of 10 and 15 days. Although a planet with a period of 15 days would correspond to a Laplace resonant chain configuration with three of the other planets (8:4:2:1), we stress that follow-up dynamical analyses are needed to test the reliability of such a six-planet system.
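For context, the classic Lomb-Scargle periodogram that the abstract improves on can be written in a few lines of NumPy. This is a minimal sketch on invented data (the signal period, sampling, and noise level here are illustrative, not taken from the GJ876 analysis):

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classic Lomb-Scargle periodogram for unevenly sampled data.

    t, y  : observation times and values (mean is subtracted internally)
    freqs : angular frequencies at which to evaluate the power
    """
    y = y - y.mean()
    power = np.empty_like(freqs)
    for i, w in enumerate(freqs):
        # The time offset tau makes the periodogram invariant to time shifts
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

# Uneven sampling: a 10-day sinusoid observed at 80 random epochs
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 200, 80))
y = 3.0 * np.sin(2 * np.pi * t / 10.0) + rng.normal(0, 0.5, t.size)

periods = np.linspace(5, 30, 2000)
power = lomb_scargle(t, y, 2 * np.pi / periods)
best = periods[np.argmax(power)]     # peak should sit near the 10-day period
```

Unlike an FFT, nothing here requires the time stamps to be evenly spaced, which is why this family of estimators is the standard starting point for radial velocity work.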




Related research

We present results from a data challenge posed to the radial velocity (RV) community: namely, to quantify the Bayesian evidence for n={0,1,2,3} planets in a set of synthetically generated RV datasets containing a range of planet signals. Participating teams were provided the same likelihood function and set of priors to use in their analysis. They applied a variety of methods to estimate Z, the marginal likelihood for each n-planet model, including cross-validation, the Laplace approximation, importance sampling, and nested sampling. We found the dispersion in Z across different methods grew with increasing n-planet models: ~3 for 0-planets, ~10 for 1-planet, ~100-1000 for 2-planets, and >10,000 for 3-planets. Most internal estimates of uncertainty in Z for individual methods significantly underestimated the observed dispersion across all methods. Methods that adopted a Monte Carlo approach by comparing estimates from multiple runs yielded plausible uncertainties. Finally, two classes of numerical algorithms (those based on importance and nested samplers) arrived at similar conclusions regarding the ratio of Zs for n and (n+1)-planet models. One analytic method (the Laplace approximation) demonstrated comparable performance. We express both optimism and caution: we demonstrate that it is practical to perform rigorous Bayesian model comparison for <=3-planet models, yet robust planet discoveries require researchers to better understand the uncertainty in Z and its connections to model selection.
N. C. Hara, G. Boue, J. Laskar (2019)
Eccentricity is a parameter of particular interest, as it is an informative indicator of the past of planetary systems. It is, however, not always clear whether the eccentricity fitted to radial velocity data is real or an artefact of inappropriate modelling. In this work, we address this question in two steps: we first assume that the model used for inference is correct and present interesting features of classical estimators. Secondly, we study whether the eccentricity estimates are to be trusted when the data contain incorrectly modelled signals, such as missed planetary companions, non-Gaussian noise, or correlated noise with unknown covariance. Our main conclusion is that data analysis via posterior distributions, with a model including a free error term, gives reliable results provided two conditions hold. First, convergence of the numerical methods needs to be ascertained. Secondly, the noise power spectrum should not have a particularly strong peak at the semi-period of the planet of interest. As a consequence, it is difficult to determine whether the signal of an apparently eccentric planet might be due to another inner companion in 2:1 mean motion resonance. We study the use of Bayes factors to disentangle these cases. Finally, we suggest methods to check whether there are hints of an incorrect model in the residuals. We demonstrate the performance of our methods on simulated data and comment on the eccentricities of Proxima b and 55 Cnc f.
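The eccentricity/resonance degeneracy described above can be seen directly: an eccentric Keplerian radial velocity curve carries most of its non-sinusoidal power in the first harmonic at P/2, so a sinusoid at P plus one at P/2 (mimicking an inner 2:1 companion) absorbs almost all of it. A self-contained NumPy sketch with illustrative parameter values:

```python
import numpy as np

def kepler_rv(t, P, K, e, omega=0.0, t0=0.0):
    """Radial velocity of a Keplerian orbit (Kepler's equation via Newton)."""
    M = 2 * np.pi * (t - t0) / P                 # mean anomaly
    E = M.copy()
    for _ in range(50):                           # solve E - e*sin(E) = M
        E -= (E - e * np.sin(E) - M) / (1 - e * np.cos(E))
    nu = 2 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                        np.sqrt(1 - e) * np.cos(E / 2))   # true anomaly
    return K * (np.cos(nu + omega) + e * np.cos(omega))

t = np.linspace(0, 60, 600)
rv = kepler_rv(t, P=30.0, K=50.0, e=0.3)          # one eccentric planet

def design(periods):
    """Linear basis of sinusoids at the given periods, plus an offset."""
    cols = []
    for P in periods:
        w = 2 * np.pi / P
        cols += [np.cos(w * t), np.sin(w * t)]
    cols.append(np.ones_like(t))
    return np.column_stack(cols)

rms_vals = []
for periods in ([30.0], [30.0, 15.0]):
    A = design(periods)
    coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
    rms_vals.append(np.sqrt(np.mean((rv - A @ coef) ** 2)))
# A second sinusoid at P/2 absorbs most of the "eccentric" residual,
# which is exactly why the two interpretations are hard to separate.
```

Only the higher harmonics (of order e^2 and smaller) distinguish the two hypotheses, hence the abstract's recourse to Bayes factors and residual diagnostics.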
Mikko Tuomi (2013)
Bayesian data analysis techniques, together with suitable statistical models, can be used to obtain much more information from noisy data than traditional frequentist methods. For instance, when searching for periodic signals in noisy data, Bayesian techniques can be used to define exact detection criteria for low-amplitude signals, the most interesting signals that might correspond to habitable planets. We present an overview of Bayesian techniques and detailed analyses of the HARPS-TERRA velocities of HD 40307, a nearby star observed to host a candidate habitable planet, to demonstrate in practice the applicability of Bayes' rule to astronomical data.
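As a simple illustration of a Bayesian-flavoured detection criterion (this is the generic BIC approximation to the Bayes factor on simulated data, not the exact criteria of the work above), compare a constant model against constant-plus-sinusoid:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 60))              # irregular epochs
y = 2.0 * np.sin(2 * np.pi * t / 7.0) + rng.normal(0, 1.0, t.size)
n = t.size

def gauss_max_ll(resid):
    """Gaussian maximum log-likelihood with the noise variance fitted."""
    s2 = np.mean(resid ** 2)
    return -0.5 * n * (np.log(2 * np.pi * s2) + 1.0)

# Model 0: constant only.  Model 1: constant + sinusoid at a 7 d trial period.
ll0, k0 = gauss_max_ll(y - y.mean()), 2           # mean + noise variance
w = 2 * np.pi / 7.0
A = np.column_stack([np.cos(w * t), np.sin(w * t), np.ones(n)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
ll1, k1 = gauss_max_ll(y - A @ coef), 4           # + two sinusoid amplitudes

bic0 = k0 * np.log(n) - 2 * ll0
bic1 = k1 * np.log(n) - 2 * ll1
ln_bayes_factor = 0.5 * (bic0 - bic1)   # BIC approximation to ln B_10
```

A positive `ln_bayes_factor` favours the signal model; the extra-parameter penalty in the BIC plays the role that the prior volume plays in a full Bayesian evidence calculation.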
The fully marginalized likelihood, or Bayesian evidence, is of great importance in probabilistic data analysis, because it is involved in calculating the posterior probability of a model or re-weighting a mixture of models conditioned on data. It is, however, extremely challenging to compute. This paper presents a geometric-path Monte Carlo method, inspired by multi-canonical Monte Carlo, to evaluate the fully marginalized likelihood. We show that the algorithm is fast, easy to implement, and produces a justified uncertainty estimate on the fully marginalized likelihood. The algorithm performs efficiently on a trial problem and on multi-companion model fitting for radial velocity data. For the trial problem, the algorithm returns the correct fully marginalized likelihood, and the estimated uncertainty is consistent with the standard deviation of results from multiple runs. We apply the algorithm to the problem of fitting radial velocity data from HIP 88048 ($\nu$ Oph) and Gliese 581. We evaluate the fully marginalized likelihood of 1, 2, 3, and 4-companion models given data from HIP 88048 and various choices of prior distributions. We consider prior distributions with three different minimum radial velocity amplitudes $K_{\mathrm{min}}$. Under all three priors, the 2-companion model has the largest marginalized likelihood, but the detailed values depend strongly on $K_{\mathrm{min}}$. We also evaluate the fully marginalized likelihood of 3, 4, 5, and 6-planet models given data from Gliese 581 and find that the fully marginalized likelihood of the 5-planet model is too close to that of the 6-planet model for us to confidently decide between them.
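In the same spirit as path-based evidence estimators (though far simpler than the geometric-path method described above), a thermodynamic-integration sketch on a toy Gaussian model where the exact evidence is known. Everything here (model, data, prior, rung schedule) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(2.0, 1.0, 30)    # data: y_i ~ N(mu, 1), true mu = 2
s2 = 9.0                         # prior: mu ~ N(0, s2)
n, S, Q = y.size, y.sum(), np.sum(y ** 2)

# Exact log-evidence: marginally y ~ N(0, I + s2 * 11^T)
C = np.eye(n) + s2 * np.ones((n, n))
_, logdet = np.linalg.slogdet(C)
logZ_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * logdet
              - 0.5 * y @ np.linalg.solve(C, y))

# Thermodynamic integration: log Z = integral over beta in [0,1] of
# E_beta[log L], where the "power posterior" tempers the likelihood:
# p_beta(mu) is proportional to L(mu)^beta * prior(mu).
# For this model every p_beta is Gaussian, so each rung is sampled exactly.
betas = np.concatenate(([0.0], np.geomspace(1e-5, 1.0, 120)))
E_logL = []
for b in betas:
    prec = b * n + 1.0 / s2                       # tempered precision
    mus = rng.normal(b * S / prec, 1.0 / np.sqrt(prec), 4000)
    logL = (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * (Q - 2 * mus * S + n * mus ** 2))
    E_logL.append(logL.mean())
E_logL = np.array(E_logL)

# Trapezoid rule over the (geometrically spaced) beta ladder
logZ_ti = np.sum(0.5 * (E_logL[1:] + E_logL[:-1]) * np.diff(betas))
```

The geometric spacing of the rungs matters: the integrand changes fastest near beta = 0, which is also why more sophisticated path constructions (like the geometric path of the paper) pay off on harder problems.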
Astronomical surveys of celestial sources produce streams of noisy time series measuring flux versus time (light curves). Unlike in many other physical domains, however, large (and source-specific) temporal gaps in data arise naturally due to intranight cadence choices as well as diurnal and seasonal constraints. With nightly observations of millions of variable stars and transients from upcoming surveys, efficient and accurate discovery and classification techniques on noisy, irregularly sampled data must be employed with minimal human-in-the-loop involvement. Machine learning for inference tasks on such data traditionally requires the laborious hand-coding of domain-specific numerical summaries of raw data (features). Here we present a novel unsupervised autoencoding recurrent neural network (RNN) that makes explicit use of sampling times and known heteroskedastic noise properties. When trained on optical variable star catalogs, this network produces supervised classification models that rival other best-in-class approaches. We find that autoencoded features learned on one time-domain survey perform nearly as well when applied to another survey. These networks can continue to learn from new unlabeled observations and may be used in other unsupervised tasks such as forecasting and anomaly detection.
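The architectural idea above, feeding the sampling gaps and per-point uncertainties to the network as inputs rather than interpolating them away, can be shown with an untrained vanilla RNN forward pass. This is a schematic NumPy sketch of that input construction, not the authors' autoencoder:

```python
import numpy as np

rng = np.random.default_rng(4)

# An irregularly sampled light curve: times, fluxes, per-point uncertainties
t = np.sort(rng.uniform(0, 50, 40))
flux = np.sin(2 * np.pi * t / 12.0) + rng.normal(0, 0.1, t.size)
sigma = np.full(t.size, 0.1)

# Input at each step: (flux, time since previous observation, noise level),
# so the network sees the sampling pattern and the heteroskedastic errors
dt = np.diff(t, prepend=t[0])
X = np.column_stack([flux, dt, sigma])            # shape (T, 3)

def rnn_encode(X, H=16, seed=0):
    """One pass of a vanilla (untrained) RNN encoder; returns final state."""
    rng = np.random.default_rng(seed)
    Wx = rng.normal(0, 0.3, (H, X.shape[1]))      # input-to-hidden weights
    Wh = rng.normal(0, 0.3, (H, H))               # hidden-to-hidden weights
    h = np.zeros(H)
    for x in X:
        h = np.tanh(Wx @ x + Wh @ h)
    return h                                       # fixed-length embedding

z = rnn_encode(X)   # a 16-dimensional summary of the whole light curve
```

In the autoencoding setup, a decoder would reconstruct the fluxes from `z` at the observed times, and the trained `z` then serves as the learned feature vector for downstream classification.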