exoplanet: Gradient-based probabilistic inference for exoplanet data & other astronomical time series

Published by Daniel Foreman-Mackey
Publication date: 2021
Research field: Physics
Paper language: English





exoplanet is a toolkit for probabilistic modeling of astronomical time series data, with a focus on observations of exoplanets, using PyMC3 (Salvatier et al., 2016). PyMC3 is a flexible and high-performance model-building language and inference engine that scales well to problems with a large number of parameters. exoplanet extends PyMC3's modeling language to support many of the custom functions and probability distributions required when fitting exoplanet datasets or other astronomical time series. While it has been used for other applications, such as the study of stellar variability, the primary purpose of exoplanet is the characterization of exoplanets or multiple star systems using time-series photometry, astrometry, and/or radial velocity. In particular, the typical use case would be to use one or more of these datasets to place constraints on the physical and orbital parameters of the system, such as planet mass or orbital period, while simultaneously taking into account the effects of stellar variability.
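As a rough illustration of the kind of model the toolkit targets, the sketch below sets up a simple limb-darkened transit fit with PyMC3 and exoplanet and samples it with gradient-based NUTS. It is a minimal sketch, not code from the paper: the data arrays, the prior choices, and the exact exoplanet call signatures (KeplerianOrbit, LimbDarkLightCurve, QuadLimbDark) are assumed from the library's documentation and may differ between versions.

```python
# Minimal transit-fit sketch with exoplanet + PyMC3 (illustrative only).
import numpy as np
import pymc3 as pm
import exoplanet as xo

# Hypothetical light curve: times (days), relative flux, and uncertainties.
t = np.linspace(0, 10, 500)
flux = np.random.normal(0, 1e-3, t.size)
flux_err = 1e-3 * np.ones_like(t)

with pm.Model() as model:
    # Simple priors on the orbital and physical parameters.
    period = pm.Lognormal("period", mu=np.log(3.5), sigma=0.1)
    t0 = pm.Normal("t0", mu=1.5, sigma=0.1)
    r = pm.Uniform("r", lower=0.01, upper=0.2)   # planet/star radius ratio
    b = pm.Uniform("b", lower=0.0, upper=1.0)    # impact parameter
    u = xo.distributions.QuadLimbDark("u")       # quadratic limb darkening

    # Keplerian orbit and limb-darkened transit model.
    orbit = xo.orbits.KeplerianOrbit(period=period, t0=t0, b=b)
    light_curve = xo.LimbDarkLightCurve(u[0], u[1]).get_light_curve(
        orbit=orbit, r=r, t=t
    )
    model_flux = pm.math.sum(light_curve, axis=-1)

    # Gaussian likelihood and gradient-based (NUTS) sampling.
    pm.Normal("obs", mu=model_flux, sigma=flux_err, observed=flux)
    trace = pm.sample(tune=1000, draws=1000, target_accept=0.9)
```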




Read also

546 - R.L. Akeson, X. Chen, D. Ciardi 2013
We describe the contents and functionality of the NASA Exoplanet Archive, a database and tool set funded by NASA to support astronomers in the exoplanet community. The current content of the database includes interactive tables containing properties of all published exoplanets, Kepler planet candidates, threshold-crossing events, data validation reports and target stellar parameters, light curves from the Kepler and CoRoT missions and from several ground-based surveys, and spectra and radial velocity measurements from the literature. Tools provided to work with these data include a transit ephemeris predictor, both for single planets and for observing locations, light curve viewing and normalization utilities, and a periodogram and phased light curve service. The archive can be accessed at http://exoplanetarchive.ipac.caltech.edu.
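For readers who want programmatic access, the snippet below is a hedged sketch of a query against the archive's TAP service; the endpoint, the ps table, and the column names are taken from the archive's current documentation rather than from the paper, and may change over time.

```python
# Hedged sketch: query the NASA Exoplanet Archive TAP service for a few
# confirmed-planet columns (table and column names assumed from current docs).
import requests

query = (
    "select pl_name, pl_orbper, pl_rade "
    "from ps "
    "where default_flag = 1 and disc_facility like '%Kepler%'"
)
resp = requests.get(
    "https://exoplanetarchive.ipac.caltech.edu/TAP/sync",
    params={"query": query, "format": "csv"},
    timeout=60,
)
resp.raise_for_status()
print(resp.text.splitlines()[:5])  # header plus the first few rows
```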
The Exoplanet Imaging Data Challenge is a community-wide effort meant to offer a platform for a fair and common comparison of image processing methods designed for exoplanet direct detection. For this purpose, it gathers on a dedicated repository (Zenodo) data from several high-contrast ground-based instruments worldwide in which we injected synthetic planetary signals. The data challenge is hosted on the CodaLab competition platform, where participants can upload their results. The specifications of the data challenge are published on our website. The first phase, launched on the 1st of September 2019 and closed on the 1st of October 2020, consisted of detecting point sources in two types of data set common in the field of high-contrast imaging: data taken in pupil-tracking mode at one wavelength (subchallenge 1, also referred to as ADI) and multispectral data taken in pupil-tracking mode (subchallenge 2, also referred to as ADI mSDI). In this paper, we describe the approach, organisational lessons learnt, and current limitations of the data challenge, as well as preliminary results of the participants' submissions for this first phase. In the future, we plan to provide permanent access to the standard library of data sets and metrics, in order to guide the validation and support the publications of innovative image processing algorithms dedicated to high-contrast imaging of planetary systems.
The direct detection of exoplanets with high-contrast instruments can be boosted with high spectral resolution. For integral field spectrographs yielding hyperspectral data, this means that the field of view consists of diffracted starlight spectra and a spatially localized planet. Analysis usually relies on cross-correlation with theoretical spectra. In a purely blind-search context, this supervised strategy can be biased by model mismatch and/or be computationally inefficient. Using an approach inspired by the remote-sensing community, we aim to propose an alternative to cross-correlation that is fully data-driven and decomposes the data into a set of individual spectra and their corresponding spatial distributions. This strategy is called spectral unmixing. We used an orthogonal subspace projection to identify the most distinct spectra in the field of view. Their spatial distribution maps were then obtained by inverting the data. These spectra were then used to break the original hyperspectral images into their corresponding spatial distribution maps via non-negative least squares. The performance of our method was evaluated and compared with cross-correlation using simulated hyperspectral data at medium resolution from the ELT/HARMONI integral field spectrograph. We show that spectral unmixing effectively leads to a planet detection solely based on spectral dissimilarities at significantly reduced computational cost. The extracted spectrum holds significant signatures of the planet while not being perfectly separated from residual starlight. The sensitivity of the supervised cross-correlation is three to four times higher than that of unsupervised spectral unmixing, but the gap is biased toward the former because the injected and correlated spectra match perfectly. The algorithm was furthermore vetted on real VLT/SINFONI data of the beta Pictoris system.
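The per-pixel non-negative least-squares step described above can be sketched as follows. This is a toy reconstruction, not the authors' pipeline: it assumes the "most distinct" reference spectra have already been identified (here they are random stand-ins), and all dimensions are invented.

```python
# Toy spectral-unmixing sketch: given hyperspectral data Y (n_pixels x n_channels)
# and a small set of reference spectra S (n_sources x n_channels), recover
# non-negative spatial abundance maps by per-pixel non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_pix, n_chan, n_src = 64 * 64, 200, 3          # hypothetical dimensions
S = np.abs(rng.normal(size=(n_src, n_chan)))    # stand-in "most distinct" spectra
A_true = np.abs(rng.normal(size=(n_pix, n_src)))
Y = A_true @ S + 1e-3 * rng.normal(size=(n_pix, n_chan))  # synthetic data cube

# Solve min ||S^T a - y||_2 subject to a >= 0 independently for each pixel.
A = np.zeros((n_pix, n_src))
for i in range(n_pix):
    A[i], _ = nnls(S.T, Y[i])

maps = A.reshape(64, 64, n_src)  # one spatial distribution map per spectrum
```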
68 - I.J.M. Crossfield 2016
The study of extrasolar planets has rapidly expanded to encompass the search for new planets, measurements of sizes and masses, models of planetary interiors, planetary demographics and occurrence frequencies, the characterization of planetary orbits and dynamics, and studies of these worlds' complex atmospheres. Our insights into exoplanets advance dramatically whenever improved tools and techniques become available, and surely the largest tools now being planned are the optical/infrared Extremely Large Telescopes (ELTs). Two themes summarize the advantages of atmospheric studies with the ELTs: high angular resolution when operating at the diffraction limit, and high spectral resolution enabled by the unprecedented collecting area of these large telescopes. This brief review describes new opportunities afforded by the ELTs to study the composition, structure, dynamics, and evolution of these planets' atmospheres, while specifically focusing on some of the most compelling atmospheric science cases for four qualitatively different planet populations: highly irradiated gas giants; young, hot giant planets; old, cold gas giants; and small planets and Earth analogs.
70 - Hannu Parviainen 2017
Exoplanet research is carried out at the limits of the capabilities of current telescopes and instruments. The studied signals are weak, and often embedded in complex systematics from instrumental, telluric, and astrophysical sources. Combining repeated observations of periodic events, simultaneous observations with multiple telescopes, different observation techniques, and existing information from theory and prior research can help to disentangle the systematics from the planetary signals, and offers synergistic advantages over analysing observations separately. Bayesian inference provides a self-consistent statistical framework that addresses both the necessity for complex systematics models and the need to combine prior information and heterogeneous observations. This chapter offers a brief introduction to Bayesian inference in the context of exoplanet research, with a focus on time series analysis, and finishes with an overview of a set of freely available programming libraries.
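A toy illustration of the joint-inference idea described above: two heterogeneous data sets (mock transit times and mock radial velocities) constrain a shared orbital period through a single PyMC3 posterior. All numbers, priors, and the simplified sinusoidal RV model are invented for illustration and are not from the chapter.

```python
# Toy joint fit: transit times and radial velocities share one "period" parameter.
import numpy as np
import pymc3 as pm

rng = np.random.default_rng(1)
period_true = 3.2
t_transit = period_true * np.arange(10) + rng.normal(0, 0.01, 10)  # mock transit times
t_rv = rng.uniform(0, 30, 20)                                       # mock RV epochs
rv = 5.0 * np.sin(2 * np.pi * t_rv / period_true) + rng.normal(0, 1.0, 20)

with pm.Model():
    period = pm.Normal("period", mu=3.0, sigma=0.5)  # shared parameter
    t0 = pm.Normal("t0", mu=0.0, sigma=0.1)
    k = pm.HalfNormal("k", sigma=10.0)               # RV semi-amplitude

    # Each data set contributes its own likelihood term to the same model.
    pm.Normal("obs_transits", mu=t0 + period * np.arange(10),
              sigma=0.01, observed=t_transit)
    pm.Normal("obs_rv", mu=k * pm.math.sin(2 * np.pi * t_rv / period),
              sigma=1.0, observed=rv)

    trace = pm.sample(1000, tune=1000)
```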