
Unlocking starlight subtraction in full data rate exoplanet imaging by efficiently updating Karhunen-Loève eigenimages

Posted by: Joseph Long
Publication date: 2021
Research field: Physics
Paper language: English





Starlight subtraction algorithms based on the method of Karhunen-Loève eigenimages (KLIP) have proved invaluable to exoplanet direct imaging. However, they scale poorly in runtime when paired with differential imaging techniques. In such observations, reference frames and frames to be starlight-subtracted are drawn from the same set of data, requiring a new subset of references (and eigenimages) for each frame processed to avoid self-subtraction of the signal of interest. The data rates of extreme adaptive optics instruments are such that the only way to make this computationally feasible has been to downsample the data. We develop a technique that updates a pre-computed singular value decomposition of the full dataset to remove frames (i.e., a downdate) without a full recomputation, yielding the modified eigenimages. This not only enables analysis of much larger data volumes in the same amount of time, but also exhibits near-linear scaling in runtime as the number of observations increases. We apply this technique to archival data and investigate its scaling behavior for very large numbers of frames $N$. The resulting algorithm provides speed improvements of $2.6\times$ (for 200 eigenimages at $N = 300$) to $140\times$ (at $N = 10^4$), with the advantage only increasing as $N$ grows. This algorithm has allowed us to substantially accelerate KLIP even for modest $N$, and will let us quickly explore how KLIP parameters affect exoplanet characterization in large-$N$ datasets.
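The core operation can be sketched as follows. The KL eigenimages are the left singular vectors of the image matrix, so removing one frame from the reference set amounts to a rank-one downdate of a precomputed thin SVD. The NumPy sketch below is a minimal Brand-style illustration, not the authors' implementation: it zeroes out the removed column via a rank-one update, restores diagonal form with a small $k \times (k+1)$ SVD, and drops the now-zero column, so the cost is a few thin matrix products instead of a fresh decomposition.

import numpy as np

def svd_downdate(U, s, Vt, j):
    # Thin SVD of X with column (frame) j removed, given X = U @ diag(s) @ Vt.
    # Zeroing column j is the rank-one update X + a @ b.T with a = -X[:, j]
    # and b = e_j; because a lies in span(U) when the SVD is exact, only a
    # small (k x (k+1)) core matrix must be re-decomposed.
    V = Vt.T
    n = V[j, :].copy()                  # V^T e_j: frame j's coordinates
    m = -s * n                          # U^T a, with a = -X[:, j]
    q = -V @ n                          # component of e_j outside span(V)
    q[j] += 1.0
    rb = np.linalg.norm(q)
    Q = q / rb if rb > 1e-12 else q     # q == 0 if e_j already lies in span(V)
    # Core factorization: X' = U @ [diag(s) + m n^T | rb*m] @ [V, Q]^T
    K = np.hstack([np.diag(s) + np.outer(m, n), (rb * m)[:, None]])
    Uk, s_new, Vtk = np.linalg.svd(K, full_matrices=False)  # cheap: k x (k+1)
    U_new = U @ Uk                      # updated eigenimages (left vectors)
    V_new = np.hstack([V, Q[:, None]]) @ Vtk.T
    V_new = np.delete(V_new, j, axis=0) # column j of X' is exactly zero; drop it
    return U_new, s_new, V_new.T

# Quick check against a from-scratch SVD on random data:
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 40))      # 500-pixel images, 40 frames
U, s, Vt = np.linalg.svd(X, full_matrices=False)
U2, s2, _ = svd_downdate(U, s, Vt, j=7)
s_ref = np.linalg.svd(np.delete(X, 7, axis=1), compute_uv=False)
print(np.allclose(s2[:s_ref.size], s_ref))

Because the small SVD is only $O(k^3)$ for $k$ retained eigenimages, applying this once per removed frame is far cheaper than recomputing the full decomposition for every reference subset, which is the source of the near-linear scaling claimed in the abstract.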




Read also

We present a new physics-informed machine learning approach for the inversion of PDE models with heterogeneous parameters. In our approach, the space-dependent, partially observed parameters and states are approximated via Karhunen-Loève expansions (KLEs). Each of these KLEs is then conditioned on its corresponding measurements, resulting in low-dimensional models of the parameters and states that resolve observed data. Finally, the coefficients of the KLEs are estimated by minimizing the norm of the residual of the PDE model evaluated at a finite set of points in the computational domain, ensuring that the reconstructed parameters and states are consistent with both the observations and the PDE model to an arbitrary level of accuracy. In our approach, KLEs are constructed using the eigendecomposition of covariance models of spatial variability. For the model parameters, we employ a parameterized covariance model calibrated on parameter observations; for the model states, the covariance is estimated from a number of forward simulations of the PDE model corresponding to realizations of the parameters drawn from their KLE. We apply the proposed approach to identifying heterogeneous log-diffusion coefficients in diffusion equations from spatially sparse measurements of the log-diffusion coefficient and the solution of the diffusion equation. We find that the proposed approach compares favorably against state-of-the-art point estimates such as maximum a posteriori estimation and physics-informed neural networks.
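As a toy illustration of the ingredients (KL modes from a covariance eigendecomposition, plus joint minimization of PDE residual and data misfit), the sketch below inverts a 1D diffusion equation. Everything here is an invented minimal setup: the grid, covariance, observation locations, and weights are placeholders, and for brevity the state reuses the same KL basis rather than a covariance estimated from forward simulations as in the paper.

import numpy as np
from scipy.optimize import least_squares

nx = 101
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]
C = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.2 ** 2)  # SE covariance
lam, phi = np.linalg.eigh(C)
lam, phi = lam[::-1], phi[:, ::-1]                 # descending eigenvalues
modes_y = phi[:, :8] * np.sqrt(lam[:8])            # KL modes, log-diffusion
modes_u = phi[:, :12] * np.sqrt(lam[:12])          # KL modes, state (simplified)

rng = np.random.default_rng(1)
y_true = modes_y @ rng.standard_normal(8)          # synthetic truth
f = np.ones(nx)                                    # source in -(exp(y) u')' = f

def solve_pde(y):
    # Forward finite-difference solve, used only to make synthetic data.
    kh = 0.5 * (np.exp(y)[:-1] + np.exp(y)[1:])
    A, b = np.zeros((nx, nx)), f.copy()
    i = np.arange(1, nx - 1)
    A[i, i - 1] = -kh[:-1] / dx**2
    A[i, i] = (kh[:-1] + kh[1:]) / dx**2
    A[i, i + 1] = -kh[1:] / dx**2
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = 0.0                              # Dirichlet BCs
    return np.linalg.solve(A, b)

u_true = solve_pde(y_true)
iy, iu = np.array([10, 45, 80]), np.array([25, 55, 90])  # sparse obs points
y_obs, u_obs = y_true[iy], u_true[iu]

def residual(z):
    # PDE residual at interior points + boundary and data misfit terms.
    y, u = modes_y @ z[:8], modes_u @ z[8:]
    kh = 0.5 * (np.exp(y)[:-1] + np.exp(y)[1:])
    r_pde = -np.diff(kh * np.diff(u) / dx) / dx - f[1:-1]
    r_bc = np.array([u[0], u[-1]])
    r_dat = np.concatenate([y[iy] - y_obs, u[iu] - u_obs])
    return np.concatenate([r_pde, 10.0 * r_bc, 10.0 * r_dat])

sol = least_squares(residual, np.zeros(20))
print("max |y_hat - y_true|:", np.abs(modes_y @ sol.x[:8] - y_true).max())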
The Exoplanet Imaging Data Challenge is a community-wide effort meant to offer a platform for a fair and common comparison of image processing methods designed for exoplanet direct detection. For this purpose, it gathers on a dedicated repository (Zenodo) data from several high-contrast ground-based instruments worldwide into which we injected synthetic planetary signals. The data challenge is hosted on the CodaLab competition platform, where participants can upload their results. The specifications of the data challenge are published on our website. The first phase, launched on the 1st of September 2019 and closed on the 1st of October 2020, consisted of detecting point sources in two types of datasets common in the field of high-contrast imaging: data taken in pupil-tracking mode at one wavelength (subchallenge 1, also referred to as ADI) and multispectral data taken in pupil-tracking mode (subchallenge 2, also referred to as ADI+mSDI). In this paper, we describe the approach, organisational lessons learned, and current limitations of the data challenge, as well as preliminary results of the participants' submissions for this first phase. In the future, we plan to provide permanent access to the standard library of data sets and metrics, in order to guide the validation and support the publications of innovative image processing algorithms dedicated to high-contrast imaging of planetary systems.
R. L. Akeson, X. Chen, D. Ciardi, et al. (2013)
We describe the contents and functionality of the NASA Exoplanet Archive, a database and tool set funded by NASA to support astronomers in the exoplanet community. The current content of the database includes interactive tables containing properties of all published exoplanets, Kepler planet candidates, threshold-crossing events, data validation reports and target stellar parameters, light curves from the Kepler and CoRoT missions and from several ground-based surveys, and spectra and radial velocity measurements from the literature. Tools provided to work with these data include a transit ephemeris predictor, both for single planets and for observing locations, light curve viewing and normalization utilities, and a periodogram and phased light curve service. The archive can be accessed at http://exoplanetarchive.ipac.caltech.edu.
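For programmatic access, the archive also exposes a Table Access Protocol (TAP) service alongside the interactive tools. The snippet below is a hedged sketch: the endpoint, the "ps" (Planetary Systems) table, and the column names follow the archive's current documentation and should be verified at the URL above before relying on them.

import requests

TAP = "https://exoplanetarchive.ipac.caltech.edu/TAP/sync"
adql = ("select pl_name, pl_orbper, pl_rade "
        "from ps where disc_year > 2015 and default_flag = 1 "
        "order by pl_orbper")
resp = requests.get(TAP, params={"query": adql, "format": "csv"}, timeout=60)
resp.raise_for_status()
print("\n".join(resp.text.splitlines()[:5]))   # header row plus first few planets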
In Spring 2013, the LEECH (LBTI Exozodi Exoplanet Common Hunt) survey began its $\sim$130-night campaign from the Large Binocular Telescope (LBT) atop Mt. Graham, Arizona. This survey benefits from the many technological achievements of the LBT, including two 8.4-meter mirrors on a single fixed mount, dual adaptive secondary mirrors for high Strehl performance, and a cold beam combiner to dramatically reduce the telescope's overall background emissivity. LEECH neatly complements other high-contrast planet imaging efforts by observing stars at L$'$ (3.8 $\mu$m), as opposed to the shorter-wavelength near-infrared bands (1-2.4 $\mu$m) of other surveys. This portion of the spectrum offers deep mass sensitivity, especially around nearby adolescent ($\sim$0.1-1 Gyr) stars. LEECH's contrast is competitive with other extreme adaptive optics systems, while providing an alternative survey strategy. Additionally, LEECH is characterizing known exoplanetary systems with observations from 3-5 $\mu$m in preparation for JWST.
exoplanet is a toolkit for probabilistic modeling of astronomical time series data, with a focus on observations of exoplanets, using PyMC3 (Salvatier et al., 2016). PyMC3 is a flexible and high-performance model-building language and inference engine that scales well to problems with a large number of parameters. exoplanet extends PyMC3's modeling language to support many of the custom functions and probability distributions required when fitting exoplanet datasets or other astronomical time series. While it has been used for other applications, such as the study of stellar variability, the primary purpose of exoplanet is the characterization of exoplanets or multiple star systems using time-series photometry, astrometry, and/or radial velocity. In particular, the typical use case would be to use one or more of these datasets to place constraints on the physical and orbital parameters of the system, such as planet mass or orbital period, while simultaneously taking into account the effects of stellar variability.
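A minimal transit-fit sketch in the package's idiom is shown below. The class and distribution names (KeplerianOrbit, LimbDarkLightCurve, QuadLimbDark) follow the exoplanet/PyMC3 documentation of that era, though exact signatures have shifted between releases; the data, priors, and noise level are invented placeholders.

import numpy as np
import pymc3 as pm
import exoplanet as xo

# Placeholder data: a flat, noisy light curve standing in for real photometry.
t = np.linspace(-0.5, 0.5, 500)                        # time in days
yobs = 1e-3 * np.random.default_rng(2).standard_normal(t.size)

with pm.Model():
    period = pm.Normal("period", mu=3.5, sigma=0.01)   # days (illustrative prior)
    t0 = pm.Normal("t0", mu=0.0, sigma=0.01)           # transit mid-time
    r = pm.Uniform("r", lower=0.01, upper=0.2)         # planet/star radius ratio
    b = pm.Uniform("b", lower=0.0, upper=1.0)          # impact parameter
    u = xo.distributions.QuadLimbDark("u")             # limb-darkening prior

    orbit = xo.orbits.KeplerianOrbit(period=period, t0=t0, b=b)
    lc = xo.LimbDarkLightCurve(u).get_light_curve(orbit=orbit, r=r, t=t)
    pm.Normal("obs", mu=pm.math.sum(lc, axis=-1), sigma=1e-3, observed=yobs)

    trace = pm.sample(tune=1000, draws=1000, cores=2)

The same model structure extends to joint fits: additional pm.Normal likelihood terms for radial velocity or astrometry can share the orbit object, which is what lets the package constrain physical parameters from several datasets simultaneously.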