We describe a new metric that uses machine learning to determine whether a periodic signal found in a photometric time series appears to be shaped like the signature of a transiting exoplanet. The metric uses dimensionality reduction and k-nearest neighbors to determine whether a given signal is sufficiently similar to known transits in the same data set. It is used by the Kepler Robovetter to determine which signals should be part of the Q1-Q17 DR24 catalog of planetary candidates. The Kepler Mission reports roughly 20,000 potential transit signals with each run of its pipeline, yet only a few thousand appear sufficiently transit-shaped to be included in the catalog; the remaining signals tend to be variable stars and instrumental noise. With this metric we are able to remove more than 90% of the non-transiting signals while retaining more than 99% of the known planet candidates. When tested with injected transits, fewer than 1% are lost. This metric will enable the Kepler mission, and future missions looking for transiting planets, to rapidly and consistently find the best planetary candidates for follow-up and cataloging.
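The core idea, scoring a candidate by its distance to known transits in a reduced-dimension space, can be sketched as follows. This is an illustrative reconstruction using PCA and scikit-learn, with toy signals; the Robovetter metric's actual dimensionality-reduction technique, reference set, and tuning are not reproduced here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)

def box_transit(n=200, depth=0.01, width=20):
    """Toy folded light curve: flat baseline with a central box-shaped dip."""
    flux = np.ones(n)
    mid = n // 2
    flux[mid - width // 2 : mid + width // 2] -= depth
    return flux + rng.normal(0, 0.001, n)

def sinusoid(n=200):
    """Toy variable-star signal: smooth periodic modulation, no dip."""
    return 1 + 0.01 * np.sin(np.linspace(0, 4 * np.pi, n)) + rng.normal(0, 0.001, n)

# Known transits in the same data set define the reference sample.
known = np.array([box_transit() for _ in range(50)])
pca = PCA(n_components=5).fit(known)
nn = NearestNeighbors(n_neighbors=3).fit(pca.transform(known))

def transit_likeness(flux):
    """Mean distance to the nearest known transits in the reduced space;
    small values mean the signal looks transit-shaped."""
    d, _ = nn.kneighbors(pca.transform(flux[None, :]))
    return d.mean()

# A transit-shaped signal scores much lower (closer) than a sinusoid.
assert transit_likeness(box_transit()) < transit_likeness(sinusoid())
```

Thresholding this distance then separates transit-shaped signals from variable stars and noise.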
Deep learning techniques have been well explored in the transiting exoplanet field; however, previous work mainly focuses on classification and inspection. In this work, we develop a novel detection algorithm based on a well-proven object detection framework from the computer vision field. By training the network on the light curves of confirmed Kepler exoplanets, our model yields 94% precision and 95% recall for transits with signal-to-noise ratio higher than 6 (with the confidence threshold set to 0.6). With a slightly lower confidence threshold, recall can reach higher than 97%, which makes our model applicable to large-scale searches. We also transfer the trained model to TESS data and obtain similar performance. The results of our algorithm match human visual perception and make it easy to find single-transit candidates. Moreover, the parameters of the output bounding boxes can also help to find multiplanet systems. Our network and detection functions are implemented in the Deep-Transit toolkit, an open-source Python package hosted on GitHub and PyPI.
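The confidence-threshold step described above is standard object-detection post-processing: keep boxes above a score cutoff, then suppress overlapping duplicates. A minimal 1D sketch is below; the interval endpoints, candidate values, and thresholds are hypothetical, and Deep-Transit's actual post-processing may differ.

```python
def iou_1d(a, b):
    """Intersection-over-union of two 1D intervals (start, end)."""
    inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = (a[1] - a[0]) + (b[1] - b[0]) - inter
    return inter / union if union > 0 else 0.0

def detect(boxes, conf_threshold=0.6, iou_threshold=0.5):
    """boxes: list of (start, end, confidence) candidate transit windows.
    Drop boxes below the confidence threshold, then greedily keep the most
    confident box and suppress any later box that overlaps it too much."""
    kept = []
    for box in sorted((b for b in boxes if b[2] >= conf_threshold),
                      key=lambda b: -b[2]):
        if all(iou_1d(box[:2], k[:2]) < iou_threshold for k in kept):
            kept.append(box)
    return kept

# Two overlapping detections of one transit, plus a low-confidence candidate.
cands = [(10.0, 10.4, 0.95), (10.1, 10.5, 0.70), (25.0, 25.3, 0.40)]
print(detect(cands))  # -> [(10.0, 10.4, 0.95)]
```

Lowering `conf_threshold` admits more candidates (higher recall) at the cost of more false positives, which is the precision/recall trade-off quoted in the abstract.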
The Transiting Exoplanet Survey Satellite (TESS) has now been operational for a little over two years, covering the Northern and Southern hemispheres once each. The TESS team processes the downlinked data using the Science Processing Operations Center pipeline and the Quick Look pipeline to generate alerts for follow-up. Combined with other efforts from the community, over two thousand planet candidates have been found, of which tens have been confirmed as planets. We present our pipeline, Nigraha, which is complementary to these approaches. Nigraha uses a combination of transit finding, supervised machine learning, and detailed vetting to identify, with high confidence, a few planet candidates that were missed by prior searches. In particular, we identify high signal-to-noise ratio (SNR) shallow transits that may represent more Earth-like planets. In the spirit of open data exploration we provide details of our pipeline, release our supervised machine learning model and code as open source, and make public the 38 candidates we have found in seven sectors. The model can easily be run as-is on other sectors. As part of future work we outline ways to increase the yield by strengthening some of the steps where we have been conservative and discarded objects for lack of a datum or two.
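For context on the "high-SNR shallow transit" criterion, one common rough estimate of transit SNR compares the measured depth to the out-of-transit scatter, scaled by the number of in-transit points. The sketch below uses that generic estimate with made-up numbers; it is not necessarily Nigraha's definition.

```python
import numpy as np

def transit_snr(flux, in_transit):
    """Rough transit SNR: depth relative to out-of-transit scatter,
    boosted by sqrt(number of in-transit points)."""
    out = flux[~in_transit]
    depth = out.mean() - flux[in_transit].mean()
    return depth / out.std(ddof=1) * np.sqrt(in_transit.sum())

rng = np.random.default_rng(1)
n = 1000
flux = 1 + rng.normal(0, 1e-3, n)          # normalized flux, 1000 ppm scatter
in_transit = np.zeros(n, dtype=bool)
in_transit[480:520] = True
flux[in_transit] -= 5e-4                   # inject a shallow 500 ppm transit

print(f"SNR ~ {transit_snr(flux, in_transit):.1f}")
```

A transit this shallow only becomes detectable at high SNR when many points (or many transits) are folded together, which is why shallow-but-significant events are easy for simpler vetting cuts to discard.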
The photometric light curves of the BRITE satellites were examined with a machine learning technique to investigate whether exoplanets orbit nearby bright stars. Focusing on different transit periods, several convolutional neural networks were constructed to search for transit candidates. The convolutional neural networks were trained on synthetic transit signals combined with BRITE light curves until the accuracy rate exceeded 99.7%. Our method efficiently narrows the search to a small number of possible transit candidates. Among these ten candidates, two systems, HD 37465 and HD 186882, are prioritized for follow-up in future observations. The convolutional neural network code employed in this study is publicly available at http://www.phys.nthu.edu.tw/~jiang/BRITE2020YehJiangCNN.tar.gz.
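Training on "synthetic transit signals combined with BRITE light curves" amounts to injecting simulated transits into real photometry to build labeled examples. A minimal sketch of such an injection step is below; the box-shaped transit model, parameter values, and the Gaussian stand-in for a BRITE light curve are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def inject_box_transit(flux, period, duration, depth, t0=0.0):
    """Inject a box-shaped periodic transit (period/duration in samples,
    for simplicity) into an existing light curve; returns the modified
    flux and the in-transit mask used as the training label."""
    t = np.arange(flux.size, dtype=float)
    in_transit = (t - t0) % period < duration
    out = flux.copy()
    out[in_transit] *= 1 - depth
    return out, in_transit

# Stand-in for a real BRITE light curve (here: pure Gaussian noise).
baseline = 1 + rng.normal(0, 5e-4, 2000)
injected, mask = inject_box_transit(baseline, period=500, duration=25, depth=0.01)

# Pairs like (injected, 1) and (baseline, 0) form the CNN training set.
print(mask.sum())  # number of in-transit samples
```

Separate networks tuned to different injected periods then give the period-dependent search described in the abstract.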
Since the start of the Wide Angle Search for Planets (WASP) program, more than 160 transiting exoplanets have been discovered in the WASP data. In the past, possible transit-like events identified by the WASP pipeline have been vetted by human inspection to eliminate false alarms and obvious false positives. The goal of the present paper is to assess the effectiveness of machine learning as a fast, automated, and reliable means of performing the same functions on ground-based wide-field transit-survey data without human intervention. To this end, we have created training and test datasets made up of stellar light curves showing a variety of signal types including planetary transits, eclipsing binaries, variable stars, and non-periodic signals. We use a combination of machine learning methods including Random Forest Classifiers (RFCs) and Convolutional Neural Networks (CNNs) to distinguish between the different types of signals. The final algorithms correctly identify planets in the test data ~90% of the time, although each method on its own has a significant fraction of false positives. We find that in practice, a combination of different methods offers the best approach to identifying the most promising exoplanet transit candidates in data from WASP, and by extension similar transit surveys.
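Combining methods so that candidates must score well under more than one classifier is what trims each method's individual false positives. The sketch below averages the planet probabilities of a Random Forest and a second classifier; the features, data, and the use of logistic regression as a stand-in for the CNN branch are all illustrative assumptions, not the WASP pipeline's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy feature vectors (e.g. depth, duration, shape metric) for two classes:
# 0 = non-planet (false alarms, EBs), 1 = planet-like.
X = np.vstack([rng.normal(0.0, 1.0, (200, 3)), rng.normal(1.5, 1.0, (200, 3))])
y = np.repeat([0, 1], 200)

rfc = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
lr = LogisticRegression().fit(X, y)  # stand-in for the CNN branch

def ensemble_planet_prob(x):
    """Average the two methods' planet probabilities; a candidate must
    look planet-like to both to end up with a high combined score."""
    x = np.atleast_2d(x)
    return 0.5 * (rfc.predict_proba(x)[:, 1] + lr.predict_proba(x)[:, 1])

print(ensemble_planet_prob([2.0, 2.0, 2.0]))   # clearly planet-like region
print(ensemble_planet_prob([-1.0, -1.0, -1.0]))  # clearly non-planet region
```

Ranking candidates by the combined score, rather than either score alone, is one simple way to realize the "combination of different methods" the abstract finds most effective.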
We present a novel, iterative method using an empirical Bayesian approach for modeling the limb-darkened WASP-121b transit from the TESS light curve. Our method is motivated by the need to improve $R_p/R_\ast$ estimates for exoplanet atmosphere modeling, and is particularly effective with the quadratic limb darkening (LD) law, requiring no prior central value from stellar atmospheric models. With the non-linear LD law, the method retains the advantage of not needing atmospheric models but does not converge. The iterative method gives a different $R_p/R_\ast$ for WASP-121b at a significance level of $1\sigma$ when compared with existing non-iterative methods. To assess the origins and implications of this difference, we generate and analyze light curves with known values of the limb darkening coefficients (LDCs). We find that non-iterative modeling with LDC priors from stellar atmospheric models results in an inconsistent $R_p/R_\ast$ at the $1.5\sigma$ level when the known LDC values are those previously found by the iterative method when modeling real data. In contrast, the LDC values from the iterative modeling yield the correct value of $R_p/R_\ast$ to within $0.25\sigma$. For more general cases with different known inputs, Monte Carlo simulations show that the iterative method obtains unbiased LDCs and a correct $R_p/R_\ast$ to within a significance level of $0.3\sigma$. Biased LDC priors can cause biased LDC posteriors and lead to a bias in $R_p/R_\ast$ of up to 0.82% ($2.5\sigma$) for the quadratic law and 0.32% ($1.0\sigma$) for the non-linear law. Our improvement in $R_p/R_\ast$ estimation is important when analyzing exoplanet atmospheres.
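For reference, the two limb-darkening laws compared above are the standard quadratic law and the four-parameter non-linear (Claret) law, which give the normalized stellar intensity as a function of $\mu = \cos\theta$ (angle from disk center). The sketch below evaluates both laws with example coefficient values; the paper's iterative empirical-Bayes fitting procedure itself is not reproduced.

```python
import numpy as np

def quadratic_ld(mu, u1, u2):
    """Quadratic limb-darkening law:
    I(mu)/I(1) = 1 - u1*(1 - mu) - u2*(1 - mu)**2."""
    return 1 - u1 * (1 - mu) - u2 * (1 - mu) ** 2

def nonlinear_ld(mu, c1, c2, c3, c4):
    """Four-parameter non-linear (Claret) law:
    I(mu)/I(1) = 1 - sum_k c_k * (1 - mu**(k/2)), k = 1..4."""
    return 1 - sum(c * (1 - mu ** (k / 2))
                   for k, c in zip((1, 2, 3, 4), (c1, c2, c3, c4)))

# Example coefficients (illustrative, not WASP-121's fitted values).
print(quadratic_ld(1.0, 0.3, 0.2))  # disk center: intensity normalized to 1
print(quadratic_ld(0.0, 0.3, 0.2))  # limb: 1 - 0.3 - 0.2 = 0.5
print(nonlinear_ld(np.linspace(0, 1, 5), 0.1, 0.2, 0.3, 0.4))
```

Because the transit depth near ingress/egress depends on these coefficients, a biased LDC prior propagates directly into a biased $R_p/R_\ast$, which is the effect quantified in the abstract.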