We present a framework for the analysis of direct detection planet finding missions using space telescopes. This framework generates simulations of complete missions, with varying populations of planets, to produce ensembles of mission simulations, which are used to calculate distributions of mission science yields. We describe the components of a mission simulation, including the complete description of an arbitrary planetary system, the description of a planet finding instrument, and the modeling of a target system observation. These components are coupled with a decision modeling algorithm, which allows us to automatically generate mission timelines from simple mission rules that lead to an optimized science yield. Along with the details of our implementation of this algorithm, we discuss validation techniques and possible future refinements. We apply this analysis technique to four mission concepts whose common element is a 4 m diameter telescope aperture: an internal pupil mapping coronagraph with two different inner working angles, an external occulter, and the THEIA XPC multiple distance occulter. The focus of this study is to determine the ability of each of these designs to achieve one of their most difficult mission goals: the detection and characterization of Earth-like planets in the habitable zone. We find that all four designs are capable of detecting on the order of 5 Earth-like planets within a 5 year mission, even if we assume that only 1 out of every 10 stars has such a planet. The designs differ significantly, however, in their ability to characterize the planets they find. Along with science yield, we also analyze fuel usage for the two occulter designs, and discuss the strengths and weaknesses of each of the mission concepts.
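To make the ensemble idea concrete, the following minimal Python sketch simulates many missions and tallies the resulting distribution of Earth-like detections. The occurrence rate of 0.1 reflects the 1-in-10 assumption quoted above; the target count and per-target completeness are hypothetical placeholders, not the paper's instrument models.

```python
import random

# Minimal sketch of a Monte Carlo mission ensemble. ETA_EARTH follows the
# 1-in-10 assumption quoted above; N_TARGETS and COMPLETENESS are
# hypothetical placeholders, not values from the paper.
ETA_EARTH = 0.1      # assumed fraction of stars hosting an Earth-like planet
N_TARGETS = 200      # hypothetical number of stars observed per mission
COMPLETENESS = 0.25  # hypothetical chance of detecting a planet that is present
N_MISSIONS = 10_000  # number of simulated missions in the ensemble

def simulate_mission() -> int:
    """Count Earth-like planets detected in one simulated mission."""
    detections = 0
    for _ in range(N_TARGETS):
        if random.random() < ETA_EARTH and random.random() < COMPLETENESS:
            detections += 1
    return detections

yields = [simulate_mission() for _ in range(N_MISSIONS)]
print(f"mean science yield: {sum(yields) / N_MISSIONS:.2f} planets")
```

In a real pipeline the per-target completeness would come from the instrument model and the decision algorithm would set the observation schedule; the point here is only that each simulated mission yields one number, and the ensemble yields a distribution.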
Like modern computers, next-generation radial velocity instruments will be significantly smaller and more powerful than their predecessors.
In this short paper, we study the photometric precision of stellar light curves obtained by the CoRoT satellite in its planet finding channel, with a particular emphasis on the timescales characteristic of planetary transits. Together with other articles in the same issue of this journal, it forms an attempt to provide the building blocks for a statistical interpretation of the CoRoT planet and eclipsing binary catch to date. After pre-processing the light curves so as to minimise long-term variations and outliers, we measure the scatter of the light curves in the first three CoRoT runs lasting more than 1 month, using an iterative non-linear filter to isolate signal on the timescales of interest. The behaviour of the noise on 2 h timescales is well described by a power law with index 0.25 in R-magnitude, ranging from 0.1 mmag at R=11.5 to 1 mmag at R=16, which is close to the pre-launch specification, though still a factor of 2-3 above the photon noise due to residual jitter noise and hot pixel events. There is evidence for a slight degradation of the performance over time. We find clear evidence for enhanced variability on timescales of hours (at the level of 0.5 mmag) in stars identified as likely giants from their R-magnitude and B-V colour, which represent approximately 60% and 20% of the observed population in the directions of Aquila and Monoceros, respectively. On the other hand, median correlated noise levels over 2 h for dwarf stars are extremely low, reaching 0.05 mmag at the bright end.
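The paper's filter parameters are not reproduced here; the Python sketch below shows one common form of iterative non-linear filter (a median filter followed by a boxcar, with iterative outlier replacement) of the general kind described above, using illustrative window sizes and clipping thresholds.

```python
import numpy as np
from scipy.ndimage import median_filter, uniform_filter1d

def iterative_nonlinear_filter(flux, window=25, n_iter=3, clip=3.0):
    """Estimate the smooth trend of a 1-D light curve, ignoring outliers.

    window, n_iter, and clip are illustrative choices, not the paper's values.
    """
    work = np.asarray(flux, dtype=float).copy()
    for _ in range(n_iter):
        # Median filter suppresses outliers; the boxcar smooths the result.
        smooth = uniform_filter1d(median_filter(work, size=window), size=window)
        resid = work - smooth
        sigma = 1.4826 * np.median(np.abs(resid))  # robust scatter estimate
        bad = np.abs(resid) > clip * sigma
        work[bad] = smooth[bad]  # replace outliers before the next pass
    return smooth
```

Subtracting the returned trend isolates signal on short timescales; the scatter of the residuals, binned to 2 h, approximates the noise levels discussed above.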
We present htof, an open-source tool for interpreting and fitting the intermediate astrometric data (IAD) from both the 1997 and 2007 reductions of Hipparcos, the scanning law of Gaia, and future missions such as the Nancy Grace Roman Space Telescope (NGRST). htof solves for the astrometric parameters of any system for any arbitrary combination of absolute astrometric missions. In preparation for later Gaia data releases, htof supports arbitrarily high-order astrometric solutions (e.g. five-, seven-, and nine-parameter fits). Using htof, we find that the IAD of 6617 sources in Hipparcos 2007 might have been affected by a data corruption issue. htof integrates an ad-hoc correction that reconciles the IAD of these sources with their published catalog solutions. We developed htof to study the masses and orbital parameters of sub-stellar companions, and we outline its implementation in one orbit fitting code (orvara, https://github.com/t-brandt/orvara). We use htof to predict a range of hypothetical additional planets in the β Pic system, which could be detected by coupling NGRST astrometry with that of Gaia and Hipparcos. htof is pip installable and available at https://github.com/gmbrandt/htof.
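For context on what fitting the astrometric parameters entails, here is a schematic five-parameter least-squares solve written from scratch in Python; it illustrates the underlying linear model only and is not htof's actual API (see the repository linked above for real usage).

```python
import numpy as np

def fit_five_parameter(t, plx_factor_ra, plx_factor_dec, ra, dec):
    """Schematic five-parameter astrometric fit (illustration, not htof's API).

    Solves least-squares for (ra0, dec0, pm_ra, pm_dec, parallax) given
    epochs t, per-epoch parallax factors, and position measurements.
    """
    n = len(t)
    A = np.zeros((2 * n, 5))   # stack the RA and Dec equations: A @ x = b
    A[:n, 0] = 1.0             # RA zero point
    A[:n, 2] = t               # RA proper motion
    A[:n, 4] = plx_factor_ra   # parallax contribution to RA
    A[n:, 1] = 1.0             # Dec zero point
    A[n:, 3] = t               # Dec proper motion
    A[n:, 4] = plx_factor_dec  # parallax contribution to Dec
    b = np.concatenate([ra, dec])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # ra0, dec0, pm_ra, pm_dec, parallax
```

Higher-order seven- and nine-parameter solutions of the kind mentioned above add acceleration terms (and their time derivatives) as extra columns of the same design matrix.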
This paper describes the new QuickFind method in LcTools for finding signals and associated TTVs (Transit Timing Variations) in light curves from NASA space missions. QuickFind is adept at finding medium- to large-sized signals (generally those with S/N ratios above 15) extremely fast, significantly reducing the overall processing time for a light curve compared to the BLS detection method. For example, on the lead author's computer, QuickFind detected both KOI signals for star 10937029 in a 14-quarter Kepler light curve spanning 1,459 days in roughly 2 seconds, whereas BLS took about 155 seconds to find both signals, making QuickFind about 77 times faster than BLS in this example. This paper focuses on the user interfaces, data processing algorithm, and performance tests for the QuickFind method in LcTools.
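QuickFind's actual algorithm is detailed in the paper itself; purely to illustrate why a direct, single-pass scan for deep events can be far cheaper than a BLS period search, here is a hypothetical dip finder in Python.

```python
import numpy as np

def find_dips(time, flux, sigma_thresh=5.0):
    """Hypothetical single-pass dip search (not the actual QuickFind algorithm).

    Flags contiguous runs of points more than sigma_thresh robust sigmas
    below the median flux and returns their (start, end) times.
    """
    med = np.median(flux)
    sigma = 1.4826 * np.median(np.abs(flux - med))  # robust scatter
    in_dip = flux < med - sigma_thresh * sigma
    events, start = [], None
    for i, flagged in enumerate(in_dip):
        if flagged and start is None:
            start = i                                  # dip begins
        elif not flagged and start is not None:
            events.append((time[start], time[i - 1]))  # dip ends
            start = None
    if start is not None:
        events.append((time[start], time[-1]))
    return events
```

A single pass like this is linear in the number of data points, whereas BLS must trial many periods, phases, and durations, which is consistent with the large speed difference reported above for high-S/N signals.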
Since 2009, the Kepler, K2, and TESS missions have produced a vast number of lightcurves for public use. To assist citizen scientists in processing those lightcurves, the LcTools software system was developed. The system provides a set of tools to efficiently search for signals of interest in large sets of lightcurves using automated and manual (visual) techniques. At the heart of the system is a multipurpose lightcurve viewer and signal processor with advanced navigation and display capabilities to facilitate the search for signals. Other applications in the system are available for building lightcurve files in bulk, finding periodic signals automatically, and generating signal reports. This paper describes each application in the system and the methods by which the software can be used to detect and record signals. The software is free and can be obtained from the lead author by request.