
Overview of the Kepler Science Processing Pipeline

Published by Jon Jenkins
Publication date: 2010
Research field: Physics
Paper language: English





The Kepler Mission Science Operations Center (SOC) performs several critical functions including managing the ~156,000 target stars, associated target tables, science data compression tables and parameters, as well as processing the raw photometric data downlinked from the spacecraft each month. The raw data are first calibrated at the pixel level to correct for bias, smear induced by a shutterless readout, and other detector and electronic effects. A background sky flux is estimated from ~4500 pixels on each of the 84 CCD readout channels, and simple aperture photometry is performed on an optimal aperture for each star. Ancillary engineering data and diagnostic information extracted from the science data are used to remove systematic errors in the flux time series that are correlated with these data prior to searching for signatures of transiting planets with a wavelet-based, adaptive matched filter. Stars with signatures exceeding 7.1 sigma are subjected to a suite of statistical tests, including an examination of each star's centroid motion, to reject false positives caused by background eclipsing binaries. Physical parameters for each planetary candidate are fitted to the transit signature, and signatures of additional transiting planets are sought in the residual light curve. The pipeline is operational, finding planetary signatures and providing robust elimination of false positives.
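
To make the photometry step concrete, here is a minimal sketch of simple aperture photometry with background subtraction on a synthetic pixel stamp. It is not the SOC's actual algorithm; the function name and the toy data are hypothetical, chosen only to illustrate the technique the abstract describes.

```python
import numpy as np

def aperture_photometry(cadences, aperture_mask, background_mask):
    """Per-cadence simple aperture photometry with background subtraction.

    cadences:        (n_cadences, ny, nx) calibrated pixel values
    aperture_mask:   (ny, nx) bool, True for pixels in the optimal aperture
    background_mask: (ny, nx) bool, True for background sky pixels
    """
    # Estimate the sky level per cadence from the background pixels;
    # the median is robust against faint contaminating stars.
    sky = np.median(cadences[:, background_mask], axis=1)
    # Sum background-subtracted flux over the optimal aperture.
    flux_in_aperture = cadences[:, aperture_mask].sum(axis=1)
    return flux_in_aperture - sky * aperture_mask.sum()

# Toy usage: a 10x10 stamp with a Gaussian star, observed for 100 cadences.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[:10, :10]
r2 = (yy - 5) ** 2 + (xx - 5) ** 2
stamp = 1000.0 * np.exp(-r2 / 2.0) + 50.0   # star plus constant sky level
cadences = rng.poisson(stamp, size=(100, 10, 10)).astype(float)

light_curve = aperture_photometry(cadences, aperture_mask=r2 <= 4,
                                  background_mask=r2 > 16)
```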


Read also

Kepler's primary mission is a search for earth-size exoplanets in the habitable zone of late-type stars using the transit method. To effectively accomplish this mission, Kepler orbits the Sun and stares nearly continuously at one field of view, which was carefully selected to provide an appropriate density of target stars. The data transmission rates, operational cycles, and target management requirements implied by this mission design have been optimized and integrated into a comprehensive plan for science operations. The commissioning phase completed all critical tasks and accomplished all objectives within a week of the pre-launch plan. Since the start of science operations, the nominal data collection timeline has been interrupted by two safe-mode events, several losses of fine point, and some small pointing adjustments. The most important anomalies are understood and mitigated, so Kepler's technical performance metrics have improved significantly over this period and the prognosis for mission success is excellent. The Kepler data archive is established and hosts data for the science team, guest observers, and the public. The first data sets to become publicly available include the monthly full-frame images, dropped targets, and individual sources as they are published. Data are released through the archive on a quarterly basis; the Kepler Results Catalog will be released annually starting in 2011.
Numerous telescopes and techniques have been used to find and study extrasolar planets, but none has been more successful than NASA's Kepler Space Telescope. Kepler has discovered the majority of known exoplanets, the smallest planets to orbit normal stars, and the worlds most likely to be similar to our home planet. Most importantly, Kepler has provided our first look at the typical characteristics of planets and planetary systems for planets with sizes as small as, and orbits as large as, those of the Earth.
The Stratospheric Observatory for Infrared Astronomy (SOFIA) is an airborne astronomical observatory comprising a 2.5-meter telescope mounted in the aft section of a Boeing 747SP aircraft. During routine operations, several instruments will be available to the astronomical community, including cameras and spectrographs in the near- to far-IR. Raw data obtained in-flight require a significant amount of processing to correct for background emission (from both the telescope and the atmosphere), remove instrumental artifacts, correct for atmospheric absorption, and apply both wavelength and flux calibration. In general, this processing is highly specific to the instrument and telescope. In order to maximize the scientific output of the observatory, the SOFIA Science Center must provide these post-processed data sets to Guest Investigators in a timely manner. To meet this requirement, we have designed and built the SOFIA Data Processing System (DPS): an in-house set of tools and services that can be used in both automatic (pipeline) and manual modes to process data from a variety of instruments. Here we present an overview of the DPS concepts and architecture, as well as operational results from the first two SOFIA observing cycles (2013--2014).
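
The stage-by-stage corrections described above suggest a composable-pipeline structure. The sketch below illustrates that idea under heavy simplifications (a constant sky pedestal, a constant atmospheric transmission); it is not the DPS API, and all names and values are illustrative assumptions.

```python
from typing import Callable
import numpy as np

# A stage maps one 2-D frame to a corrected 2-D frame, so a reduction
# pipeline is just an ordered list of stages applied in sequence.
Stage = Callable[[np.ndarray], np.ndarray]

def subtract_background(frame: np.ndarray) -> np.ndarray:
    # Telescope + atmosphere emission modeled as a constant pedestal,
    # estimated here from the frame median.
    return frame - np.median(frame)

def correct_atmosphere(frame: np.ndarray) -> np.ndarray:
    # Divide out a (here constant, assumed) atmospheric transmission.
    return frame / 0.8

def flux_calibrate(frame: np.ndarray) -> np.ndarray:
    # Convert instrument counts to physical flux units (assumed factor).
    return frame / 1.0e4

def run_pipeline(frame: np.ndarray, stages: list[Stage]) -> np.ndarray:
    for stage in stages:
        frame = stage(frame)
    return frame

raw = np.random.default_rng(1).normal(100.0, 5.0, size=(64, 64))
reduced = run_pipeline(raw, [subtract_background, correct_atmosphere, flux_calibrate])
```

The same ordered-list-of-stages structure supports both the automatic (pipeline) and manual modes the abstract mentions: an operator can re-run a single stage by passing a one-element list.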
In this work we empirically measure the detection efficiency of the Kepler pipeline used to create the final Kepler Threshold Crossing Event (TCE; Twicken et al. 2016) and planet candidate catalogs (Thompson et al. 2018), a necessary ingredient for occurrence rate calculations using these lists. By injecting simulated signals into the calibrated pixel data and processing those pixels through the pipeline as normal, we quantify the detection probability of signals as a function of their signal strength and orbital period. In addition, we investigate the dependence of the detection efficiency on parameters of the target stars and their location in the Kepler field of view. We find that the end-of-mission version of the Kepler pipeline returns to a high overall detection efficiency, averaging a 90-95% rate of detection for strong signals across a wide variety of parameter space. We find a weak dependence of the detection efficiency on the number of transits contributing to the signal and the orbital period of the signal, and a stronger dependence on the stellar effective temperature and correlated noise properties. We also find a weak dependence of the detection efficiency on the position within the field of view. By restricting the Kepler stellar sample to stars with well-behaved correlated noise properties, we can define a set of stars with high detection efficiency for future occurrence rate calculations.
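
The injection-and-recovery measurement described above can be illustrated with a short sketch: bin the injected signals by strength and compute the recovered fraction per bin. The toy detection model below (a cumulative-Gaussian ramp through the 7.1-sigma threshold) is an assumption for illustration, not the paper's actual machinery.

```python
from math import erf
import numpy as np

def detection_efficiency(strength, recovered, bins):
    """Recovered fraction of injected signals, binned by signal strength."""
    idx = np.digitize(strength, bins) - 1
    eff = np.full(len(bins) - 1, np.nan)
    for i in range(len(bins) - 1):
        in_bin = idx == i
        if in_bin.any():
            eff[i] = recovered[in_bin].mean()
    return eff

# Toy injection set: recovery probability rises through the 7.1-sigma
# detection threshold, roughly like a cumulative Gaussian.
rng = np.random.default_rng(2)
mes = rng.uniform(4.0, 20.0, size=5000)     # injected signal strengths
p_detect = np.array([0.5 * (1.0 + erf((m - 7.1) / (2**0.5 * 1.5))) for m in mes])
recovered = rng.random(mes.size) < p_detect  # simulated pipeline outcome

efficiency = detection_efficiency(mes, recovered, bins=np.arange(4.0, 20.5, 1.0))
```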
The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed in the time domain component nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.
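
As a hedged illustration of the nightly difference-imaging step, the sketch below subtracts a template frame from a science frame and flags pixels above a detection threshold. Real survey pipelines also align the frames, match the PSFs, and photometrically scale the template; those steps are omitted here, and all names are hypothetical.

```python
import numpy as np

def difference_image(science, template, threshold_sigma=5.0):
    """Subtract a template frame and flag candidate transient pixels.

    Assumes the frames are already astrometrically aligned and
    photometrically scaled to each other.
    """
    diff = science - template
    # Robust noise estimate from the median absolute deviation.
    sigma = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    candidates = np.argwhere(diff > threshold_sigma * sigma)
    return diff, candidates

rng = np.random.default_rng(3)
template = rng.normal(100.0, 3.0, size=(128, 128))
science = template + rng.normal(0.0, 3.0, size=(128, 128))
science[40, 60] += 50.0                  # injected transient source
diff, candidates = difference_image(science, template)  # candidates ~ [[40, 60]]
```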