We present results from applying the SNAD anomaly detection pipeline to the third public data release of the Zwicky Transient Facility (ZTF DR3). The pipeline is composed of three stages: feature extraction, search for outliers with machine learning algorithms, and anomaly identification with follow-up by human experts. Our analysis concentrates on three ZTF fields, comprising more than 2.25 million objects. A set of four machine learning algorithms was used to identify 277 outliers, which were subsequently scrutinised by an expert. Of these, 188 (68%) were found to be bogus light curves, including artefacts of the image-subtraction pipeline as well as the overlap between a star and a known asteroid; 66 (24%) were previously reported sources; and 23 (8%) correspond to non-catalogued objects, with the latter two classes being of potential scientific interest (e.g. 1 spectroscopically confirmed RS Canum Venaticorum star, 4 supernova candidates, 1 red dwarf flare). Moreover, using results from the expert analysis, we identified a simple bi-dimensional relation that can aid in filtering potentially bogus light curves in future studies. We provide a complete list of objects of potential scientific interest so they can be further scrutinised by the community. These results confirm the importance of combining automatic machine learning algorithms with domain knowledge in the construction of recommendation systems for astronomy. Our code is publicly available at https://github.com/snad-space/zwad.
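To make the three pipeline stages concrete, the sketch below extracts a toy feature vector from each light curve and scores outliers with an Isolation Forest, one of the algorithm families commonly used for this task. The features, model parameters, and synthetic data are illustrative assumptions, not the actual zwad configuration; the real pipeline uses a much richer feature set.

```python
# A minimal sketch of the three-stage pipeline: extract simple light-curve
# features, score outliers with Isolation Forest, and flag the top
# candidates for expert inspection. All choices here are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

def extract_features(mag, magerr):
    """Toy feature vector for one light curve: amplitude, scatter, skew, mean error."""
    amplitude = 0.5 * (np.max(mag) - np.min(mag))
    std = np.std(mag)
    skew = np.mean(((mag - np.mean(mag)) / (std + 1e-9)) ** 3)
    return np.array([amplitude, std, skew, np.mean(magerr)])

def find_outlier_candidates(light_curves, n_candidates=10):
    """Score all light curves; return indices of the most anomalous ones."""
    X = np.vstack([extract_features(m, e) for m, e in light_curves])
    forest = IsolationForest(n_estimators=300, random_state=0)
    scores = forest.fit(X).score_samples(X)   # lower score = more anomalous
    return np.argsort(scores)[:n_candidates]  # hand these to a human expert

# Synthetic example: 1000 ordinary light curves plus one flare-like curve.
rng = np.random.default_rng(0)
curves = [(rng.normal(18.0, 0.05, 100), np.full(100, 0.05)) for _ in range(1000)]
flare = rng.normal(18.0, 0.05, 100)
flare[50:55] -= 2.0  # short brightening event
curves.append((flare, np.full(100, 0.05)))
print(find_outlier_candidates(curves))  # the flare (index 1000) should rank high
```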
The Zwicky Transient Facility (ZTF), a public-private enterprise, is a new time-domain survey employing a dedicated camera on the Palomar 48-inch Schmidt telescope with a 47 deg$^2$ field of view and an 8 second readout time. It is well positioned in the development of time-domain astronomy, offering operations at 10% of the scale and style of the Large Synoptic Survey Telescope (LSST) with a single 1-m class survey telescope. The public surveys will cover the observable northern sky every three nights in the $g$ and $r$ filters and the visible Galactic plane every night in the same bands. Alerts generated by these surveys are sent in real time to brokers. A consortium of universities which provided funding (the partnership) is undertaking several boutique surveys. The combination of these surveys, producing one million alerts per night, allows for exploration of transient and variable astrophysical phenomena brighter than $r \sim 20.5$ on timescales of minutes to years. We describe the primary science objectives driving ZTF, including the physics of supernovae and relativistic explosions, multi-messenger astrophysics, supernova cosmology, active galactic nuclei and tidal disruption events, stellar variability, and Solar System objects.
The Zwicky Transient Facility (ZTF) Observing System (OS) is the data collector for the ZTF project to study astrophysical phenomena in the time domain. The ZTF OS is built around the 48-inch-aperture, Schmidt-design Samuel Oschin Telescope at the Palomar Observatory in Southern California. It incorporates new telescope aspheric corrector optics, dome and telescope drives, a large-format exposure shutter, a flat-field illumination system, a robotic bandpass filter exchanger, and the key element: a new 47-square-degree, 600-megapixel cryogenic CCD mosaic science camera, along with supporting equipment. The OS collects and delivers digitized survey data to the ZTF Data System (DS). Here, we describe the ZTF OS design, optical implementation, delivered image quality, detector performance, and robotic survey efficiency.
The Zwicky Transient Facility (ZTF) survey generates real-time alerts for optical transients, variables, and moving objects discovered in its wide-field survey. We describe the ZTF alert stream distribution and processing (filtering) system. The system uses existing open-source technologies developed in industry: Kafka, a real-time streaming platform, and Avro, a binary serialization format. The technologies used in this system provide a number of advantages for the ZTF use case, including (1) built-in replication, scalability, and stream rewind for the distribution mechanism; (2) structured messages with strictly enforced schemas and dynamic typing for fast parsing; and (3) a Python-based stream processing interface that is similar to batch processing, providing a familiar and user-friendly plug-in filter system, all in a modular, primarily containerized system. The production deployment has successfully supported streaming up to 1.2 million alerts, or roughly 70 GB of data, per night, with each alert available to a consumer within about 10 s of alert candidate production. Data transfer rates of about 80,000 alerts/minute have been observed. In this paper, we discuss this alert distribution and processing system, the design motivations for the technology choices for the framework, performance in production, and how this system may be generally suitable for other alert stream use cases, including the upcoming Large Synoptic Survey Telescope.
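As an illustration of how a downstream consumer might read such a stream, the sketch below pairs the confluent-kafka Python client with fastavro. The broker address, topic name, and magnitude cut are placeholder assumptions, and a real ZTF alert packet carries many more fields than are unpacked here; the decoding pattern simply reflects the Kafka-plus-Avro design described above.

```python
# A minimal sketch of a downstream alert consumer: Kafka for transport,
# Avro for deserialization. Broker, topic, and filter are placeholders.
import io
from confluent_kafka import Consumer
from fastavro import reader

consumer = Consumer({
    "bootstrap.servers": "kafka.example.org:9092",  # placeholder broker
    "group.id": "my-filter",
    "auto.offset.reset": "earliest",  # stream rewind: replay from the start
})
consumer.subscribe(["ztf-alerts-example"])  # placeholder topic name

while True:
    msg = consumer.poll(timeout=1.0)
    if msg is None or msg.error():
        continue
    # Each message carries an Avro-serialized alert packet; with the schema
    # embedded in the payload, fastavro can decode it directly.
    for alert in reader(io.BytesIO(msg.value())):
        candidate = alert.get("candidate", {})
        # A trivial batch-style filter: keep only bright alerts.
        if candidate.get("magpsf", 99.0) < 19.0:
            print(alert.get("objectId"), candidate.get("magpsf"))
```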
We present a novel algorithm for scheduling the observations of time-domain imaging surveys. Our Integer Linear Programming approach optimizes an observing plan for an entire night by assigning targets to temporal blocks, enabling strict control of the number of exposures obtained per field and minimizing filter changes. A subsequent optimization step minimizes slew times between each observation. Our optimization metric self-consistently weights contributions from time-varying airmass, seeing, and sky brightness to maximize the transient discovery rate. We describe the implementation of this algorithm on the surveys of the Zwicky Transient Facility and present its on-sky performance.
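To make the block-assignment idea concrete, the sketch below solves a toy version of the problem with the PuLP modelling library: binary variables assign fields to temporal blocks, the objective sums a hypothetical per-(field, block) metric standing in for the airmass-, seeing-, and sky-brightness-weighted survey speed, and a constraint caps exposures per field. The metric values, block structure, and cap are invented for illustration; the production scheduler's objective and constraint set are richer, and a second optimization pass would reorder observations within blocks to minimize slew times.

```python
# A toy version of the block-based ILP scheduler, using PuLP.
import pulp

fields = ["f1", "f2", "f3"]
blocks = ["b1", "b2", "b3", "b4"]
value = {  # hypothetical per-(field, block) metric values
    ("f1", "b1"): 0.9, ("f1", "b2"): 0.7, ("f1", "b3"): 0.3, ("f1", "b4"): 0.2,
    ("f2", "b1"): 0.4, ("f2", "b2"): 0.8, ("f2", "b3"): 0.9, ("f2", "b4"): 0.5,
    ("f3", "b1"): 0.2, ("f3", "b2"): 0.4, ("f3", "b3"): 0.6, ("f3", "b4"): 0.9,
}
max_visits = 2  # strict cap on exposures per field per night

prob = pulp.LpProblem("night_plan", pulp.LpMaximize)
x = pulp.LpVariable.dicts("x", (fields, blocks), cat="Binary")

# Objective: maximize the total metric value of the assignment.
prob += pulp.lpSum(value[f, b] * x[f][b] for f in fields for b in blocks)
# Each temporal block holds exactly one field observation.
for b in blocks:
    prob += pulp.lpSum(x[f][b] for f in fields) == 1
# Each field receives at most max_visits exposures.
for f in fields:
    prob += pulp.lpSum(x[f][b] for b in blocks) <= max_visits

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for b in blocks:
    chosen = next(f for f in fields if pulp.value(x[f][b]) > 0.5)
    print(b, "->", chosen)
```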
The Zwicky Transient Facility is a new robotic observing program in which a newly engineered 600-megapixel digital camera with an exceptionally large field of view, 47~square degrees, will be installed into the 48-inch Samuel Oschin Telescope at the Palomar Observatory. The camera will generate $\sim 1$~petabyte of raw image data over three years of operations. In parallel related work, new hardware and software systems are being developed to process these data in real time and build a long-term archive for the processed products. The first public release of archived products is planned for early 2019, and will include processed images and astronomical-source catalogs of the northern sky in the $g$ and $r$ bands. Source catalogs based on two different methods will be generated for the archive: aperture photometry and point-spread-function fitting.
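As a rough illustration of the first of these two catalog methods, the sketch below measures aperture photometry on a synthetic image with the community photutils package rather than the actual ZTF pipeline code; the image, source position, and aperture radius are made up for the example.

```python
# A minimal aperture-photometry sketch with photutils; data are synthetic.
import numpy as np
from photutils.aperture import CircularAperture, aperture_photometry

# Synthetic image: flat background plus one Gaussian point source.
yy, xx = np.mgrid[0:64, 0:64]
image = 10.0 + 500.0 * np.exp(-((xx - 32) ** 2 + (yy - 32) ** 2) / (2 * 2.0**2))

# Sum the flux inside a fixed circular aperture at the source position.
apertures = CircularAperture([(32.0, 32.0)], r=5.0)
table = aperture_photometry(image, apertures)
print(table["aperture_sum"])  # background not subtracted in this toy example

# PSF-fit photometry, the second catalog method, would instead fit a
# point-spread-function model at each position, which behaves better for
# faint and blended sources.
```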