
Considerations for optimizing photometric classification of supernovae from the Rubin Observatory

Posted by: Catarina Sampaio Alves
Publication date: 2021
Research field: Physics
Paper language: English





Survey telescopes such as the Vera C. Rubin Observatory will increase the number of observed supernovae (SNe) by an order of magnitude, discovering millions of events; however, it is impossible to spectroscopically confirm the class of all the SNe discovered. Thus, photometric classification is crucial, but its accuracy depends on the not-yet-finalized observing strategy of the Rubin Observatory's Legacy Survey of Space and Time (LSST). We quantitatively analyze the impact of the LSST observing strategy on SN classification using the simulated multi-band light curves from the Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC). First, we augment the simulated training set to be representative of the photometric redshift distribution per supernova class, the cadence of observations, and the flux uncertainty distribution of the test set. Then we build a classifier using the photometric transient classification library snmachine, based on wavelet features obtained from Gaussian process fits, yielding performance similar to the winning PLAsTiCC entry. We study the classification performance for SNe with different properties within a single simulated observing strategy. We find that season length is an important factor, with light curves of 150 days yielding the highest classification performance. Cadence is also crucial for SN classification; events with a median inter-night gap of <3.5 days yield higher performance. Interestingly, we find that large gaps (>10 days) in light curve observations do not impact classification performance as long as sufficient observations are available on either side, owing to the effectiveness of the Gaussian process interpolation. This analysis is the first exploration of the impact of observing strategy on photometric supernova classification with LSST.
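The feature-extraction idea at the heart of this abstract (Gaussian process interpolation followed by a wavelet decomposition) can be illustrated with a short, self-contained sketch. This is not snmachine's actual API: it is a minimal stand-in using the george GP library and PyWavelets, and the function name, kernel scale, grid size, and wavelet settings are illustrative assumptions (the two-level "sym2" stationary transform mirrors the kind of wavelet features the text refers to).

```python
# Minimal sketch (not the snmachine API): fit a Gaussian process to one band
# of a light curve, evaluate it on a regular time grid, and use stationary
# wavelet coefficients of the interpolated curve as classification features.
import numpy as np
import pywt                      # PyWavelets
import george
from george import kernels
from scipy.optimize import minimize

def gp_wavelet_features(mjd, flux, flux_err, n_grid=100, wavelet="sym2", level=2):
    """Return a fixed-length wavelet feature vector for a single-band light curve."""
    # Squared-exponential kernel; the 100 day^2 length-scale is an illustrative guess.
    kernel = np.var(flux) * kernels.ExpSquaredKernel(metric=100.0)
    gp = george.GP(kernel)
    gp.compute(mjd, flux_err)

    # Optimise the kernel hyperparameters by maximising the GP marginal likelihood.
    def nll(p):
        gp.set_parameter_vector(p)
        return -gp.log_likelihood(flux)

    def grad_nll(p):
        gp.set_parameter_vector(p)
        return -gp.grad_log_likelihood(flux)

    result = minimize(nll, gp.get_parameter_vector(), jac=grad_nll)
    gp.set_parameter_vector(result.x)

    # GP interpolation bridges observing gaps; a regular grid makes every event
    # yield a feature vector of the same length (n_grid must be divisible by
    # 2**level for the stationary wavelet transform).
    grid = np.linspace(mjd.min(), mjd.max(), n_grid)
    mean, _ = gp.predict(flux, grid, return_var=True)

    # Two-level stationary wavelet transform -> flattened coefficient vector.
    coeffs = pywt.swt(mean, wavelet, level=level)
    return np.concatenate([c for pair in coeffs for c in pair])
```

The cadence metric discussed above, the median inter-night gap, can be computed from the same inputs as roughly `np.median(np.diff(np.unique(np.floor(mjd))))`, i.e. the median spacing between distinct nights with observations.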


Read also

The commissioning team for the Vera C. Rubin Observatory is planning a set of engineering and science verification observations with the Legacy Survey of Space and Time (LSST) commissioning camera and then the Rubin Observatory LSST Camera. The time frame for these observations is not yet fixed, and the commissioning team will have flexibility in selecting fields to observe. In this document, the Dark Energy Science Collaboration (DESC) Commissioning Working Group presents a prioritized list of target fields appropriate for testing various aspects of DESC-relevant science performance, grouped by season for visibility from Rubin Observatory at Cerro Pachon. Our recommended fields include Deep-Drilling fields (DDFs) to full LSST depth for photo-$z$ and shape calibration purposes, HST imaging fields to full depth for deblending studies, and a $\sim$200 square degree area to 1-year depth in several filters for higher-level validation of wide-area science cases for DESC. We also anticipate that commissioning observations will be needed for template building for transient science over a broad RA range. We include detailed descriptions of our recommended fields along with associated references. We are optimistic that this document will continue to be useful during LSST operations, as it provides a comprehensive list of overlapping data-sets and the references describing them.
We report a framework for spectroscopic follow-up design for optimizing supernova photometric classification. The strategy accounts for the unavoidable mismatch between spectroscopic and photometric samples, and can be used even at the beginning of a new survey -- without any initial training set. The framework falls under the umbrella of active learning (AL), a class of algorithms that aims to minimize labelling costs by identifying a few, carefully chosen, objects which have high potential to improve the classifier predictions. As a proof of concept, we use the simulated data released after the Supernova Photometric Classification Challenge (SNPCC) and a random forest classifier. Our results show that, using only 12% of the number of training objects in the SNPCC spectroscopic sample, this approach is able to double the purity results. Moreover, in order to take into account multiple spectroscopic observations on the same night, we propose a semi-supervised batch-mode AL algorithm which selects a set of the $N=5$ most informative objects each night. In comparison with the initial state using the traditional approach, our method achieves 2.3 times higher purity and comparable figure-of-merit results after only 180 days of observation, or 800 queries (73% of the SNPCC spectroscopic sample size). Such results were obtained using the same amount of spectroscopic time necessary to observe the original SNPCC spectroscopic sample, showing that this type of strategy is feasible with currently available spectroscopic resources. The code used in this work is available in the COINtoolbox: https://github.com/COINtoolbox/ActSNClass .
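The nightly querying loop this abstract describes can be sketched in a few lines. This is a simplified illustration of batch-mode uncertainty sampling, not the actual ActSNClass implementation; the function name, the query rule (least-confident sampling), and the random forest settings are assumptions made for the sketch.

```python
# Illustrative batch-mode active-learning loop: each "night", query the
# `batch` pool objects the classifier is least confident about, add their
# (spectroscopically obtained) labels to the training set, and refit.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def run_active_learning(X_train, y_train, X_pool, y_pool, n_nights=180, batch=5):
    """Pool must hold at least n_nights * batch objects; y_pool stands in
    for labels that would come from spectroscopic follow-up."""
    X_tr, y_tr = X_train.copy(), y_train.copy()
    pool_idx = np.arange(len(X_pool))
    clf = RandomForestClassifier(n_estimators=1000, random_state=42)
    for _ in range(n_nights):
        clf.fit(X_tr, y_tr)
        proba = clf.predict_proba(X_pool[pool_idx])
        # Least-confident sampling: smallest maximum class probability.
        query = pool_idx[np.argsort(proba.max(axis=1))[:batch]]
        X_tr = np.vstack([X_tr, X_pool[query]])
        y_tr = np.concatenate([y_tr, y_pool[query]])
        pool_idx = np.setdiff1d(pool_idx, query)   # queried objects leave the pool
    return clf
```

The "semi-supervised" element of the actual algorithm (exploiting the unlabelled pool itself) is omitted here; the sketch only shows the batch-query mechanics.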
Cosmology with Type Ia supernovae heretofore has required extensive spectroscopic follow-up to establish a redshift. Though tolerable at the present discovery rate, the next generation of ground-based all-sky survey instruments will render this approach unsustainable. Photometry-based redshift determination is a viable alternative, but introduces non-negligible errors that ultimately degrade the ability to discriminate between competing cosmologies. We present a strictly template-based photometric redshift estimator and compute redshift reconstruction errors in the presence of photometry and statistical errors. With reasonable assumptions for a cadence and supernova distribution, these redshift errors are combined with systematic errors and propagated using the Fisher matrix formalism to derive lower bounds on the joint errors in $\Omega_M$ and $\Omega_w$ relevant to the next generation of ground-based all-sky surveys.
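The Fisher-matrix propagation step can be made concrete with a generic numerical sketch. Everything below is an assumption for illustration (the function name and the simple error model in which the photo-z scatter enters the distance modulus through $d\mu/dz$); it shows the mechanics of such a forecast, not the paper's actual estimator.

```python
# Generic Fisher-matrix forecast: given per-SN derivatives of the distance
# modulus mu with respect to the cosmological parameters, and a per-SN error
# budget that includes the photo-z contribution, lower-bound the parameter
# errors via the Cramer-Rao inequality.
import numpy as np

def fisher_forecast(dmu_dtheta, sigma_int, sigma_z, dmu_dz):
    """dmu_dtheta: (N_sn, N_par) array of d(mu)/d(theta_a) per supernova.
    sigma_int: intrinsic magnitude scatter; sigma_z: photo-z scatter,
    propagated into mu through dmu_dz (all per supernova or scalar)."""
    sigma_tot2 = sigma_int**2 + (dmu_dz * sigma_z)**2   # total per-SN variance
    F = (dmu_dtheta.T / sigma_tot2) @ dmu_dtheta        # F_ab = sum_i d_a d_b / s_i^2
    cov = np.linalg.inv(F)                              # parameter covariance
    return np.sqrt(np.diag(cov))                        # marginalised 1-sigma errors
```

Inflating `sigma_z` in this toy model immediately shows how photometric redshift errors degrade the attainable parameter constraints, which is the qualitative point of the abstract.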
Current and future optical and near-infrared wide-field surveys have the potential of finding kilonovae, the optical and infrared counterparts to neutron star mergers, independently of gravitational-wave or high-energy gamma-ray burst triggers. The ability to discover fast and faint transients such as kilonovae largely depends on the area observed, the depth of those observations, the number of re-visits per field in a given time frame, and the filters adopted by the survey; it also depends on the ability to perform rapid follow-up observations to confirm the nature of the transients. In this work, we assess kilonova detectability in existing simulations of the LSST strategy for the Vera C. Rubin Wide Fast Deep survey, with a focus on comparing rolling to baseline cadences. Although currently available cadences can enable the detection of more than 300 kilonovae out to 1400 Mpc over the ten-year survey, we can expect only 3-32 kilonovae similar to GW170817 to be recognizable as fast-evolving transients. We also explore the detectability of kilonovae over the plausible parameter space, focusing on viewing angle and ejecta masses. We find that observations in the redder izy bands are crucial for the identification of nearby (within 300 Mpc) kilonovae that could be spectroscopically classified more easily than more distant sources. Rubin's potential for serendipitous kilonova discovery could be increased by the efficiency gained from individual 30 s exposures (as opposed to 2x15 s snap pairs), by the addition of red-band observations coupled with same-night observations in the g or r bands, and possibly by further development of a new rolling-cadence strategy.
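A detectability assessment like the one above boils down, per simulated event, to counting observations in which the model light curve sits above the survey's 5-sigma limiting depth. The sketch below is a toy version of that criterion; the function and array names are hypothetical stand-ins, not the actual cadence-simulation schema or the paper's pipeline.

```python
# Toy detection count for a fast transient against a simulated cadence:
# how many visits in one band catch the source brighter than the visit's
# 5-sigma limiting magnitude within its short live time?
import numpy as np

def n_detections(obs_mjd, obs_band, obs_depth5, lc_phase, lc_mag, band,
                 t0, duration=7.0):
    """obs_*: per-visit MJD, filter name, and 5-sigma depth from a cadence
    simulation. lc_phase (days since merger, increasing) and lc_mag give the
    model light curve in `band`; t0 is the merger epoch in MJD."""
    sel = (obs_band == band) & (obs_mjd >= t0) & (obs_mjd <= t0 + duration)
    phase = obs_mjd[sel] - t0
    model_mag = np.interp(phase, lc_phase, lc_mag)       # model at visit times
    return int(np.sum(model_mag < obs_depth5[sel]))      # brighter than depth
```

Recognizing an event as a fast-evolving transient would typically require several such detections spread over more than one night or band, which is where cadence choices such as snap pairs versus single 30 s exposures enter.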
Automated photometric supernova classification has become an active area of research in recent years in light of current and upcoming imaging surveys such as the Dark Energy Survey (DES) and the Large Synoptic Survey Telescope, given that spectroscopic confirmation of type for all supernovae discovered will be impossible. Here, we develop a multi-faceted classification pipeline, combining existing and new approaches. Our pipeline consists of two stages: extracting descriptive features from the light curves and classification using a machine learning algorithm. Our feature extraction methods vary from model-dependent techniques, namely SALT2 fits, to more independent techniques fitting parametric models to curves, to a completely model-independent wavelet approach. We cover a range of representative machine learning algorithms, including naive Bayes, k-nearest neighbors, support vector machines, artificial neural networks and boosted decision trees (BDTs). We test the pipeline on simulated multi-band DES light curves from the Supernova Photometric Classification Challenge. Using the commonly used area under the curve (AUC) of the Receiver Operating Characteristic as a metric, we find that the SALT2 fits and the wavelet approach, with the BDTs algorithm, each achieve an AUC of 0.98, where 1 represents perfect classification. We find that a representative training set is essential for good classification, whatever the feature set or algorithm, with implications for spectroscopic follow-up. Importantly, we find that by using either the SALT2 or the wavelet feature sets with a BDT algorithm, accurate classification is possible purely from light curve data, without the need for any redshift information.
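The evaluation stage of such a two-stage pipeline is straightforward to sketch. The following is an illustrative stand-in assuming precomputed features; it uses scikit-learn's gradient boosting in place of the paper's BDTs, scored with the same AUC metric quoted in the text.

```python
# Second pipeline stage as a sketch: train a boosted-tree-style classifier
# on extracted features (SALT2 or wavelet) and score Ia-vs-non-Ia with the
# area under the ROC curve, where AUC = 1.0 is perfect classification.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def auc_for_features(X, y_is_ia, seed=0):
    """X: (N_sn, N_feat) feature matrix; y_is_ia: 1 for type Ia, else 0."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y_is_ia, test_size=0.3, stratify=y_is_ia, random_state=seed)
    clf = GradientBoostingClassifier(n_estimators=200).fit(X_tr, y_tr)
    p_ia = clf.predict_proba(X_te)[:, 1]    # predicted probability of Ia
    return roc_auc_score(y_te, p_ia)
```

Running this separately on each feature set makes the paper's central comparison (SALT2 versus parametric versus wavelet features) a single-line swap of `X`.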