
Pipe3D, a pipeline to analyze Integral Field Spectroscopy data: I. New fitting philosophy of FIT3D

Posted by: Sebastian F. Sanchez
Publication date: 2015
Research field: Physics
Paper language: English





We present an improved version of FIT3D, a fitting tool for the analysis of the spectroscopic properties of the stellar populations and the ionized gas derived from moderate resolution spectra of galaxies. FIT3D is a tool developed to analyze Integral Field Spectroscopy data and it is the basis of Pipe3D, a pipeline already used in the analysis of datasets like CALIFA, MaNGA, and SAMI. We describe the philosophy behind the fitting procedure, and in detail each of the different steps in the analysis. We present an extensive set of simulations in order to estimate the precision and accuracy of the derived parameters for the stellar populations. In summary, we find that using different stellar population templates we reproduce the mean properties of the stellar population (age, metallicity, and dust attenuation) within ~0.1 dex. A similar approach is adopted for the ionized gas, where a set of simulated emission-line systems was created. Finally, we compare the results of the analysis using FIT3D with those provided by other widely used packages for the analysis of the stellar population (Starlight, Steckmap, and analysis based on stellar indices) using real high S/N data. In general we find that the parameters for the stellar populations derived by FIT3D are fully compatible with those derived using these other tools.
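The core idea behind this kind of fit is a decomposition of the observed spectrum into a non-negative combination of stellar population templates, from which luminosity-weighted quantities such as age and metallicity follow. The Python sketch below illustrates that idea only; the function name fit_ssp, the simple power-law attenuation, and the template grid are illustrative assumptions and do not reproduce the actual FIT3D implementation, which also handles kinematics, a non-linear dust fit, and Monte Carlo error propagation.

# Minimal sketch of a stellar-population fit of the kind described above:
# a non-negative linear combination of SSP templates, from which
# luminosity-weighted age and metallicity follow. Names and the attenuation
# law are illustrative assumptions, not the FIT3D implementation.
import numpy as np
from scipy.optimize import nnls

def fit_ssp(wave, flux, templates, ages, metallicities, a_v=0.0):
    """Fit an observed spectrum with a non-negative combination of SSP templates.

    wave                : (n_pix,) wavelength grid shared by data and templates [Angstrom]
    flux                : (n_pix,) observed spectrum
    templates           : (n_ssp, n_pix) SSP spectra resampled to `wave`
    ages, metallicities : (n_ssp,) age [Gyr] and metallicity of each template
    a_v                 : dust attenuation applied to every template (toy power law)
    """
    # Apply a simple power-law attenuation to the templates (illustrative only).
    atten = 10.0 ** (-0.4 * a_v * (wave / 5500.0) ** -0.7)
    basis = templates * atten[None, :]

    # Non-negative least squares gives the light fraction of each SSP.
    weights, _ = nnls(basis.T, flux)
    model = weights @ basis

    # Luminosity-weighted mean properties, the quantities quoted to ~0.1 dex above.
    frac = weights / weights.sum()
    lw_log_age = np.sum(frac * np.log10(ages))
    lw_log_z = np.sum(frac * np.log10(metallicities))
    return model, weights, lw_log_age, lw_log_z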




Read also

Neil Zimmerman, 2011
Project 1640 is a high contrast near-infrared instrument probing the vicinities of nearby stars through the unique combination of an integral field spectrograph with a Lyot coronagraph and a high-order adaptive optics system. The extraordinary data reduction demands, similar to those which several new exoplanet imaging instruments will face in the near future, have been met by the novel software algorithms described herein. The Project 1640 Data Cube Extraction Pipeline (PCXP) automates the translation of 3.8*10^4 closely packed, coarsely sampled spectra to a data cube. We implement a robust empirical model of the spectrograph focal plane geometry to register the detector image at sub-pixel precision, and map the cube extraction. We demonstrate our ability to accurately retrieve source spectra based on an observation of Saturn's moon Titan.
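As a minimal illustration of the cube-extraction step that such a pipeline automates, the Python sketch below aperture-extracts a single lenslet spectrum from a detector frame, given trace positions supplied by a registered geometry model. The trace format, the aperture size, and the function name extract_lenslet are assumptions for illustration; the actual PCXP models the focal-plane geometry empirically and handles ~3.8*10^4 such spectra.

# Hedged sketch: sum detector counts in a small box along one lenslet's
# pre-registered trace to recover that lenslet's spectrum.
import numpy as np

def extract_lenslet(frame, trace_xy, half_width=1):
    """Aperture-extract one lenslet spectrum from a 2-D detector frame.

    frame    : (ny, nx) detector image
    trace_xy : (n_lambda, 2) integer (y, x) positions of the spectrum,
               one per wavelength channel, from the registered geometry model
    """
    spectrum = np.empty(len(trace_xy))
    for k, (y, x) in enumerate(trace_xy):
        box = frame[y - half_width: y + half_width + 1,
                    x - half_width: x + half_width + 1]
        spectrum[k] = box.sum()
    return spectrum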
SITELLE is a novel integral field unit spectroscopy instrument with an impressive field of view (11 by 11 arcmin), spectral coverage, and spectral resolution (R = 1-20000). SIGNALS is anticipated to obtain deep observations (down to 3.6x10^-17 erg s^-1 cm^-2) of 40 galaxies, each needing complex and substantial time to extract spectral information. We present a method that uses Convolutional Neural Networks (CNN) for estimating emission line parameters in optical spectra obtained with SITELLE as part of the SIGNALS large program. Our algorithm is trained and tested on synthetic data representing typical emission spectra for HII regions based on Mexican Million Models database (3MdB) BOND simulations. The network's activation map demonstrates its ability to extract the dynamical (broadening and velocity) parameters from a set of 5 emission lines (e.g. Hα, the [NII] doublet, and the [SII] doublet) in the SN3 (651-685 nm) filter of SITELLE. Once trained, the algorithm was tested on real SITELLE observations in the SIGNALS program of one of the southwest fields of M33. The CNN recovers the dynamical parameters with an accuracy better than 5 km s^-1 in regions with a signal-to-noise ratio greater than 15 over the Hα line. More importantly, our CNN method reduces calculation time by over an order of magnitude on the spectral cube with native spatial resolution when compared with standard fitting procedures. These results clearly illustrate the power of machine learning algorithms for use in future IFU-based missions. Subsequent work will explore the applicability of the methodology to other spectral parameters such as the flux of key emission lines.
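A minimal sketch of the kind of 1-D convolutional regressor described above is given below in Python (PyTorch): it maps a single spectrum to two dynamical parameters, velocity and broadening. The layer sizes, the assumed input length of 400 channels, and the class name LineParamCNN are illustrative assumptions, not the SIGNALS network itself.

# Toy 1-D CNN that regresses (velocity, broadening) from a spectrum.
import torch
import torch.nn as nn

class LineParamCNN(nn.Module):
    def __init__(self, n_channels=400):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_channels // 4), 64), nn.ReLU(),
            nn.Linear(64, 2),          # outputs: velocity, broadening
        )

    def forward(self, x):              # x: (batch, 1, n_channels)
        return self.head(self.features(x))

# Training on synthetic spectra (e.g. 3MdB-based models, as in the abstract)
# would then be a standard regression loop with an MSE loss on both outputs.
model = LineParamCNN()
fake_batch = torch.randn(8, 1, 400)    # stand-in for normalised spectra
predictions = model(fake_batch)        # shape (8, 2)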
We present the data reduction pipeline for CHARIS, a high-contrast integral-field spectrograph for the Subaru Telescope. The pipeline constructs a ramp from the raw reads using the measured nonlinear pixel response, and reconstructs the data cube using one of three extraction algorithms: aperture photometry, optimal extraction, or $\chi^2$ fitting. We measure and apply both a detector flatfield and a lenslet flatfield and reconstruct the wavelength- and position-dependent lenslet point-spread function (PSF) from images taken with a tunable laser. We use these measured PSFs to implement a $\chi^2$-based extraction of the data cube, with typical residuals of ~5% due to imperfect models of the undersampled lenslet PSFs. The full two-dimensional residual of the $\chi^2$ extraction allows us to model and remove correlated read noise, dramatically improving CHARIS performance. The $\chi^2$ extraction produces a data cube that has been deconvolved with the line-spread function, and never performs any interpolations of either the data or the individual lenslet spectra. The extracted data cube also includes uncertainties for each spatial and spectral measurement. The CHARIS software is parallelized, written in Python and Cython, and freely available on GitHub with a separate documentation page. Astrometric and spectrophotometric calibrations of the data cubes and PSF subtraction will be treated in a forthcoming paper.
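The $\chi^2$ extraction can be summarised as a weighted linear least-squares problem: the counts in a lenslet's postage stamp are modelled as a linear combination of the measured per-wavelength lenslet PSFs, and the fluxes with their covariance follow from the normal equations. The Python sketch below shows that idea under simple assumptions (independent Gaussian pixel errors, no correlated read-noise model); the array shapes and the name chi2_extract are illustrative, not the CHARIS pipeline API.

# Hedged sketch of chi^2 extraction of one lenslet spectrum.
import numpy as np

def chi2_extract(stamp, ivar, psf_models):
    """Extract a lenslet spectrum by chi^2 fitting.

    stamp      : (ny, nx) detector cutout around one lenslet
    ivar       : (ny, nx) inverse variance of each pixel
    psf_models : (n_lambda, ny, nx) measured lenslet PSF at each wavelength
    returns    : fluxes (n_lambda,) and their covariance matrix
    """
    n_lam = psf_models.shape[0]
    A = psf_models.reshape(n_lam, -1)          # design matrix, one row per wavelength
    w = ivar.ravel()
    d = stamp.ravel()

    # Weighted normal equations: (A W A^T) f = A W d
    AtWA = (A * w) @ A.T
    AtWd = (A * w) @ d
    cov = np.linalg.inv(AtWA)                  # flux covariance (Gaussian errors assumed)
    fluxes = cov @ AtWd
    return fluxes, cov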
We present a new technique to fit color-magnitude diagrams of open clusters based on the Cross-Entropy global optimization algorithm. The method uses theoretical isochrones available in the literature and maximizes a weighted likelihood function based on distances measured in the color-magnitude space. The weights are obtained through a non-parametric technique that takes into account, among other factors, each star's distance to the observed center of the cluster, the observed magnitude uncertainties, and the stellar density profile of the cluster. The parameters determined simultaneously are distance, reddening, age, and metallicity. The method takes the binary fraction into account and uses a Monte Carlo approach to obtain uncertainties on the determined cluster parameters by running the fitting algorithm many times on data sets re-sampled through a bootstrapping procedure. We present results for 9 well-studied open clusters, based on 15 distinct data sets, and show that the results are consistent with previous studies. The method is shown to be reliable and free of the subjectivity of most previous visual isochrone fitting techniques.
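For readers unfamiliar with the Cross-Entropy method, the Python sketch below shows the basic optimisation loop under simple assumptions: candidate (distance, reddening, age, metallicity) vectors are drawn from a Gaussian, the best-scoring elite fraction is kept, and the sampling distribution is re-centred on it. The score callable is a placeholder, not the paper's weighted likelihood, and the bootstrap step for uncertainties is only indicated in a comment.

# Hedged sketch of a Cross-Entropy optimisation loop.
import numpy as np

def cross_entropy_fit(score, mean, std, n_samples=200, elite_frac=0.1,
                      n_iter=50, seed=None):
    """Maximise `score(params)` with the Cross-Entropy method.

    score     : callable mapping a parameter vector to a (log-)likelihood
    mean, std : initial centre and width of the Gaussian sampling distribution
    """
    rng = np.random.default_rng(seed)
    mean, std = np.asarray(mean, float), np.asarray(std, float)
    n_elite = max(1, int(elite_frac * n_samples))

    for _ in range(n_iter):
        samples = rng.normal(mean, std, size=(n_samples, mean.size))
        scores = np.array([score(p) for p in samples])
        elite = samples[np.argsort(scores)[-n_elite:]]      # highest-likelihood draws
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-9
    return mean

# Bootstrap uncertainties, as described above, would repeat this fit on
# re-sampled versions of the colour-magnitude diagram and take the spread.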
A fully autonomous data reduction pipeline has been developed for FRODOSpec, an optical fibre-fed integral field spectrograph currently in use at the Liverpool Telescope. This paper details the process required for the reduction of data taken using an integral field spectrograph and presents an overview of the computational methods implemented to create the pipeline. Analysis of errors and possible future enhancements are also discussed.