
The human pipeline: distributed data reduction for ALMA

Added by Scott Schnee
Publication date: 2014
Field: Physics
Language: English





Users of the Atacama Large Millimeter/submillimeter Array (ALMA) are provided with calibration and imaging products in addition to raw data. In Cycle 0 and Cycle 1, these products are produced by a team of data reduction experts spread across Chile, East Asia, Europe, and North America. This article discusses the lines of communication between the data reducers and ALMA users that enable this model of distributed data reduction. This article also discusses the calibration and imaging scripts that have been provided to ALMA users in Cycles 0 and 1, and what will be different in future Cycles.



Read More

SOXS is a dual-arm spectrograph (UV-VIS & NIR) with an acquisition camera, due to be mounted on the ESO 3.6m NTT in La Silla. Designed to simultaneously cover the optical and NIR wavelength range from 350-2050 nm, the instrument will be dedicated to the study of transient and variable events, with many Target of Opportunity requests expected. The goal of the SOXS Data Reduction pipeline is to use calibration data to remove all instrument signatures from the SOXS scientific data frames for each of the supported instrument modes, convert these data into physical units, and deliver them with their associated error bars to the ESO SAF as Phase 3 compliant science data products, all within 30 minutes. The primary reduced product will be a detrended, wavelength- and flux-calibrated, telluric-corrected 1D spectrum with the UV-VIS and NIR arms stitched together. The pipeline will also generate QC metrics to monitor telescope, instrument and detector health. The pipeline is written in Python 3 and has been built with an agile development philosophy that includes adaptive planning and evolutionary development. The pipeline is to be used by the SOXS consortium and the general user community that may want to perform tailored processing of SOXS data. Test-driven development has been used throughout the build, using 'extreme' mock data. We aim for the pipeline to be easy to install and extensively and clearly documented.
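The stitching of the two arms described above can be illustrated with a toy sketch: resample both flux-calibrated spectra onto a common wavelength grid and average where they overlap. This is not SOXS code; the function name, grid step, and flux values are all illustrative assumptions.

```python
# Hedged sketch (not the SOXS pipeline): stitch two flux-calibrated
# 1D spectra (a UV-VIS arm and a NIR arm) onto one wavelength grid,
# mean-combining the flux in the overlap region.
import numpy as np

def stitch(wave_a, flux_a, wave_b, flux_b, step=1.0):
    """Resample both arms onto a common grid; average the overlap."""
    lo = min(wave_a.min(), wave_b.min())
    hi = max(wave_a.max(), wave_b.max())
    grid = np.arange(lo, hi + step, step)
    # Mark wavelengths outside each arm's coverage as NaN.
    fa = np.interp(grid, wave_a, flux_a, left=np.nan, right=np.nan)
    fb = np.interp(grid, wave_b, flux_b, left=np.nan, right=np.nan)
    # nanmean keeps single-arm regions and averages the overlap.
    return grid, np.nanmean(np.vstack([fa, fb]), axis=0)

# Illustrative arms: UV-VIS 350-850 nm (flux 1), NIR 800-2050 nm (flux 3).
wa = np.linspace(350.0, 850.0, 501)
wb = np.linspace(800.0, 2050.0, 1251)
grid, flux = stitch(wa, np.ones_like(wa), wb, np.full_like(wb, 3.0))
# In the 800-850 nm overlap the stitched flux is the mean of the arms.
```

A real pipeline would propagate the error bars mentioned above and weight the overlap by inverse variance rather than taking a plain mean.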
D. N. Friedel, 2013
The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years and this paper presents the current capabilities and planned development.
Megan Argo, 2015
Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.
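The "plain text file of input parameters" driving modular stages can be sketched in a few lines. This is a generic illustration, not the e-MERLIN pipeline itself; the parameter names (`do_load`, `do_flag`, etc.) and the `run_pipeline` function are hypothetical.

```python
# Minimal sketch of a stage-based pipeline driven by a plain-text
# parameter file, in the spirit of the e-MERLIN pipeline described
# above. All names here are illustrative, not the actual code.

def parse_inputs(text):
    """Parse 'key = value' lines, ignoring blanks and '#' comments."""
    params = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return params

# The stages run in a fixed order; the user switches them on or off.
STAGES = ["load", "average", "flag", "calibrate", "image"]

def run_pipeline(params):
    """Run only the stages the input file switched on (value '1')."""
    executed = []
    for stage in STAGES:
        if params.get(f"do_{stage}", "0") == "1":
            executed.append(stage)  # real code would call the stage here
    return executed

inputs = """
do_load = 1
do_average = 0   # skip time/frequency averaging this run
do_flag = 1
do_calibrate = 1
do_image = 1
"""
print(run_pipeline(parse_inputs(inputs)))
```

Running stages independently like this lets a user rerun, say, only imaging after inspecting the diagnostic plots, without repeating calibration.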
The Array Camera for Optical to Near-IR Spectrophotometry, or ARCONS, is a camera based on Microwave Kinetic Inductance Detectors (MKIDs), a new technology that has the potential for broad application in astronomy. Using an array of MKIDs, the instrument is able to produce time-resolved imaging and low-resolution spectroscopy constructed from detections of individual photons. The arrival time and energy of each photon are recorded in a manner similar to X-ray calorimetry, but at higher photon fluxes. The technique works over a very large wavelength range, is free from fundamental read noise and dark-current limitations, and provides microsecond-level timing resolution. Since the instrument reads out all pixels continuously while exposing, there is no loss of active exposure time to readout. The technology requires a different approach to data reduction compared to conventional CCDs. We outline here the prototype data reduction pipeline developed for ARCONS, though many of the principles are also more broadly applicable to energy-resolved photon counting arrays (e.g., transition edge sensors, superconducting tunnel junctions). We describe the pipeline's current status and the algorithms and techniques employed in taking data from the arrival of photons at the MKID array to the production of images, spectra, and time-resolved light curves.
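The core idea above, going from a list of per-photon (arrival time, wavelength) events to light curves and spectra, amounts to histogramming the event list. The sketch below uses synthetic events and illustrative bin sizes; it is not ARCONS pipeline code.

```python
# Hedged sketch (not ARCONS code): reduce an energy-resolved photon
# event list into a light curve and a low-resolution spectrum by
# histogramming arrival times and wavelengths.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
arrival_s = rng.uniform(0.0, 10.0, n)          # photon arrival times [s]
wavelength_nm = rng.uniform(400.0, 1100.0, n)  # photon wavelengths [nm]

# Light curve: counts per 0.1 s time bin. The microsecond-level
# timing quoted above would allow far finer binning if needed.
t_bins = np.linspace(0.0, 10.0, 101)           # 100 bins
light_curve, _ = np.histogram(arrival_s, bins=t_bins)

# Low-resolution spectrum: counts per 50 nm wavelength bin.
w_bins = np.linspace(400.0, 1100.0, 15)        # 14 bins
spectrum, _ = np.histogram(wavelength_nm, bins=w_bins)
```

Because every photon carries its own timestamp and energy, the same event list can be re-binned after the fact into images, spectra, or light curves at any resolution, which is the flexibility the abstract contrasts with conventional CCD reduction.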
We present the data reduction pipeline for the Hi-GAL survey. Hi-GAL is a key project of the Herschel satellite which is mapping the inner part of the Galactic plane (|l| ≤ 70° and |b| ≤ 1°), using 2 PACS and 3 SPIRE frequency bands, from 70 μm to 500 μm. Our pipeline relies only partially on the Herschel Interactive Standard Environment (HIPE) and features several newly developed routines to perform data reduction, including accurate data culling, noise estimation and minimum variance map-making, the latter performed with the ROMAGAL algorithm, a deep modification of the ROMA code already tested on cosmological surveys. We discuss in depth the properties of the Hi-GAL Science Demonstration Phase (SDP) data.
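The minimum-variance map-making mentioned above can be reduced, in its simplest form, to an inverse-noise-weighted average per map pixel: m_p = (Σ_i d_i/σ_i²) / (Σ_i 1/σ_i²) over the time-ordered samples i falling in pixel p. The toy below shows only that weighting step; it is not the ROMAGAL algorithm, which additionally handles correlated noise, and the function name and data are assumptions.

```python
# Toy illustration of the minimum-variance (inverse-noise-weighted)
# binning step of map-making, not the ROMAGAL algorithm itself.
import numpy as np

def minvar_map(pix, data, sigma, npix):
    """Weighted mean of time-ordered samples per map pixel:
    m_p = sum(d_i / sigma_i^2) / sum(1 / sigma_i^2)."""
    w = 1.0 / sigma**2
    num = np.bincount(pix, weights=w * data, minlength=npix)
    den = np.bincount(pix, weights=w, minlength=npix)
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(den > 0, num / den, np.nan)  # NaN = unobserved

# Pixel 0 is sampled twice with equal noise (values 1 and 3 -> mean 2);
# pixel 1 is sampled once.
pix = np.array([0, 0, 1])
data = np.array([1.0, 3.0, 5.0])
sigma = np.array([1.0, 1.0, 2.0])
m = minvar_map(pix, data, sigma, npix=2)
print(m)
```

Noisier samples (larger σ) are down-weighted, so the map variance in each pixel is minimised among all linear unbiased combinations of the samples hitting it.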
