
The MACHO Data Pipeline

Published by: Tim Axelrod
Publication date: 1995
Research field: Physics
Paper language: English
Author: T. S. Axelrod

The MACHO experiment is searching for dark matter in the halo of the Galaxy by monitoring more than 20 million stars in the LMC and Galactic bulge for gravitational microlensing events. The hardware consists of a 50 inch telescope, a two-color 32 megapixel CCD camera, and a network of computers. On clear nights the system generates up to 8 GB of raw data and 1 GB of reduced data. The computer system is responsible for all real-time control tasks, for data reduction, and for storing all data associated with each observation in a database. The subject of this paper is the software system that handles these functions. It is an integrated system controlled by Petri nets that consists of multiple processes communicating via mailboxes and a bulletin board. The system is highly automated, readily extensible, and incorporates flexible error recovery capabilities. It is implemented with C++ in a Unix environment.
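The control scheme the abstract describes, in which processes advance only when their inputs are ready, is the classic use of a Petri net. The sketch below is a minimal illustration of the idea, not the MACHO implementation (which is in C++): the place and transition names are hypothetical.

```python
# Minimal Petri-net sketch: a transition fires only when every one of
# its input "places" holds a token, consuming input tokens and
# producing output tokens.  Names here are hypothetical illustrations.
class PetriNet:
    def __init__(self):
        self.tokens = {}          # place name -> token count
        self.transitions = {}     # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)
        for p in inputs + outputs:
            self.tokens.setdefault(p, 0)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.tokens[p] > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            return False
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.tokens[p] -= 1
        for p in outputs:
            self.tokens[p] += 1
        return True

# Hypothetical fragment of an observing cycle: a raw frame must be
# reduced before its result can be archived.
net = PetriNet()
net.add_transition("reduce", ["raw_frame"], ["reduced_frame"])
net.add_transition("archive", ["reduced_frame"], ["archived"])
net.tokens["raw_frame"] = 1

net.fire("reduce")
net.fire("archive")
print(net.tokens["archived"])   # -> 1
```

Because each transition's preconditions are explicit, a controller built this way can recover from errors by re-injecting a token at the failed stage rather than restarting the whole pipeline.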


Read also

47 - D. P. Bennett 1993
MAssive Compact Halo Objects such as brown dwarfs, Jupiters, and black holes are prime candidates to comprise the dark halo of our galaxy. Paczynski noted that objects (dubbed MACHOs) with masses in the range $10^{-6}\,M_\odot < M \lesssim 100\,M_\odot$ can be detected via gravitational microlensing of stars in the Magellanic Clouds, with the caveat that only about one in $10^6$ stars will be lensed at any given time. Our group has recently begun a search for microlensing using a refurbished 1.27 meter telescope at the Mount Stromlo Observatory in Australia. Since the summer of 1992, we have been imaging up to $10^7$ stars a night in the Large Magellanic Cloud using our large format two-color $3.4\times 10^7$ pixel CCD camera. Here I report on our first results based on an analysis of $\sim 10^6$ of these stars. Although this is not enough data to make definitive statements about the nature of the dark matter, we are able to conclude that the rate of variable star background events is not larger than the expected MACHO signal.
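The microlensing signal these surveys look for is the standard point-lens magnification, which depends only on the impact parameter $u$ of the source in units of the Einstein radius. A short sketch of that textbook formula (not code from either paper):

```python
import math

def magnification(u):
    """Paczynski point-lens magnification for impact parameter u in
    units of the Einstein radius: A(u) = (u^2 + 2) / (u * sqrt(u^2 + 4))."""
    return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))

# A source passing inside the Einstein radius (u <= 1) is magnified by
# at least A(1) = 3/sqrt(5) ~ 1.34, a common event-detection threshold.
print(round(magnification(1.0), 4))   # -> 1.3416
```

The rarity quoted above follows from geometry: only sources whose lines of sight pass within an Einstein radius of a MACHO are appreciably magnified, giving an optical depth of order $10^{-6}$ toward the LMC.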
SOXS is a dual-arm spectrograph (UV-VIS & NIR) and acquisition camera (AC) due to be mounted on the ESO 3.6m NTT in La Silla. Designed to simultaneously cover the optical and NIR wavelength range from 350-2050 nm, the instrument will be dedicated to the study of transient and variable events, with many Target of Opportunity requests expected. The goal of the SOXS Data Reduction pipeline is to use calibration data to remove all instrument signatures from the SOXS scientific data frames for each of the supported instrument modes, convert these data into physical units, and deliver them with their associated error bars to the ESO SAF as Phase 3 compliant science data products, all within 30 minutes. The primary reduced product will be a detrended, wavelength and flux calibrated, telluric corrected 1D spectrum with UV-VIS + NIR arms stitched together. The pipeline will also generate QC metrics to monitor telescope, instrument and detector health. The pipeline is written in Python 3 and has been built with an agile development philosophy that includes adaptive planning and evolutionary development. The pipeline is to be used by the SOXS consortium and the general user community that may want to perform tailored processing of SOXS data. Test driven development has been used throughout the build, using 'extreme' mock data. We aim for the pipeline to be easy to install and extensively and clearly documented.
Transient radio phenomena and pulsars are one of six LOFAR Key Science Projects (KSPs). As part of the Transients KSP, the Pulsar Working Group (PWG) has been developing the LOFAR Pulsar Data Pipelines to both study known pulsars and search for new ones. The pipelines are being developed for the Blue Gene/P (BG/P) supercomputer and a large Linux cluster in order to utilize enormous computational capacity (50 Tflops) to process data streams of up to 23 TB/hour. The LOFAR pipeline output will use the Hierarchical Data Format 5 (HDF5) to efficiently store large amounts of numerical data, and to manage complex data encompassing a variety of data types, across distributed storage and processing architectures. We present the LOFAR Known Pulsar Data Pipeline overview, the pulsar beam-formed data format, the status of the pipeline processing, as well as our future plans for developing the LOFAR Pulsar Search Pipeline. These LOFAR pipelines and software tools are being developed as the next generation toolset for pulsar processing in Radio Astronomy.
75 - D. N. Friedel 2013
The Combined Array for Millimeter-wave Astronomy (CARMA) data reduction pipeline (CADRE) has been developed to give investigators a first look at a fully reduced set of their data. It runs automatically on all data produced by the telescope as they arrive in the CARMA data archive. CADRE is written in Python and uses Python wrappers for MIRIAD subroutines for direct access to the data. It goes through the typical reduction procedures for radio telescope array data and produces a set of continuum and spectral line maps in both MIRIAD and FITS format. CADRE has been in production for nearly two years and this paper presents the current capabilities and planned development.
156 - Megan Argo 2015
Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via Github.
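A pipeline driven by a plain text file of input parameters, with stages the user can switch on and off, can be sketched as below. The parameter names and stage list are hypothetical illustrations, not taken from the e-MERLIN pipeline.

```python
# Sketch of a plain-text parameter file selecting which pipeline
# stages to run.  Keys, stages, and file format are hypothetical.
def parse_inputs(text):
    """Parse simple 'key = value' lines, ignoring blanks and # comments."""
    params = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return params

def stages_to_run(params):
    """Return the enabled stages, in fixed pipeline order."""
    all_stages = ["load", "average", "flag", "calibrate", "image"]
    return [s for s in all_stages if params.get("do_" + s, "no") == "yes"]

inputs = """
# hypothetical input file
do_load      = yes
do_average   = no
do_flag      = yes
do_calibrate = yes
do_image     = yes
"""

print(stages_to_run(parse_inputs(inputs)))
# -> ['load', 'flag', 'calibrate', 'image']
```

Keeping the stage order fixed in code while the text file only enables or disables stages is what lets such a pipeline be re-run from any intermediate point without the user having to know the internal ordering.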