
GPUs for data processing in the MWA

Posted by Stephen Ord
Publication date: 2009
Research field: Physics
Language: English
Author: S. Ord





The MWA is a next-generation radio interferometer under construction in remote Western Australia. The data rate from the correlator makes storing the raw data infeasible, so the data must be processed in real-time. The processing task is of order ~10 TFLOPS. The remote location of the MWA limits the power that can be allocated to computing. We describe the design and implementation of elements of the MWA real-time data processing system which leverage the computing abilities of modern graphics processing units (GPUs). The matrix algebra and texture mapping capabilities of GPUs are well suited to the majority of tasks involved in real-time calibration and imaging. Considerable performance advantages over a conventional CPU-based reference implementation are obtained.
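The abstract singles out the GPU's matrix-algebra capabilities as a good fit for real-time calibration. As a rough illustration of that kind of workload (not the MWA real-time system itself), the sketch below applies per-baseline 2x2 Jones-matrix corrections to a batch of cross-correlations as one batched matrix multiply; the function name, array shapes, and baseline count are assumptions, and on a GPU the same operation would map onto a batched GEMM.

```python
import numpy as np

def apply_jones_calibration(vis, j_p, j_q):
    """Apply per-baseline Jones-matrix corrections to visibilities.

    vis        : (n_baselines, 2, 2) complex cross-correlations V_pq
    j_p, j_q   : (n_baselines, 2, 2) complex Jones matrices for the two
                 antennas of each baseline
    Returns J_p^{-1} V_pq (J_q^{-1})^H for every baseline in one batched
    matrix multiply -- the kind of dense 2x2 algebra that vectorises well
    on GPU hardware.
    """
    j_p_inv = np.linalg.inv(j_p)
    j_q_inv_h = np.conj(np.transpose(np.linalg.inv(j_q), (0, 2, 1)))
    return j_p_inv @ vis @ j_q_inv_h

# Toy usage with random test data (dimensions are illustrative only).
rng = np.random.default_rng(0)
n_bl = 8128
vis = rng.standard_normal((n_bl, 2, 2)) + 1j * rng.standard_normal((n_bl, 2, 2))
jones = np.tile(np.eye(2, dtype=complex), (n_bl, 1, 1))
calibrated = apply_jones_calibration(vis, jones, jones)
```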




Read also

The Murchison Widefield Array is a low-frequency Square Kilometre Array precursor located at the Murchison Radio-astronomy Observatory in Western Australia. Although primarily designed as an imaging telescope, it has a flexible signal path, and its capabilities have recently been extended to include off-line incoherent and tied-array beam formation using recorded antenna voltages. This provides high time- and frequency-resolution observations, including a pulsar science program. This paper describes the algorithms and pipeline that we have developed to form the tied-array beam products from the summation of the calibrated signals of the antenna elements, and presents example polarimetric profiles for PSRs J0437-4715 and J1900-2600 at 185 MHz.
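As a rough sketch of the tied-array idea described above (not the MWA beamformer code itself), the snippet below calibrates, phase-rotates, and sums a set of antenna voltage streams toward a single sky direction for one frequency channel; the function name, the sign convention of the phase rotation, and the toy dimensions are assumptions made for illustration.

```python
import numpy as np

def tied_array_beam(voltages, gains, delays, freq_hz):
    """Coherently sum calibrated antenna voltages toward one direction.

    voltages : (n_ant, n_samples) complex voltage streams
    gains    : (n_ant,) complex calibration gains per antenna
    delays   : (n_ant,) geometric delays (seconds) toward the target
    freq_hz  : channel centre frequency in Hz

    Each stream is calibrated, phase-rotated to the target direction,
    and summed, which is the basic operation behind a tied-array beam.
    """
    phasors = np.exp(-2j * np.pi * freq_hz * delays)
    weights = np.conj(gains) * phasors      # illustrative weighting convention
    return np.tensordot(weights, voltages, axes=1)

# Toy usage: 128 elements, 1024 samples at 185 MHz (values are illustrative).
rng = np.random.default_rng(1)
v = rng.standard_normal((128, 1024)) + 1j * rng.standard_normal((128, 1024))
beam = tied_array_beam(v, np.ones(128, dtype=complex), np.zeros(128), 185e6)
```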
Processing of raw data from modern astronomical instruments is nowadays often carried out using dedicated software, so-called pipelines, which are largely run in automated operation. In this paper we describe the data reduction pipeline of the Multi Unit Spectroscopic Explorer (MUSE) integral field spectrograph operated at ESO's Paranal Observatory. This spectrograph is a complex machine: it records data from 1152 separate spatial elements on detectors in its 24 integral field units. Efficiently handling such data requires sophisticated software, a high degree of automation, and parallelization. We describe in detail the algorithms of all processing steps that operate on calibrations and science data, and explain how the raw science data are transformed into calibrated datacubes. Finally, we check the quality of selected procedures and output data products, and demonstrate that the pipeline provides datacubes ready for scientific analysis.
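One concrete aspect of the parallelization mentioned above is that the 24 IFUs can be reduced independently before the results are combined into a datacube. The minimal Python sketch below illustrates that fan-out under assumed names; the functions, file names, and reduction steps are placeholders, not the actual MUSE pipeline API.

```python
from concurrent.futures import ProcessPoolExecutor

N_IFU = 24  # MUSE records data in 24 integral field units

def reduce_ifu(ifu_id):
    """Placeholder for the per-IFU reduction (bias, flat, wavelength cal)."""
    # ... load the raw frames for this IFU and apply its calibrations ...
    return f"ifu{ifu_id:02d}_reduced.fits"   # hypothetical output name

def reduce_exposure():
    # Each IFU can be processed independently, so the work is fanned out
    # across processes before the per-IFU products are merged into a cube.
    with ProcessPoolExecutor() as pool:
        return list(pool.map(reduce_ifu, range(1, N_IFU + 1)))

if __name__ == "__main__":
    print(reduce_exposure())
```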
Context: The HIFI instrument on the Herschel Space Observatory performed over 9100 astronomical observations, almost 900 of which were calibration observations, in the course of the nearly four-year Herschel mission. The data from each observation had to be converted from raw telemetry into calibrated products and included in the Herschel Science Archive. Aims: The HIFI pipeline was designed to provide robust conversion from raw telemetry into calibrated data throughout all phases of the HIFI mission; pre-launch laboratory testing was supported, as were routine mission operations. Methods: A modular software design allowed components to be easily added, removed, amended, and/or extended as the understanding of the HIFI data developed during and after mission operations. Results: The HIFI pipeline processed data from all HIFI observing modes within the Herschel automated processing environment as well as within an interactive environment. The same software can be used by the general astronomical community to reprocess any standard HIFI observation. The pipeline also recorded the consistency of processing results and provided automated quality reports. Many pipeline modules had been in use since the HIFI pre-launch instrument-level testing. Conclusions: Processing in steps facilitated data analysis to discover and address instrument artefacts and uncertainties. The availability of the same pipeline components from pre-launch throughout the mission made for well-understood, tested, and stable processing. A smooth transition from one phase to the next significantly enhanced processing reliability and robustness.
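A minimal sketch of the modular design described above: if each processing stage is an interchangeable step, stages can be added, removed, or amended without touching the rest of the pipeline. The step names below are invented for illustration and are not the HIFI pipeline's actual modules.

```python
def remove_spurs(data):
    # Placeholder stage: flag known spurious spectral features.
    return data

def apply_bandpass(data):
    # Placeholder stage: divide by an internal-load bandpass.
    return data

# The pipeline is an ordered list of steps, so a stage can be inserted,
# replaced, or dropped as understanding of the instrument evolves.
LEVEL1_STEPS = [remove_spurs, apply_bandpass]

def run_pipeline(data, steps=LEVEL1_STEPS):
    for step in steps:
        data = step(data)
    return data

print(run_pipeline({"spectrum": [1.0, 2.0, 3.0]}))
```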
Shifan Zuo, Jixia Li, Yichao Li (2020)
The Tianlai project is a 21cm intensity mapping experiment aimed at detecting dark energy by measuring the baryon acoustic oscillation (BAO) features in the large-scale structure power spectrum. This experiment provides an opportunity to test data processing methods for cosmological 21cm signal extraction, which remains a great challenge in current radio astronomy research. The 21cm signal is much weaker than the foregrounds and is easily affected by imperfections in the instrumental responses. Furthermore, processing the large volumes of interferometer data poses a practical challenge. We have developed a data processing pipeline software called tlpipe to process the drift scan survey data from the Tianlai experiment. It performs offline data processing tasks such as radio frequency interference (RFI) flagging, array calibration, binning, and map-making. It also includes utility functions needed for the data analysis, such as data selection, transformation, and visualization. A number of new algorithms are implemented, for example the eigenvector decomposition method for array calibration and Tikhonov regularization for $m$-mode analysis. In this paper we describe the design and implementation of tlpipe and illustrate its functions with some analysis of real data. Finally, we outline directions for future development of this publicly available code.
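As a generic illustration of one of the algorithms named above, the snippet below solves a Tikhonov-regularized least-squares problem of the kind used in m-mode analysis, where a damping term stabilises poorly constrained sky modes. It is a minimal sketch under assumed notation, not the tlpipe implementation.

```python
import numpy as np

def tikhonov_solve(A, v, eps):
    """Solve min_a ||A a - v||^2 + eps ||a||^2 via the normal equations.

    In an m-mode analysis, A would play the role of the beam-transfer
    matrix for one m and v the measured visibility mode; eps damps the
    modes that the instrument barely constrains.
    """
    n = A.shape[1]
    lhs = A.conj().T @ A + eps * np.eye(n)
    rhs = A.conj().T @ v
    return np.linalg.solve(lhs, rhs)

# Toy usage with a random, under-determined operator (illustrative only).
rng = np.random.default_rng(2)
A = rng.standard_normal((50, 80))   # more unknowns than measurements
v = rng.standard_normal(50)
a_hat = tikhonov_solve(A, v, eps=1e-2)
```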
Euclid is a Europe-led cosmology space mission dedicated to a visible and near-infrared survey of the entire extragalactic sky. Its purpose is to deepen our knowledge of the dark content of our Universe. After an overview of the Euclid mission and science, this contribution describes how the community is getting organized to face the data analysis challenges, both in software development and in operational data processing. It ends with a more specific account of some of the main contributions of the Swiss Science Data Center (SDC-CH).