
A novel LIDAR-based Atmospheric Calibration Method for Improving the Data Analysis of MAGIC

Added by Christian Fruck
Publication date: 2014
Field: Physics
Language: English





A new method for analyzing the returns of the custom-made micro-LIDAR system operated alongside the two MAGIC telescopes makes it possible to apply atmospheric corrections in the MAGIC data analysis chain. Such corrections extend the effective observation time of MAGIC under adverse atmospheric conditions and reduce the systematic errors on energy and flux in the data analysis. The LIDAR provides a range-resolved atmospheric backscatter profile, from which the extinction of Cherenkov light from air-shower events can be estimated. Knowledge of the extinction allows the true image parameters, including energy and flux, to be reconstructed. Our final goal is to recover the source-intrinsic energy spectrum even for data affected by atmospheric extinction from aerosol layers, such as clouds.
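The correction chain sketched in the abstract (backscatter profile → extinction → transmission → corrected image parameters) can be illustrated with a minimal numerical sketch. All names, the fixed lidar ratio, and the treatment of slant range as height are illustrative assumptions, not the MAGIC analysis itself:

```python
import numpy as np

def transmission_profile(ranges_m, backscatter, lidar_ratio=25.0):
    """One-way optical transmission from the ground to each range gate.

    Assumes extinction alpha = lidar_ratio * backscatter, a common
    single-scattering aerosol approximation; the value 25 sr is
    purely illustrative.
    """
    alpha = lidar_ratio * np.asarray(backscatter)   # extinction [1/m]
    dr = np.diff(ranges_m, prepend=0.0)             # gate widths [m]
    tau = np.cumsum(alpha * dr)                     # optical depth
    return np.exp(-tau)                             # one-way transmission

def correct_energy(e_reco, emission_height_m, ranges_m, trans):
    """Scale a reconstructed energy up by the fraction of Cherenkov
    light lost between the emission height and the telescope."""
    t = np.interp(emission_height_m, ranges_m, trans)
    return e_reco / t
```

For a clear atmosphere (zero backscatter) the transmission is 1 everywhere and energies are unchanged; an aerosol layer lowers the transmission above it and raises the corrected energy accordingly.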



Related research

M. Janssen, C. Goddi, H. Falcke (2019)
Currently, HOPS and AIPS are the primary choices for the time-consuming process of (millimeter) Very Long Baseline Interferometry (VLBI) data calibration. However, for a full end-to-end pipeline, they either lack the ability to perform easily scriptable incremental calibration or do not provide full control over the workflow with the ability to manipulate and edit calibration solutions directly. The Common Astronomy Software Application (CASA) offers all these abilities, together with a secure development future and an intuitive Python interface, which is very attractive for young radio astronomers. Inspired by the recent addition of a global fringe-fitter, the capability to convert FITS-IDI files to measurement sets, and amplitude calibration routines based on ANTAB metadata, we have developed the CASA-based Radboud PIpeline for the Calibration of high Angular Resolution Data (rPICARD). The pipeline will be able to handle data from multiple arrays: EHT, GMVA, VLBA, and the EVN in the first release. Polarization and phase-referencing calibration are supported, and a spectral line mode will be added in the future. The large bandwidths of future radio observatories call for scalable reduction software. Within CASA, a message passing interface (MPI) implementation is used for parallelization, reducing the total time needed for processing. The most significant gain is obtained for the time-consuming fringe-fitting task, where each scan can be processed in parallel.
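The per-scan parallelism described above works because scans are mutually independent. A minimal sketch of the pattern, with threads standing in for the MPI workers CASA actually uses and a dummy averaging step standing in for the real delay/rate search (all names are hypothetical):

```python
from concurrent.futures import ThreadPoolExecutor

def fringe_fit_scan(scan):
    """Stand-in for a per-scan fringe fit: real code would search
    delay/rate space over the scan's visibilities; here we simply
    average them to keep the sketch self-contained."""
    scan_id, vis = scan
    return scan_id, sum(vis) / len(vis)

def fit_all_scans(scans, workers=4):
    """Fit every scan concurrently; results keep their scan ids, so
    the order in which workers finish does not matter."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(fringe_fit_scan, scans))
```

Because each worker touches only its own scan, the speed-up scales with the number of scans until I/O or the number of workers becomes the bottleneck.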
Y. Ohtani, A. Berti, D. Depaoli (2021)
The Cherenkov Telescope Array (CTA) will be the next generation gamma-ray observatory, which will consist of three kinds of telescopes of different sizes. Among those, the Large Size Telescope (LST) will be the most sensitive in the low energy range starting from 20 GeV. The prototype LST (LST-1) proposed for CTA was inaugurated in October 2018 in the northern hemisphere site, La Palma (Spain), and is currently in its commissioning phase. MAGIC is a system of two gamma-ray Cherenkov telescopes of the current generation, located approximately 100 m away from LST-1, that have been operating in stereoscopic mode since 2009. Since LST-1 and MAGIC can observe the same air shower events, we can compare the brightness of showers, estimated energies of gamma rays, and other parameters event by event, which can be used to cross-calibrate the telescopes. Ultimately, by performing combined analyses of the events triggering the three telescopes, we can reconstruct the shower geometry more accurately, leading to better energy and angular resolutions, and a better discrimination of the background showers initiated by cosmic rays. For that purpose, as part of the commissioning of LST-1, we performed joint observations of established gamma-ray sources with LST-1 and MAGIC. Also, we have developed Monte Carlo simulations for such joint observations and an analysis pipeline which finds event coincidence in the offline analysis based on their timestamps. In this work, we present the first detection of an astronomical source, the Crab Nebula, with combined observation of LST-1 and MAGIC. Moreover, we show results of the inter-telescope cross-calibration obtained using Crab Nebula data taken during joint observations with LST-1 and MAGIC.
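The offline coincidence search mentioned above pairs LST-1 and MAGIC events whose timestamps agree within a small window. A minimal two-pointer sketch of that idea (the tolerance value and function names are illustrative assumptions, not the actual pipeline):

```python
def match_events(ts_a, ts_b, tolerance=1e-4):
    """Match two sorted lists of event timestamps (seconds) from two
    telescopes, pairing each event at most once.

    A linear two-pointer scan: advance whichever list is behind until
    the timestamps agree within `tolerance`, then record the pair.
    """
    pairs, i, j = [], 0, 0
    while i < len(ts_a) and j < len(ts_b):
        dt = ts_a[i] - ts_b[j]
        if abs(dt) <= tolerance:
            pairs.append((i, j))   # coincident air-shower candidate
            i += 1
            j += 1
        elif dt < 0:
            i += 1                 # telescope A is behind, advance it
        else:
            j += 1                 # telescope B is behind, advance it
    return pairs
```

The scan is O(n + m), so it stays cheap even for high trigger rates; the real analysis must additionally correct for clock offsets and drifts between the two systems.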
Precise instrumental calibration is of crucial importance to 21-cm cosmology experiments. The Murchison Widefield Array (MWA) Phase II compact configuration offers opportunities for both redundant calibration and sky-based calibration algorithms; using the two in tandem is a potential approach to mitigating calibration errors caused by inaccurate sky models. The MWA Epoch of Reionization (EoR) experiment targets three patches of the sky (dubbed EoR0, EoR1, and EoR2) with deep observations. Previous work in [Li_2018] and [Wenyang_2019] studied the effect of tandem calibration on the EoR0 field and found that it yielded no significant improvement in the power spectrum over sky-based calibration alone. In this work, we apply similar techniques to the EoR1 field and find a distinct result: the improvements in the power spectrum from tandem calibration are significant. To understand this result, we analyze both the calibration solutions themselves and the effects on the power spectrum over three nights of EoR1 observations. We conclude that the presence of the bright radio galaxy Fornax A in EoR1 degrades the performance of sky-based calibration, which in turn enables redundant calibration to have a larger impact. These results suggest that redundant calibration can indeed mitigate some level of model-incompleteness error.
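Redundant calibration, as used above, exploits the fact that baselines of identical length and orientation measure the same true visibility, so per-antenna gains can be solved for without a sky model. A minimal amplitude-only ("logcal") sketch under simplifying assumptions (real gains, no noise, one constraint to fix the overall amplitude degeneracy; real pipelines handle complex gains and phase degeneracies too):

```python
import numpy as np

def redundant_logcal(vis, bl_ant, bl_type, n_ant, n_type):
    """Solve log|v_ij| = log g_i + log g_j + log V_t in least squares,
    where t is the redundant-baseline type of antenna pair (i, j).
    The overall amplitude degeneracy is fixed by pinning g_0 = 1.
    """
    n_bl = len(vis)
    A = np.zeros((n_bl + 1, n_ant + n_type))
    y = np.zeros(n_bl + 1)
    for k, ((i, j), t) in enumerate(zip(bl_ant, bl_type)):
        A[k, i] += 1.0               # log g_i
        A[k, j] += 1.0               # log g_j
        A[k, n_ant + t] = 1.0        # log of the shared true visibility
        y[k] = np.log(np.abs(vis[k]))
    A[n_bl, 0] = 1.0                 # constraint row: log g_0 = 0
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.exp(x[:n_ant]), np.exp(x[n_ant:])   # gains, model amps
```

With gains (1, 2, 0.5), a true amplitude of 3, and three redundant baselines, the solver recovers both exactly; the point of the abstract is that this sky-model-free solution can rescue fields where a bright, poorly modelled source like Fornax A corrupts sky-based calibration.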
Response calibration is the process of inferring how much the measured data depend on the signal one is interested in. It is essential for any quantitative signal estimation on the basis of the data. Here, we investigate self-calibration methods for linear signal measurements and linear dependence of the response on the calibration parameters. The common practice is to augment an external calibration solution using a known reference signal with an internal calibration on the unknown measurement signal itself. Contemporary self-calibration schemes try to find a self-consistent solution for signal and calibration by exploiting redundancies in the measurements. This can be understood in terms of maximizing the joint probability of signal and calibration. However, the full uncertainty structure of this joint probability around its maximum is not taken into account by these schemes. Therefore, better schemes -- in the sense of minimal squared error -- can be designed by accounting for asymmetries in the uncertainty of signal and calibration. We argue that at least a systematic correction of the common self-calibration scheme should be applied in many measurement situations in order to properly treat uncertainties of the signal on which one calibrates. Otherwise, the calibration solutions suffer from a systematic bias, which consequently distorts the signal reconstruction. Furthermore, we argue that non-parametric, signal-to-noise-filtered calibration should provide more accurate reconstructions than the common bin averages, and we provide a new, improved self-calibration scheme. We illustrate our findings with a simplistic numerical example.
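The setup described above can be written compactly; the notation here is an assumed formalization consistent with the abstract, not quoted from the paper. Data $d$ depend linearly on the signal $s$ through a response $R(\gamma)$ that is itself linear in the calibration parameters $\gamma$, with additive noise $n$:

```latex
d = R(\gamma)\, s + n, \qquad
R(\gamma) = R_0 + \sum_a \gamma_a\, R_a .
```

Classical self-calibration then searches for the joint maximum

```latex
(\hat{s}, \hat{\gamma}) = \arg\max_{s,\,\gamma}\; \mathcal{P}(s, \gamma \mid d),
```

and the abstract's point is that the asymmetric uncertainty structure of $\mathcal{P}$ around this maximum is ignored, which biases $\hat{\gamma}$ and hence the reconstruction $\hat{s}$.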
Redundant information in low-bit-rate speech is extremely small, so it is very difficult to implement large-capacity steganography in low-bit-rate speech. Based on the multiple vector quantization characteristics of the Line Spectrum Pair (LSP) parameters of the speech codec, this paper proposes a steganography scheme using a 3D-Magic matrix to enlarge capacity and improve speech quality. A cyclically moving algorithm for constructing the 3D-Magic matrix is proposed, together with embedding and extracting algorithms based on the 3D-Magic matrix in a low-bit-rate speech codec. Theoretical analysis demonstrates that both concealment and hidden capacity are greatly improved by the proposed scheme. Experimental results show the hidden capacity is raised to 200 bps in the ITU-T G.723.1 codec. Moreover, the quality of the stego speech, measured by Perceptual Evaluation of Speech Quality (PESQ), decreases by no more than 4%, indicating little impact on speech quality. In addition, the proposed hiding scheme can effectively resist detection by several steganalysis tools.
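The core idea of matrix-based embedding is that a secret digit is hidden by nudging one quantization index so that a fixed lookup over the index triple yields that digit. The toy construction below is a simplified illustration of that principle only, not the paper's cyclically-moving 3D-Magic matrix; the lookup formula and all names are assumptions:

```python
def magic(i, j, k, n):
    """Toy 3-D lookup table M(i, j, k): every digit 0..n-1 is reachable
    from any cell by changing a single coordinate (here, i)."""
    return (i + 2 * j + 3 * k) % n

def embed_digit(i, j, k, digit, n):
    """Adjust the first index (e.g. one LSP codebook index) so the
    matrix value at the new cell encodes `digit`."""
    delta = (digit - magic(i, j, k, n)) % n
    return (i + delta) % n, j, k

def extract_digit(i, j, k, n):
    """The receiver reads the digit straight from the lookup."""
    return magic(i, j, k, n)
```

In the real scheme, keeping the index change small is what preserves PESQ quality; a construction where every digit is reachable with a minimal coordinate move is exactly what a "magic" matrix provides.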
