The Gamma-ray Large Area Space Telescope (GLAST) is an observatory designed to perform gamma-ray astronomy in the energy range 20 MeV to 300 GeV, with supporting measurements for gamma-ray bursts from 10 keV to 25 MeV. GLAST will be launched at the end of 2007, opening a new and important window on a wide variety of high-energy astrophysical phenomena. The main instrument of GLAST is the Large Area Telescope (LAT), which provides breakthrough high-energy measurements using techniques typically employed in particle detectors at collider experiments. The LAT consists of 16 identical towers in a four-by-four grid, each containing a pair-conversion tracker and a hodoscopic crystal calorimeter, all covered by a segmented plastic-scintillator anti-coincidence shield. The scientific return of the instrument depends strongly on how accurately we know its performance, and how well we can monitor it and correct potential problems promptly. We report on a novel technique that we are developing to help characterize and monitor the LAT by using the power of classification trees to quickly pinpoint potential problems in the recorded data. The same technique could also be used to evaluate the effect of potential instrumental problems on the overall LAT performance.
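As an illustration of the classification-tree idea, the minimal Python sketch below (using scikit-learn; all feature names, distributions, and thresholds are hypothetical, not actual LAT quantities) trains a shallow tree to separate a nominal event sample from a suspect one, then prints the learned splits, which point directly at the variable driving the difference:

```python
# Sketch: use a shallow decision tree to localize which per-event variable
# differs between a nominal reference sample and a suspect data sample.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Toy per-event features (hypothetical): tracker hit count, calorimeter
# energy [MeV], ACD tile signal. The "suspect" sample has a shifted
# calorimeter energy scale, mimicking an instrumental problem.
nominal = np.column_stack([rng.poisson(20, 5000),
                           rng.lognormal(4.0, 1.0, 5000),
                           rng.normal(1.0, 0.1, 5000)])
suspect = np.column_stack([rng.poisson(20, 5000),
                           rng.lognormal(4.3, 1.0, 5000),   # shifted scale
                           rng.normal(1.0, 0.1, 5000)])

X = np.vstack([nominal, suspect])
y = np.r_[np.zeros(5000), np.ones(5000)]   # 0 = nominal, 1 = suspect

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
# The printed split variables and cut values flag the offending quantity.
print(export_text(tree, feature_names=["tkr_hits", "cal_energy", "acd_signal"]))
```

In practice the diagnostic value lies in reading the tree itself: the variables chosen for the top splits identify, in a short time, where the recorded data deviate from the reference.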
The INTEGRAL satellite was launched on October 17, 2002. All on-board instruments are operating successfully. In this paper, we focus on radiation effects on the Cadmium Telluride camera ISGRI. The spectral response of the camera is affected by cosmic particles depositing huge amounts of energy, greater than the upper threshold of the electronics. Our study highlights the contribution of cosmic-ray protons. Solutions are proposed to limit the degradation of the spectral response of large-pixel gamma cameras operating in space.
Data on board the future PLANCK Low Frequency Instrument (LFI), designed to measure the Cosmic Microwave Background (CMB) anisotropies, consist of $N$ differential temperature measurements spanning a range of values we shall call $R$. Preliminary studies and telemetry allocation indicate the need to compress these data by a ratio of $c_r \gtrsim 10$. Here we present a study of entropy for (correlated multi-Gaussian discrete) noise, showing how the optimal compression $c_{r,opt}$ for a linearly discretized data set with $N_{bits} = \log_2 N_{max}$ bits is given by $c_{r,opt} \simeq N_{bits} / \log_2(\sqrt{2\pi e}\,\sigma_e/\Delta)$, where $\sigma_e \equiv (\det C)^{1/2N}$ is an effective noise rms given by the covariance matrix $C$, and $\Delta \equiv R/N_{max}$ is the digital resolution. This $\Delta$ only needs to be as small as the instrumental white-noise rms: $\Delta \simeq \sigma_T \simeq 2$ mK (the nominal $\mu$K pixel sensitivity will only be achieved after averaging). Within the currently proposed $N_{bits}=16$ representation, a linear analogue-to-digital converter (ADC) will allow the digital storage of a large dynamic range of differential temperature, $R = N_{max}\Delta$, accounting for possible instrument drifts and instabilities (which could be reduced by proper on-board calibration). A well-calibrated signal will be dominated by thermal (white) noise in the instrument, $\sigma_e \simeq \sigma_T$, which could yield large compression rates, $c_{r,opt} \simeq 8$. This is the maximum lossless compression possible. In practice, point sources and $1/f$ noise will produce $\sigma_e > \sigma_T$ and $c_{r,opt} < 8$. This strategy seems safer than non-linear ADCs or data-reduction schemes (which could also be used at some stage).
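A quick numerical check of the optimal-compression formula quoted above, as a minimal sketch (only $N_{bits}=16$ is from the text; the $\sigma_e/\Delta$ values are illustrative):

```python
# c_r,opt = N_bits / log2(sqrt(2*pi*e) * sigma_e / Delta)
# for linearly quantized Gaussian noise with N_bits per sample.
import math

def c_r_opt(n_bits, sigma_e, delta):
    """Lossless compression bound for linearly discretized Gaussian noise."""
    return n_bits / math.log2(math.sqrt(2 * math.pi * math.e) * sigma_e / delta)

# White-noise-dominated case: sigma_e ~ sigma_T ~ Delta (~2 mK resolution).
print(c_r_opt(16, sigma_e=1.0, delta=1.0))   # ~7.8, i.e. c_r,opt ~ 8

# 1/f noise or point sources inflate the effective rms, lowering the bound.
print(c_r_opt(16, sigma_e=4.0, delta=1.0))   # ~4.0
```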
The X and Gamma Imaging Spectrometer (XGIS) instrument on board the THESEUS mission (selected by ESA in the framework of the Cosmic Vision M5 launch opportunity, currently in phase A) is based on a detection plane composed of several thousand individual active elements. Each element comprises a 4.5x4.5x30 mm³ CsI(Tl) scintillator bar, optically coupled at both ends to Silicon Drift Detectors (SDDs). The SDDs act both as photodetectors for the scintillation light and as direct X-ray sensors. In this paper the design of the XGIS detection plane is reviewed, outlining the strategic choices in terms of modularity and redundancy of the system. Results on detector-electronics prototypes are also described. Moreover, the design and development of the low-noise front-end electronics is presented, emphasizing the innovative architectural design based on custom-designed Application-Specific Integrated Circuits (ASICs).
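As a hedged illustration of the dual role of the SDDs (a sketch of the general principle, not the actual XGIS trigger logic), the snippet below classifies a readout pair by which end(s) register a signal: direct X-ray absorption in an SDD fires a single end, while scintillation light from an interaction in the CsI(Tl) bar is collected at both ends in coincidence.

```python
# Illustrative discrimination between direct X-ray absorption in an SDD
# (signal at one end only) and scintillation events in the CsI(Tl) bar
# (light shared between the two SDDs). Threshold and units are hypothetical.
def classify_event(signal_top, signal_bottom, threshold=5.0):
    """Classify a two-ended readout by which end(s) exceed threshold [a.u.]."""
    top = signal_top > threshold
    bottom = signal_bottom > threshold
    if top and bottom:
        return "scintillation (gamma-ray, both ends)"
    if top or bottom:
        return "direct detection (X-ray, one end)"
    return "below threshold"

print(classify_event(42.0, 38.0))   # -> scintillation
print(classify_event(12.0, 0.3))    # -> direct detection
```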
The GLAST Large Area Telescope (LAT) is the next-generation satellite experiment for high-energy gamma-ray astronomy. It is a pair-conversion telescope built with a plastic anticoincidence shield, a segmented CsI electromagnetic calorimeter, and the largest silicon strip tracker ever built. It will cover the energy range from 30 MeV to 300 GeV, shedding light on many issues left open by its predecessor EGRET. One of the most exciting science topics is the detection and observation of gamma-ray bursts (GRBs). In this paper we present the work done so far by the GRB LAT science group in studying the performance of the LAT detector in observing GRBs. We report on the simulation framework developed by the group, as well as on the science tools dedicated to GRB data analysis. We present the LAT sensitivity to GRBs obtained with such simulations and, finally, the general scheme of GRB detection that will be adopted on orbit.
The scientific objectives of the LOFT mission, e.g., the study of the neutron star equation of state and of strong gravity, require accurate energy, time, and flux calibration for the 500k channels of the SDD detectors, as well as knowledge of the detector dead time and of the detector response as a function of the incident angle of the photons. We report here the evaluations made to assess the calibration issues for the Large Area Detector (LAD) instrument. The strategies for both ground and on-board calibrations, including astrophysical observations, show that the goals are achievable with current technologies.
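As one concrete example of the dead-time knowledge referred to above, the sketch below applies the standard non-paralyzable dead-time correction (a generic textbook formula; the value of tau is illustrative, not the actual LAD dead time):

```python
# Non-paralyzable dead-time correction: recover the incident count rate
# from the measured one, given a per-event dead time tau [s].
def true_rate(measured_rate, tau):
    """Incident rate [counts/s] for a non-paralyzable detector."""
    return measured_rate / (1.0 - measured_rate * tau)

# Example: 50 kcounts/s measured with an (illustrative) 10-us dead time.
print(true_rate(5.0e4, 1.0e-5))  # ~1.0e5 counts/s incident
```

Accurate flux calibration at the high count rates of bright X-ray binaries depends directly on how well tau is known, which is why dead-time characterization appears among the calibration requirements.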