(Abbreviated) We show that the XRT spectral response calibration was complicated by various energy offsets in photon counting (PC) and windowed timing (WT) modes related to the way the CCD is operated in orbit (variation in temperature during observations, contamination by optical light from the sunlit Earth and increase in charge transfer inefficiency). We describe how these effects can be corrected for in the ground processing software. We show that the low-energy response, the redistribution in spectra of absorbed sources, and the modelling of the line profile have been significantly improved since launch by introducing empirical corrections in our code when it was not possible to use a physical description. We note that the increase in CTI became noticeable in June 2006 (i.e. 14 months after launch), but the evidence of a more serious degradation in spectroscopic performance (line broadening and change in the low-energy response) due to large charge traps (i.e. faults in the Si crystal) became more significant after March 2007. We describe efforts to handle such changes in the spectral response. Finally, we show that the commanded increase in the substrate voltage from 0 to 6 V on 2007 August 30 reduced the dark current, enabling the collection of useful science data at higher CCD temperatures (up to -50 C). We also briefly describe the plan to recalibrate the XRT response files at this new voltage.
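The CTI-related energy losses described in this abstract grow with the number of charge transfers an event undergoes before readout. A minimal sketch of such a correction, assuming a simple per-transfer fractional charge-loss model (the function name, parameters, and values are illustrative, not the actual XRT ground-processing code):

```python
def correct_energy(pha_ev, row, col, cti_parallel, cti_serial, gain_offset_ev=0.0):
    """Undo charge lost to CTI during readout (illustrative model only).

    Each of the `row` parallel and `col` serial transfers loses a fraction
    (cti_parallel / cti_serial) of the packet's charge, so the measured
    energy is attenuated by (1 - cti)^n_transfers. An additive gain offset
    stands in for temperature- or mode-dependent zero-point shifts.
    """
    attenuation = (1.0 - cti_parallel) ** row * (1.0 - cti_serial) ** col
    return pha_ev / attenuation + gain_offset_ev
```

Events detected far from the readout node (large `row`) receive a larger correction, which is why CTI growth degrades both the gain and, through trap-capture statistics, the energy resolution.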
The Swift X-ray Telescope (XRT) focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 144 eV FWHM at 6.5 keV. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Loss of temperature control motivated a laboratory program to re-optimize the CCD substrate voltage; we describe the small changes in the CCD response that would result from use of a substrate voltage of 6 V.
The Swift X-ray Telescope focal plane camera is a front-illuminated MOS CCD, providing a spectral response kernel of 135 eV FWHM at 5.9 keV as measured before launch. We describe the CCD calibration program based on celestial and on-board calibration sources, relevant in-flight experiences, and developments in the CCD response model. We illustrate how the revised response model describes the calibration sources well. Comparison of observed spectra with models folded through the instrument response produces negative residuals around and below the oxygen edge. We discuss several possible causes for such residuals. Traps created by proton damage on the CCD increase the charge transfer inefficiency (CTI) over time. We describe the evolution of the CTI since launch and its effect on the CCD spectral resolution and gain.
We present the first results of the in-flight validation of the spectral response of the FREGATE X/gamma detectors on-board the HETE-2 satellite. This validation uses the Crab pulsar and nebula as reference spectra.
The X-ray telescope on board the Swift satellite for gamma-ray burst astronomy has been exposed to the radiation of the space environment since launch in November 2004. Radiation causes damage to the detector, with the generation of dark current and charge trapping sites that result in the degradation of the spectral resolution and an increase in the instrumental background. The Swift team has a dedicated calibration program with the goal of recovering a significant proportion of the lost spectroscopic performance. Calibration observations of supernova remnants with strong emission lines are analysed to map the detector charge traps and to derive position-dependent corrections to the measured photon energies. We have achieved a substantial recovery in the XRT resolution by implementing these corrections in an updated version of the Swift XRT gain file and in corresponding improvements to the Swift XRT HEAsoft software. We provide illustrations of the impact of the enhanced energy resolution, and show that we have recovered most of the spectral resolution lost since launch.
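The position-dependent corrections described in this abstract amount to a lookup of measured trap losses, keyed by detector position, applied to each event's energy. A minimal sketch under the simplifying assumption that a trap's charge loss scales linearly with photon energy (the column numbers, offsets, and reference-line energy below are hypothetical, not real XRT gain-file entries):

```python
# Hypothetical trap map: CCD column -> charge loss in eV measured at a
# reference emission-line energy (e.g. a strong supernova-remnant line).
trap_offsets_ev = {112: 45.0, 484: 30.0}

def apply_trap_correction(energy_ev, col, ref_energy_ev=1856.0):
    """Add back the charge lost to traps in this column (illustrative model).

    Assumes the trap loss measured at `ref_energy_ev` scales linearly with
    photon energy; columns absent from the map receive no correction.
    """
    loss = trap_offsets_ev.get(col, 0.0) * (energy_ev / ref_energy_ev)
    return energy_ev + loss
```

In practice such a map would be derived by fitting line centroids as a function of detector position and stored in the gain calibration file, so that the event-processing software can apply it automatically.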
We discuss the flight calibration of the spectral response of the Advanced CCD Imaging Spectrometer (ACIS) on-board the Chandra X-ray Observatory (CXO). The spectral resolution and sensitivity of the ACIS instrument have both been evolving over the course of the mission. The spectral resolution of the frontside-illuminated (FI) CCDs changed dramatically in the first month of the mission due to radiation damage. Since that time, the spectral resolution of the FI CCDs and the backside-illuminated (BI) CCDs has evolved gradually with time. We demonstrate the efficacy of charge-transfer inefficiency (CTI) correction algorithms which recover some of the lost performance. The detection efficiency of the ACIS instrument has been declining throughout the mission, presumably due to a layer of contamination building up on the filter and/or CCDs. We present a characterization of the energy dependence of the excess absorption and demonstrate software which models the time dependence of the absorption at energies of 0.4 keV and above. The spectral redistribution function and the detection efficiency are well-characterized at energies from 1.5 to 8.0 keV. The calibration at energies below 1.5 keV is challenging because of the lack of strong lines in the calibration source and also because of the inherently non-linear energy dependence of the CTI and of the absorption by the contamination layer. We have been using data from celestial sources with relatively simple spectra to determine the quality of the calibration below 1.5 keV. The analysis of these observations demonstrates that the CTI correction recovers a significant fraction of the spectral resolution of the FI CCDs and that the models of the time-dependent absorption result in consistent measurements of the flux at low energies for data from a BI (S3) CCD.
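The time-dependent contamination absorption described in this abstract can be captured by a transmission factor exp(-tau(E, t)) folded into the effective area, with the optical depth largest at low energies and growing as the contaminant accumulates. A minimal sketch, assuming a power-law energy dependence and a saturating depth growth (all parameter values and the functional forms are illustrative assumptions, not the ACIS calibration model):

```python
import math

def contamination_transmission(energy_kev, days_since_launch,
                               tau0=0.1, index=-3.0, growth_days=1000.0):
    """Transmission of a growing contamination layer (illustrative model).

    tau0 * E^index gives a steep rise in absorption toward low energies;
    (1 - exp(-t / growth_days)) models a contaminant depth that builds up
    and gradually saturates over the mission.
    """
    depth = 1.0 - math.exp(-days_since_launch / growth_days)
    tau = tau0 * (energy_kev ** index) * depth
    return math.exp(-tau)
```

Multiplying the pre-launch effective area by such a factor, refit periodically against celestial sources with simple spectra, is one way to keep low-energy flux measurements consistent as the layer thickens.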