
Computation and validation of two-dimensional PSF simulation based on physical optics

Published by: Daniele Spiga
Publication date: 2016
Research field: Physics
Language: English





The Point Spread Function (PSF) is a key figure of merit for specifying the angular resolution of optical systems and, as the demand for ever higher angular resolution increases, the problem of surface finishing must be taken seriously even in optical telescopes. From the optical design of the instrument, reliable ray-tracing routines allow the PSF to be computed and displayed based on geometrical optics. However, such an approach does not directly account for the scattering caused by surface microroughness, which is interferential in nature. Although the scattering effect can be modeled separately, its inclusion in the ray-tracing routine requires assumptions that are difficult to verify. In this context, a purely physical-optics approach is more appropriate, as it remains valid regardless of the shape and size of the defects on the optical surface. Such a computation, when performed in two dimensions, is memory- and time-consuming because it requires processing a surface map at a resolution of a few microns, and the situation becomes even more complicated for optical systems with more than one reflection. Fortunately, the computation is significantly simplified in the far-field configuration, since it reduces to a sequence of Fourier transforms. In this paper, we validate the physical-optics PSF simulation against real PSF measurements of the ASTRI-SST M1 hexagonal segments. These results represent a first foundation stone for more advanced computations that account for microroughness and multiple reflections in optical systems.
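
As a rough illustration of the far-field simplification described in the abstract, the sketch below computes a PSF as the squared modulus of the Fourier transform of a complex pupil function built from a surface error map. It is a minimal stand-in, not the authors' code: the wavelength, aperture size, sampling, and the toy deformation-plus-roughness surface model are all illustrative assumptions.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
wavelength = 500e-9        # observation wavelength [m]
n = 1024                   # grid points per side
pitch = 1e-4               # surface map sampling [m/pixel]

# Circular aperture mask as a stand-in for a mirror segment
y, x = np.indices((n, n)) - n / 2
aperture = (np.hypot(x, y) * pitch < 0.04).astype(float)

# Toy surface error map: smooth low-order deformation plus microroughness
rng = np.random.default_rng(0)
deformation = 20e-9 * np.cos(2 * np.pi * x / n)       # figure error [m]
roughness = 2e-9 * rng.standard_normal((n, n))        # rms roughness [m]
surface = deformation + roughness

# Complex pupil function: reflection doubles the path-length error
pupil = aperture * np.exp(1j * (4 * np.pi / wavelength) * surface)

# Far-field (Fraunhofer) PSF: squared modulus of the Fourier transform
field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
psf = np.abs(field) ** 2
psf /= psf.sum()           # normalize to unit total energy
```

Because the far-field PSF is obtained from FFTs alone, extending the scheme to a second reflection amounts to propagating the field and applying another transform, which is the "sequence of Fourier transforms" the abstract refers to.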


Read also

Forthcoming space-based observations will require high-quality point-spread function (PSF) models for weak gravitational lensing measurements. One approach to generating these models is to use a wavefront model based on the known telescope optics. We present an empirical framework for validating such models, confirming that they match the actual PSF to within requirements by comparing them to the observed light distributions of isolated stars. We apply this framework to Tiny Tim, the standard tool for generating model PSFs for the Hubble Space Telescope (HST), testing its models against images taken by HST's Advanced Camera for Surveys in the Wide Field Channel. We show that Tiny Tim's models, in the default configuration, differ significantly from the observed PSFs, most notably in their sizes. We find that the quality of Tiny Tim PSFs can be improved by fitting the full set of Zernike polynomial coefficients that characterise the optics, to the point where the practical significance of the difference between model and observed PSFs is negligible for most use cases, resulting in additive and multiplicative biases both of order approximately 4e-4. We also show that most of this improvement can be retained by using an updated set of Zernike coefficients, which we provide.
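As a toy illustration of the wavefront-based PSF modelling via Zernike coefficients that the paragraph above describes, the following sketch builds a model PSF from a few low-order Zernike terms and defines a residual against an observed star image that a generic optimizer could minimize. The basis selection, normalization, and grid size are illustrative assumptions, not Tiny Tim's actual parameterization.

```python
import numpy as np

def zernike_basis(n):
    """A few low-order Zernike-like terms (defocus, astigmatism, coma)
    on a unit-disk pupil grid; indexing and normalization are illustrative."""
    y, x = (np.indices((n, n)) - n / 2) / (n / 2)
    r, t = np.hypot(x, y), np.arctan2(y, x)
    mask = (r <= 1.0).astype(float)
    return mask, [mask * (2 * r**2 - 1),                  # defocus
                  mask * r**2 * np.cos(2 * t),            # astigmatism
                  mask * (3 * r**3 - 2 * r) * np.cos(t)]  # coma

def model_psf(coeffs, n=64):
    """PSF from a pupil wavefront expressed as a Zernike sum (in waves)."""
    mask, basis = zernike_basis(n)
    wavefront = sum(c * z for c, z in zip(coeffs, basis))
    pupil = mask * np.exp(2j * np.pi * wavefront)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(pupil))) ** 2
    return psf / psf.sum()

def misfit(coeffs, observed):
    """Sum of squared residuals between model and observed star image."""
    return np.sum((model_psf(coeffs, observed.shape[0]) - observed) ** 2)
```

A full fit, as in the paper, would optimize the complete coefficient set (for example with a general-purpose minimizer such as scipy.optimize.minimize) and would also have to handle sub-pixel centering and detector effects.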
We present results of the point spread function (PSF) calibration of the hard X-ray optics of the Nuclear Spectroscopic Telescope Array (NuSTAR). Immediately post-launch, NuSTAR observed bright point sources such as Cyg X-1, Vela X-1, and Her X-1 for PSF calibration. We use the point-source observations taken at several off-axis angles together with a ray-trace model to characterize the in-orbit angular response, and find that the ray-trace model alone does not fit the observed event distributions; applying empirical corrections to the ray-trace model improves the fit significantly. We describe the corrections applied to the ray-trace model and show that the uncertainties in the enclosed energy fraction (EEF) of the new PSF model are < 3% for extraction apertures of R > 60 arcsec, with no significant energy dependence. We also show that the PSF of the NuSTAR optics has been stable over a period of ~300 days of in-orbit operation.
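The enclosed-energy-fraction figure quoted above is straightforward to compute from a PSF image; here is a minimal sketch assuming a circular aperture centered on the image (the pixel scale in the usage comment is a placeholder, not a calibrated NuSTAR value).

```python
import numpy as np

def enclosed_energy_fraction(psf, radius_pix):
    """Fraction of total PSF energy inside a circular aperture of the
    given radius in pixels, centered on the image center."""
    cy, cx = (np.asarray(psf.shape, dtype=float) - 1) / 2
    y, x = np.indices(psf.shape)
    r = np.hypot(x - cx, y - cy)
    return psf[r <= radius_pix].sum() / psf.sum()

# Usage (hypothetical): EEF within R = 60 arcsec for an assumed
# pixel scale of 2.46 arcsec/pixel
# eef = enclosed_energy_fraction(psf_image, 60.0 / 2.46)
```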
S. Hauf, M.G. Pia (2009)
The anticipated high sensitivity and the science goals of the next generation of X-ray space missions, like the International X-ray Observatory or Simbol-X, rely on a low instrumental background, which in turn requires optimized shielding concepts. We present Geant4-based simulation results on the cosmic-ray proton-induced background of the IXO Wide Field Imager, in comparison with previous results obtained for the Simbol-X LED and HED focal-plane detectors. Our results show that an improvement in mean differential background flux compared with currently operating X-ray observatories may be feasible with detectors based on DEPFET technology. In addition, we present preliminary results concerning the validation of Geant4-based radioactive decay simulation in space applications as part of the Nano5 project.
Here, we explore the combination of sub-wavelength, two-dimensional atomic arrays and Rydberg interactions as a powerful platform to realize strong, coherent interactions between individual photons with high fidelity. In particular, the spatial ordering of the atoms guarantees efficient atom-light interactions without the possibility of scattering light into unwanted directions, for example allowing the array to act as a perfect mirror for individual photons. In turn, Rydberg interactions enable single photons to alter the optical response of the array within a potentially large blockade radius $R_b$, which can effectively punch a large hole for subsequent photons. We show that such a system enables a coherent photon-photon gate or switch, with an error scaling $\sim R_b^{-4}$ that is significantly better than the best known scaling in a disordered ensemble. We also investigate the optical properties of the system in the limit of strong input intensities. Although this a priori represents a complicated, many-body quantum driven-dissipative system, we find that the behavior can be captured well by a semi-classical model based on holes punched in a classical mirror.
Radioactive decays are of concern in a wide variety of applications using Monte Carlo simulations. In order to properly estimate the quality of such simulations, knowledge of the accuracy of the decay simulation is required. We present a validation of the original Geant4 Radioactive Decay Module, which uses a per-decay sampling approach, and of an extended package for Geant4-based simulation of radioactive decays, which, in addition to a refactored per-decay sampling, is capable of using a statistical sampling approach. The validation is based on measurements of calibration isotope sources using a high-purity germanium (HPGe) detector; no calibration of the simulation is performed. For the considered validation experiment, equivalent simulation accuracy can be achieved with per-decay and statistical sampling.