
Validation of PSF Models for HST and Other Space-Based Observations

Posted by Bryan Gillis
Publication date: 2020
Research field: Physics
Paper language: English





Forthcoming space-based observations will require high-quality point-spread function (PSF) models for weak gravitational lensing measurements. One approach to generating these models is using a wavefront model based on the known telescope optics. We present an empirical framework for validating such models, confirming that they match the actual PSF to within requirements by comparing the models to the observed light distributions of isolated stars. We apply this framework to Tiny Tim, the standard tool for generating model PSFs for the Hubble Space Telescope (HST), testing its models against images taken by HST's Advanced Camera for Surveys in the Wide Field Channel. We show that Tiny Tim's models, in the default configuration, differ significantly from the observed PSFs, most notably in their sizes. We find that the quality of Tiny Tim PSFs can be improved by fitting the full set of Zernike polynomial coefficients which characterise the optics, to the point where the practical significance of the difference between model and observed PSFs is negligible for most use cases, resulting in additive and multiplicative biases both of order approximately 4e-4. We also show that most of this improvement can be retained by using an updated set of Zernike coefficients, which we provide.
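As a rough illustration of the validation idea (a sketch, not the authors' actual pipeline), a model PSF can be compared to an observed star through weighted quadrupole moments, whose trace gives the size statistic that drives the multiplicative shear bias. The stamp handling and the Gaussian weight scale below are illustrative assumptions.

```python
# Minimal sketch, assuming centred, background-subtracted, square stamps on
# a common pixel grid; the weight scale sigma_w is an arbitrary choice.
import numpy as np

def quadrupole_size(stamp, sigma_w=2.5):
    """Weighted size R^2 = Qxx + Qyy of a PSF stamp, in pixels^2."""
    n = stamp.shape[0]
    y, x = np.mgrid[:n, :n] - (n - 1) / 2.0
    w = np.exp(-(x**2 + y**2) / (2 * sigma_w**2))  # suppress noisy wings
    flux = np.sum(w * stamp)
    return (np.sum(w * stamp * x**2) + np.sum(w * stamp * y**2)) / flux

def fractional_size_error(model_stamp, star_stamp):
    """(R^2_model - R^2_star) / R^2_star, the per-star size residual."""
    r2_model = quadrupole_size(model_stamp)
    r2_star = quadrupole_size(star_stamp)
    return (r2_model - r2_star) / r2_star
```

Averaging this residual over many isolated stars is one way to quantify the size discrepancy the abstract describes.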


Read also

K. Tayabaly, D. Spiga, G. Sironi (2016)
The Point Spread Function (PSF) is a key figure of merit for specifying the angular resolution of optical systems, and as the demand for ever higher angular resolution increases, the problem of surface finishing must be taken seriously even in optical telescopes. From the optical design of the instrument, reliable ray-tracing routines allow the PSF to be computed and displayed based on geometrical optics. However, such an approach does not directly account for the scattering caused by surface microroughness, which is interferential in nature. Although the scattering effect can be modelled separately, its inclusion in the ray-tracing routine requires assumptions that are difficult to verify. In that context, a purely physical-optics approach is more appropriate, as it remains valid regardless of the shape and size of the defects appearing on the optical surface. Such a computation, when performed in two dimensions, is memory- and time-consuming because it requires processing a surface map at a few-micron resolution, and the situation becomes even more complicated for optical systems with more than one reflection. Fortunately, the computation is significantly simplified in the far-field configuration, since it involves only a sequence of Fourier transforms. In this paper, we validate the physical-optics PSF simulation against real PSF measurement data for the ASTRI-SST M1 hexagonal segments. These results represent a first foundation stone for future development of a more advanced computation taking into account microroughness and multiple reflections in optical systems.
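To make the far-field simplification concrete, here is a minimal sketch of the single-FFT physical-optics computation the abstract refers to: in the Fraunhofer regime the PSF is the squared modulus of the Fourier transform of the complex pupil function. The aperture mask and surface-error map are assumed inputs, and the padding factor is an illustrative sampling choice.

```python
# Minimal far-field sketch: one FFT of the complex pupil. `aperture` is a
# 0/1 pupil mask and `wfe` a wavefront-error map in metres on the same grid.
import numpy as np

def far_field_psf(aperture, wfe, wavelength, pad=4):
    pupil = aperture * np.exp(2j * np.pi * wfe / wavelength)
    n = aperture.shape[0]
    field = np.zeros((pad * n, pad * n), dtype=complex)
    field[:n, :n] = pupil                 # zero-pad to oversample the PSF
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
    return psf / psf.sum()                # normalise to unit total flux
```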
Telescope and detector developments continuously enable deeper and more detailed studies of astronomical objects. Larger collecting areas, improvement in dispersion and detector techniques, and higher sensitivities allow detection of more molecules i n a single observation, at lower abundances, resulting in better constraints of the targets physical and chemical conditions. Improvements on current telescopes, and not to mention future observatories, both in space and on the ground, will continue this trend, ever improving our understanding of the Universe. Planetary exploration missions carry instrumentation to unexplored areas, and reveal details impossible to observe from the Earth by performing in-situ measurements. Space based observatories allow observations of object at wavelength ranges absorbed by the Earths atmosphere. The depth of understanding from all of these studies can be greatly enhanced by combining observations: ground-based and space-based, low-resolution and high-resolution, local and global-scale, similar observations over a broader or different spectra range, or by providing temporal information through follow-ups. Combined observations provide context and a broader scope of the studied object, and in this white paper, we outline a number of studies where observations are synergistically applied to increase the scientific value of both datasets. Examples include atmospheric studies of Venus, Mars, Titan, comets, Jupiter, as well as more specific cases describing synergistic studies in the Juno mission, and ground-based radar studies for near Earth objects. The examples aim to serve as inspiration for future synergistic observations, and recommendations are made based on the lessons learned from these examples.
We present the implementation and use of algorithms for matching point-spread functions (PSFs) within the Pan-STARRS Image Processing Pipeline (IPP). PSF-matching is an essential part of the IPP for the detection of supernovae and asteroids, but it is also used to homogenize the PSF of inputs to stacks, resulting in improved photometric precision compared to regular coaddition, especially in data with a high masked fraction. We report our experience in constructing and operating the image subtraction pipeline, and make recommendations about particular basis functions for constructing the PSF-matching convolution kernel, determining a suitable kernel, parallelisation, and quality metrics. We introduce a method for reliably tracking the noise in an image throughout the pipeline, using the combination of a variance map and a 'covariance pseudo-matrix'. We demonstrate these algorithms with examples from both simulations and actual data from the Pan-STARRS1 telescope.
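As a toy illustration of the kernel-fitting step (a simplified stand-in for the IPP's basis-function approach, not its actual implementation), the PSF-matching kernel can be written as a linear combination of Gaussian basis kernels and solved for by least squares:

```python
# Sketch: fit coefficients c_i so that (template * K) ~ science, with
# K = sum_i c_i G_i. The Gaussian widths here are arbitrary choices.
import numpy as np
from scipy.ndimage import gaussian_filter

def match_psf(template, science, sigmas=(1.0, 2.0, 4.0)):
    # Convolution is linear: template * (sum c_i G_i) = sum c_i (template * G_i),
    # so each basis image is the template convolved with one Gaussian.
    basis = [gaussian_filter(template, s) for s in sigmas]
    A = np.stack([b.ravel() for b in basis], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, science.ravel(), rcond=None)
    matched = sum(c * b for c, b in zip(coeffs, basis))
    return coeffs, matched
```

Roughly speaking, the noise would be tracked alongside by convolving the variance map with the square of the matching kernel, with the covariance pseudo-matrix recording the pixel correlations that the convolution introduces.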
We investigate the impact of point spread function (PSF) fitting errors on cosmic shear measurements using the concepts of complexity and sparsity. Complexity, introduced in a previous paper, characterizes the number of degrees of freedom of the PSF. For instance, fitting an underlying PSF with a model of low complexity will lead to small statistical errors on the model parameters, but these parameters could suffer from large biases. Alternatively, fitting with a large number of parameters will tend to reduce biases at the expense of statistical errors. We optimise scatter and bias by studying the mean squared error of a PSF model. We also characterize the model's sparsity, which describes how efficiently the model is able to represent the underlying PSF using a limited number of free parameters. We present the general case and illustrate it for a realistic example of a PSF fitted with shapelet basis sets. We derive the relation between the complexity and sparsity of the PSF model, the signal-to-noise ratio of stars, and systematic errors on cosmological parameters. With the constraint of maintaining the systematics below the statistical uncertainties, this leads to a relation between the number of stars required to calibrate the PSF and the sparsity. We discuss the impact of our results for current and future cosmic shear surveys. In the typical case where the biases can be represented as a power law of the complexity, we show that current weak lensing surveys can calibrate the PSF with a few stars, while future surveys will require hard constraints on the sparsity in order to calibrate the PSF with 50 stars.
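The bias-variance optimisation described above can be illustrated with a toy experiment (an illustrative stand-in, not the paper's shapelet analysis): fit a noisy 1-D profile with models of increasing complexity and watch the mean squared error pass through a minimum.

```python
# Toy bias--variance trade-off: MSE = bias^2 + variance as the number of
# fit parameters grows. The profile and noise level are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
truth = np.exp(-x**2 / 2) * (1 + 0.1 * np.sin(3 * x))  # asymmetric "PSF"

def mean_squared_error(n_params, noise=0.02, n_trials=200):
    errs = []
    for _ in range(n_trials):
        data = truth + rng.normal(0.0, noise, x.size)
        fit = np.polynomial.Chebyshev.fit(x, data, deg=n_params - 1)
        errs.append(np.mean((fit(x) - truth) ** 2))
    return np.mean(errs)

for p in (2, 4, 8, 16, 32, 64):
    # MSE first falls as the bias shrinks, then rises as noise is overfitted.
    print(p, mean_squared_error(p))
```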
Direct imaging of Earth-like planets from space requires dedicated observatories, combining large segmented apertures with instruments and techniques such as coronagraphs, wavefront sensors, and wavefront control in order to reach the required contrast of 10^10. The complexity of these systems is increased by the segmentation of the primary mirror, which allows for the larger diameters necessary to image Earth-like planets but also introduces specific patterns in the image due to the pupil shape and segmentation, making high-contrast imaging more challenging. Among these defects, the phasing errors of the primary mirror are a strong limitation on performance. In this paper, we focus on the wavefront sensing of segment phasing errors for a high-contrast system, using the COronagraphic Focal plane wave-Front Estimation for Exoplanet detection (COFFEE) technique. We implemented and tested COFFEE on the High-contrast imaging for Complex Aperture Telescopes (HiCAT) testbed, in configurations without any coronagraph and with a classical Lyot coronagraph, to reconstruct errors applied to a 37-segment mirror. We analysed the quality and limitations of the reconstructions. We demonstrate that COFFEE is able to correctly estimate the phasing errors of a segmented telescope for piston, tip, and tilt aberrations of typically 100 nm RMS. We also identified the limitations of COFFEE for the reconstruction of low-order wavefront modes, which are strongly filtered by the coronagraph; this is illustrated using two focal-plane mask sizes on HiCAT. We discuss possible solutions, both in the hardware and in the COFFEE optimizer, to mitigate these issues.
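To ground the piston/tip/tilt terminology, here is a minimal sketch of the forward model that focal-plane estimation techniques such as COFFEE invert; the circular segment geometry is a toy assumption rather than HiCAT's hexagonal segments, and no coronagraph is modelled.

```python
# Sketch: a segment with piston/tip/tilt errors contributes a local phase
# offset plus ramp; the non-coronagraphic image is |FFT(complex pupil)|^2.
import numpy as np

def segment_phase(n, centre, radius, piston, tip, tilt, wavelength):
    """Phase map (radians) for one toy circular segment; errors in metres."""
    y, x = np.mgrid[:n, :n]
    dx, dy = x - centre[0], y - centre[1]
    inside = dx**2 + dy**2 < radius**2
    opd = piston + tip * dx / radius + tilt * dy / radius  # optical path diff.
    return np.where(inside, 2 * np.pi * opd / wavelength, 0.0)

def focal_plane_image(aperture, phase, pad=4):
    """Focal-plane intensity from pupil amplitude and phase maps."""
    n = aperture.shape[0]
    field = np.zeros((pad * n, pad * n), dtype=complex)
    field[:n, :n] = aperture * np.exp(1j * phase)
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
```

An estimator in this spirit would then adjust the per-segment piston, tip, and tilt values to minimise the mismatch between such model images and the measured ones.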