
Adaptive optics with programmable Fourier-based wavefront sensors: a spatial light modulator approach to the LOOPS testbed

Published by: Pierre Janin-Potiron
Publication date: 2019
Research field: Physics
Paper language: English





Wavefront sensors encode the phase information of an incoming wavefront into an intensity pattern that can be measured on a camera. Several kinds of wavefront sensors (WFSs) are used in astronomical adaptive optics. Amongst them, Fourier-based wavefront sensors perform a filtering operation on the wavefront in the focal plane. The best-known example of this kind of WFS is the Zernike wavefront sensor, and the pyramid wavefront sensor (PWFS) also belongs to this class. Based on the same principle, new WFSs can be proposed, such as the n-faced pyramid (which ultimately becomes an axicon) or the flattened pyramid, depending on whether the image formation is incoherent or coherent. In order to test such novel concepts, the LOOPS adaptive optics testbed hosted at the Laboratoire d'Astrophysique de Marseille has been upgraded with a Spatial Light Modulator (SLM). This device, placed in a focal plane, produces high-definition phase masks that mimic otherwise bulk optical devices. In this paper, we first present the optical design and the upgrades made to the experimental setup of the LOOPS bench. We then focus on the generation of the phase masks with the SLM and the implications of having such a device in a focal plane. Finally, we present the first closed-loop results, in both static and dynamic mode, with different WFSs applied on the SLM.
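To illustrate the kind of focal-plane phase mask discussed in the abstract, the short Python sketch below builds the phase screen of an n-faced pyramid on a pixel grid, the sort of pattern that could be displayed on an SLM. This is not the LOOPS bench software; the grid size, tilt amplitude and 2π wrapping are illustrative assumptions.

```python
# Minimal sketch (not the LOOPS bench software): phase screen of an n-faced
# pyramid mask, the kind of high-definition pattern that could be displayed
# on a focal-plane SLM. Grid size, tilt amplitude and the 2*pi wrapping are
# illustrative choices.
import numpy as np

def n_faced_pyramid_mask(n_px=512, n_faces=4, slope=200.0):
    """Phase (radians) of an n-faced pyramid centred on the array.

    Each angular sector carries a linear tilt along its own bisector;
    as n_faces grows, the mask tends towards an axicon (circular cone).
    """
    y, x = np.indices((n_px, n_px)) - n_px / 2
    theta = np.arctan2(y, x) % (2 * np.pi)

    sector = np.floor(theta / (2 * np.pi / n_faces))       # face index
    bisector = (sector + 0.5) * (2 * np.pi / n_faces)      # face direction

    # tilt proportional to the projection of (x, y) on the face bisector
    phase = slope / n_px * (x * np.cos(bisector) + y * np.sin(bisector))
    return np.mod(phase, 2 * np.pi)   # SLMs typically take phase modulo 2*pi

if __name__ == "__main__":
    mask = n_faced_pyramid_mask(n_faces=6)   # a 6-faced pyramid, for example
    print(mask.shape, mask.min(), mask.max())
```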


Read also

Fourier-based wavefront sensors, such as the Pyramid Wavefront Sensor (PWFS), are the current preference for high contrast imaging due to their high sensitivity. However, these wavefront sensors have intrinsic nonlinearities that constrain the range where conventional linear reconstruction methods can be used to accurately estimate the incoming wavefront aberrations. We propose to use Convolutional Neural Networks (CNNs) for the nonlinear reconstruction of the wavefront sensor measurements. It is demonstrated that a CNN can be used to accurately reconstruct the nonlinearities in both simulations and a lab implementation. We show that solely using a CNN for the reconstruction leads to suboptimal closed loop performance under simulated atmospheric turbulence. However, it is demonstrated that using a CNN to estimate the nonlinear error term on top of a linear model results in an improved effective dynamic range of a simulated adaptive optics system. The larger effective dynamic range results in a higher Strehl ratio under conditions where the nonlinear error is relevant. This will allow the current and future generation of large astronomical telescopes to work in a wider range of atmospheric conditions and therefore reduce costly downtime of such facilities.
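As a rough illustration of the linear-plus-CNN scheme described in this abstract, the sketch below combines a stand-in linear reconstructor with a small convolutional network that maps a PWFS frame to modal residuals. All sizes, the matrix R and the network architecture are placeholder assumptions, and the network would of course have to be trained on simulated or measured (frame, residual) pairs before use.

```python
# Illustrative sketch (not the authors' implementation): wavefront estimate =
# linear reconstruction + CNN-predicted nonlinear error term.
import numpy as np
import torch
import torch.nn as nn

n_px, n_modes = 64, 50                            # PWFS frame size, modal basis size
R = np.random.randn(n_modes, n_px * n_px) * 1e-3  # stand-in for a calibrated
                                                  # (pseudo-inverse) reconstructor

residual_cnn = nn.Sequential(                     # small CNN: frame -> modal residuals
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(8),
    nn.Flatten(),
    nn.Linear(16 * 8 * 8, n_modes),
)
# (in practice this network would be trained on (frame, residual) pairs)

def reconstruct(pwfs_frame: np.ndarray) -> np.ndarray:
    """Modal estimate = linear term + CNN-predicted nonlinear term."""
    linear_part = R @ pwfs_frame.ravel()
    with torch.no_grad():
        img = torch.from_numpy(pwfs_frame).float()[None, None]   # (1, 1, H, W)
        nonlinear_part = residual_cnn(img).squeeze(0).numpy()
    return linear_part + nonlinear_part

modes = reconstruct(np.random.rand(n_px, n_px))   # fake frame, just to run it
print(modes.shape)
```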
We present our investigation into the impact of wavefront errors on high-accuracy astrometry using Fourier optics. MICADO, the upcoming near-IR imaging instrument for the Extremely Large Telescope, will offer capabilities for relative astrometry with an accuracy of 50 microarcseconds (μas). Due to the large size of the point spread function (PSF) compared to the astrometric requirement, the detailed shape and position of the PSF on the detector must be well understood. Furthermore, because the atmospheric dispersion corrector of MICADO is a moving component within an otherwise mostly static instrument, it might not be sufficient to perform a simple pre-observation calibration. Therefore, we have built a Fourier-optics framework that allows us to evaluate the small changes in the centroid position of the PSF as a function of wavefront error. For a complete evaluation, we model both the low-order surface form errors, using Zernike polynomials, and the mid- and high-spatial frequencies, using power spectral density analysis. The described work will then make it possible, by performing full diffractive beam propagation, to assess the expected astrometric performance of MICADO.
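A minimal version of such a Fourier-optics evaluation can be sketched as follows: propagate a circular pupil carrying a small wavefront error to the focal plane with an FFT and measure the resulting shift of the PSF centroid. The grid sizes and the coma amplitude below are arbitrary choices for illustration, not MICADO parameters.

```python
# Minimal Fourier-optics sketch: PSF centroid shift induced by a small
# wavefront error (here an unnormalized Zernike x-coma term).
import numpy as np

n = 256
y, x = (np.indices((n, n)) - n / 2) / (n / 4)   # pupil radius = n/4 pixels
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)

def psf_centroid(wfe_rad):
    """PSF centroid (pixels) for a given wavefront-error map in radians."""
    field = pupil * np.exp(1j * wfe_rad)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(4 * n, 4 * n))))**2
    yy, xx = np.indices(psf.shape)
    return (psf * xx).sum() / psf.sum(), (psf * yy).sum() / psf.sum()

coma = (3 * r2 - 2) * x * pupil                 # low-order surface form error
ref = psf_centroid(0 * coma)                    # unaberrated reference
ab = psf_centroid(0.2 * coma)                   # 0.2 rad coma coefficient
print("centroid shift (pixels):", ab[0] - ref[0], ab[1] - ref[1])
```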
We present a concept for a millimeter wavefront sensor that allows real-time sensing of the surface of a ground-based millimeter/submillimeter telescope. It is becoming important for ground-based millimeter/submillimeter astronomy to make telescopes larger while keeping their surfaces accurate. To establish 'millimetric adaptive optics' (MAO) that instantaneously corrects the wavefront degradation induced by deformation of the telescope optics, our wavefront sensor, based on radio interferometry, measures changes in the excess path lengths from characteristic positions on the primary mirror surface to the focal plane. This plays a fundamental role in planned 50-m class submillimeter telescopes such as LST and AtLAST.
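The basic quantity such a sensor works with can be illustrated in a few lines: a measured change dL in the path from a point on the primary mirror to the focal plane corresponds roughly to a surface deformation of dL/2 (reflection doubles the path) and to a wavefront phase error of 2π·dL/λ at the observing wavelength. The numbers below are illustrative only, not LST or AtLAST specifications.

```python
# Back-of-the-envelope sketch: excess path-length changes -> approximate
# surface deformation and wavefront phase error. Values are illustrative.
import numpy as np

wavelength = 0.85e-3                             # 850 um observing wavelength [m]
measured_dL = np.array([12e-6, -8e-6, 25e-6])    # excess path changes [m]

surface_error = measured_dL / 2.0                # approximate mirror deformation
phase_error = 2 * np.pi * measured_dL / wavelength

for dL, s, p in zip(measured_dL, surface_error, phase_error):
    print(f"dL = {dL*1e6:6.1f} um  ->  surface ~ {s*1e6:5.1f} um, "
          f"phase ~ {p:5.2f} rad")
```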
Recent advances in adaptive optics (AO) have led to the implementation of wide field-of-view AO systems. A number of wide-field AO systems are also planned for the forthcoming Extremely Large Telescopes. Such systems have multiple wavefront sensors of different types, and usually multiple deformable mirrors (DMs). Here, we report on our experience integrating cameras and DMs with the real-time control systems of two wide-field AO systems. These are CANARY, which has been operating on-sky since 2010, and DRAGON, which is a laboratory adaptive optics real-time demonstrator instrument. We detail the issues and difficulties that arose, along with the solutions we developed. We also provide recommendations for consideration when developing future wide-field AO systems.
Increasing interest in astronomical applications of non-linear curvature wavefront sensors for turbulence detection and correction makes it important to understand how best to handle the data they produce, particularly at low light levels. Algorithms for wavefront phase retrieval from a four-plane curvature wavefront sensor are developed and compared, with a view to their use for low-order phase compensation in instruments combining adaptive optics and Lucky Imaging. The convergence speed and quality of the iterative algorithms are compared against their step size, and techniques for phase retrieval at low photon counts are explored. Computer simulations show that at low light levels, preprocessing by convolution of the measured signal with a Gaussian function can reduce by an order of magnitude the photon flux required for accurate phase retrieval of low-order errors. This facilitates wavefront correction on large telescopes with very faint reference stars.
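The preprocessing step highlighted in this abstract can be sketched as follows: simulate a photon-starved sensor frame with Poisson noise and smooth it with a Gaussian kernel before handing it to the phase-retrieval iteration. Frame size, photon flux and kernel width below are arbitrary assumptions.

```python
# Sketch of the low-light preprocessing step: Gaussian smoothing of a
# photon-starved frame before phase retrieval. All parameters are arbitrary.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
n = 128
ideal = np.exp(-(((np.indices((n, n)) - n / 2)**2).sum(0)) / (2 * 15**2))
ideal *= 200 / ideal.sum()                       # ~200 photons over the frame

photon_frame = rng.poisson(ideal).astype(float)  # photon (shot) noise
smoothed = gaussian_filter(photon_frame, sigma=3.0)   # preprocessing step

# `smoothed` would then be fed to the iterative phase-retrieval algorithm
print(photon_frame.sum(), smoothed.sum())
```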