
Unmixing methods based on nonnegativity and weakly mixed pixels for astronomical hyperspectral datasets

Added by Olivier Berne
Publication date: 2020
Field: Physics
Language: English





[Abridged] An increasing number of astronomical instruments (ground- and space-based) provide hyperspectral images, that is, three-dimensional data cubes with two spatial dimensions and one spectral dimension. The intrinsic limitation in spatial resolution of these instruments implies that the spectra associated with the pixels of such images are most often mixtures of the spectra of the pure components present in the considered region. In order to estimate the spectra and spatial abundances of these pure components, we here propose an original blind signal separation (BSS) method, that is to say an unsupervised unmixing method. Our approach is based on extensions and combinations of linear BSS methods that belong to two major classes of methods, namely nonnegative matrix factorization (NMF) and sparse component analysis (SCA). The former decomposes hyperspectral images into a set of pure spectra and abundance maps by using nonnegativity constraints, but the estimated solution is not unique: it highly depends on the initialization of the algorithm. The considered SCA methods are based on the assumption that there exist points or tiny spatial zones where only one source is active (i.e., where one pure component is present). In real conditions, the assumption of perfect single-source points or zones is not always realistic. In such conditions, SCA yields approximate results.
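To make the decomposition concrete, here is a minimal sketch of the NMF half of the approach, assuming a nonnegative data cube and using scikit-learn's generic NMF solver. It is not the authors' hybrid NMF-SCA algorithm: the SCA-derived initialization described in the abstract is stood in for by the deterministic "nndsvd" seeding, and all names are illustrative.

```python
import numpy as np
from sklearn.decomposition import NMF

def unmix(cube, n_components=4):
    """Hypothetical cube of shape (ny, nx, nbands); values assumed nonnegative."""
    ny, nx, nb = cube.shape
    X = cube.reshape(ny * nx, nb)          # one observed spectrum per row

    # NMF: X ~ A @ S with A >= 0 (abundances) and S >= 0 (pure spectra).
    # The solution is not unique and depends on initialization, which is
    # why the paper seeds it with SCA-derived spectra; "nndsvd" is only a
    # generic stand-in for that step.
    model = NMF(n_components=n_components, init="nndsvd", max_iter=500)
    A = model.fit_transform(X)             # (npix, n_components) abundances
    S = model.components_                  # (n_components, nbands) spectra

    return A.reshape(ny, nx, n_components), S
```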



Related research

The direct detection of exoplanets with high-contrast instruments can be boosted with high spectral resolution. For integral field spectrographs yielding hyperspectral data, this means that the field of view consists of diffracted starlight spectra and a spatially localized planet. Analysis usually relies on cross-correlation with theoretical spectra. In a purely blind-search context, this supervised strategy can be biased by model mismatch and/or be computationally inefficient. Using an approach inspired by the remote-sensing community, we aim to propose an alternative to cross-correlation that is fully data-driven and decomposes the data into a set of individual spectra and their corresponding spatial distributions. This strategy is called spectral unmixing. We used an orthogonal subspace projection to identify the most distinct spectra in the field of view. Their spatial distribution maps were then obtained by inverting the data: the identified spectra were used to break the original hyperspectral images into their corresponding maps via non-negative least squares. The performance of our method was evaluated and compared with cross-correlation using simulated medium-resolution hyperspectral data from the ELT/HARMONI integral field spectrograph. We show that spectral unmixing effectively leads to a planet detection based solely on spectral dissimilarities, at a significantly reduced computational cost. The extracted spectrum holds significant signatures of the planet while not being perfectly separated from residual starlight. The sensitivity of the supervised cross-correlation is three to four times higher than that of unsupervised spectral unmixing, but this gap is biased toward the former because the injected and correlated spectra match perfectly. The algorithm was furthermore vetted on real VLT/SINFONI data of the beta Pictoris system.
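As an illustration of the two unmixing steps described above, the sketch below greedily selects candidate endmember spectra via orthogonal subspace projection and then inverts the cube with non-negative least squares. Function names, the endmember count, and the selection heuristic are assumptions for the example, not the paper's exact pipeline.

```python
import numpy as np
from scipy.optimize import nnls

def osp_endmembers(X, n_end):
    """Greedy orthogonal-subspace-projection selection.
    X: (npix, nbands) spectra. Returns indices of the most distinct pixels."""
    idx = [int(np.argmax(np.linalg.norm(X, axis=1)))]   # brightest spectrum first
    for _ in range(n_end - 1):
        E = X[idx].T                                    # (nbands, k) endmembers so far
        # Projector onto the orthogonal complement of span(E).
        P = np.eye(E.shape[0]) - E @ np.linalg.pinv(E)
        resid = np.linalg.norm(X @ P, axis=1)           # energy outside the subspace
        idx.append(int(np.argmax(resid)))
    return idx

def abundance_maps(cube, n_end=3):
    """Invert the cube: per-pixel NNLS fit x ~ a @ E with a >= 0."""
    ny, nx, nb = cube.shape
    X = cube.reshape(-1, nb)
    E = X[osp_endmembers(X, n_end)]                     # (n_end, nbands)
    A = np.array([nnls(E.T, x)[0] for x in X])
    return A.reshape(ny, nx, n_end), E
```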
Intelligent scheduling of the sequence of scientific exposures taken at ground-based astronomical observatories is massively challenging. Observing time is over-subscribed and atmospheric conditions are constantly changing. We propose to guide observatory scheduling using machine learning. Leveraging a 15-year archive of exposures, environmental conditions, and operating conditions logged by the Canada-France-Hawaii Telescope, we construct a probabilistic data-driven model that accurately predicts image quality. We demonstrate that, by optimizing the opening and closing of twelve vents placed on the dome of the telescope, we can reduce dome-induced turbulence and improve telescope image quality by 0.05-0.2 arcseconds. This translates to a reduction in exposure time (and hence cost) of $\sim$10-15%. Our study is the first step toward data-based optimization of the multi-million-dollar operations of current and next-generation telescopes.
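A toy version of the data-driven idea, under loudly stated assumptions: the feature columns and labels below are synthetic, and the CFHT model's actual architecture and inputs are not reproduced. The sketch fits quantile regressors to predict image quality from environment plus vent states, then enumerates vent configurations and keeps the one with the best predicted median seeing.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical training archive: 3 environment features + 12 binary vent
# states per exposure; the target is delivered image quality in arcsec.
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(5000, 3)), rng.integers(0, 2, (5000, 12))])
y = 0.8 + 0.1 * X[:, 0] - 0.02 * X[:, 3:].sum(axis=1) + rng.normal(0, 0.05, 5000)

# Quantile regressors bracket the prediction, giving a probabilistic flavor.
models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, y)
          for q in (0.1, 0.5, 0.9)}

# Enumerate all 2^12 vent configurations for the current environment and
# pick the one with the lowest median predicted image quality.
env = rng.normal(size=3)
vents = np.array([[(c >> i) & 1 for i in range(12)] for c in range(4096)])
cand = np.hstack([np.tile(env, (4096, 1)), vents])
best = vents[models[0.5].predict(cand).argmin()]
print("suggested vent state:", best)
```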
Among the many challenges posed by the huge data volumes produced by the new generation of astronomical instruments is the search for rare and peculiar objects. Unsupervised outlier detection algorithms may provide a viable solution. In this work we compare the performance of six methods: the Local Outlier Factor, Isolation Forest, k-means clustering, a measure of novelty, and both a standard and a convolutional autoencoder. These methods were applied to data extracted from SDSS Stripe 82. After discussing the sensitivity of each method to its own set of hyperparameters, we combine the results from each method to rank the objects and produce a final list of outliers.
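For flavor, here is a hedged sketch of two of the six detectors and of the rank-combination step, run on placeholder features; hyperparameters and data are illustrative only.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

def combined_outlier_rank(X):
    """Average the ranks of two unsupervised detectors (stand-ins for the
    six compared above) and return objects ordered most-outlying first."""
    lof = LocalOutlierFactor(n_neighbors=20).fit(X)
    lof_score = -lof.negative_outlier_factor_          # higher = more anomalous
    iso_score = -IsolationForest(random_state=0).fit(X).score_samples(X)

    # Rank-transform each score so the two methods' scales do not matter,
    # then average the ranks, mirroring the combination step above.
    ranks = [s.argsort().argsort() for s in (lof_score, iso_score)]
    return np.mean(ranks, axis=0).argsort()[::-1]

# Placeholder features: 500 ordinary objects plus 5 injected outliers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (500, 8)), rng.normal(6, 1, (5, 8))])
print(combined_outlier_rank(X)[:5])                    # indices >= 500 expected
```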
In this work we test the most widely used methods for fitting the composition fraction in data, namely the maximum likelihood, the $\chi^2$, the mean value of the distributions, and the mean value of the posterior probability function. We discuss the discrimination power of the four methods in different scenarios: signal-to-noise discrimination, two signals, and distributions of $X_{\mathrm{max}}$ for mixed primary mass composition. We introduce a distance parameter, which can be used to estimate, as a rule of thumb, the precision of the discrimination. Finally, we conclude that the most reliable methods in all the studied scenarios are the maximum likelihood and the mean value of the posterior probability function.
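The sketch below illustrates one of the four estimators, a binned maximum-likelihood fit of a two-component mixture fraction, on synthetic $X_{\mathrm{max}}$-like templates; all distributions and numbers are made up for the example.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Synthetic template distributions (e.g. Xmax for two primary species),
# stored as normalized histograms over common bins; purely illustrative.
bins = np.linspace(600, 1000, 41)
light = np.histogram(np.random.default_rng(1).normal(800, 60, 10**5), bins, density=True)[0]
heavy = np.histogram(np.random.default_rng(2).normal(720, 40, 10**5), bins, density=True)[0]
data = np.histogram(np.random.default_rng(3).normal(800, 60, 700).tolist()
                    + np.random.default_rng(4).normal(720, 40, 300).tolist(), bins)[0]

def neg_log_like(f):
    """Binned negative log-likelihood of the mixture p = f*light + (1-f)*heavy."""
    p = f * light + (1 - f) * heavy
    p = np.clip(p / p.sum(), 1e-12, None)   # renormalize over the bins
    return -np.sum(data * np.log(p))

fit = minimize_scalar(neg_log_like, bounds=(0.0, 1.0), method="bounded")
print(f"ML fraction of the light component: {fit.x:.3f}")  # ~0.7 by construction
```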
The next technological breakthrough in millimeter-submillimeter astronomy is 3D imaging spectrometry with wide instantaneous spectral bandwidths and wide fields of view. The total optimization of the focal-plane instrument, the telescope, the observing strategy, and the signal-processing software must enable efficient removal of foreground emission from the Earth's atmosphere, which is time-dependent and highly nonlinear in frequency. Here we present TiEMPO: Time-Dependent End-to-End Model for Post-process Optimization of the DESHIMA Spectrometer. TiEMPO utilizes a dynamical model of the atmosphere and parametrized models of the astronomical source, the telescope, the instrument, and the detector. The output of TiEMPO is a time-stream of sky brightness temperature and detected power, which can be analyzed by standard signal-processing software. We first compare TiEMPO simulations with an on-sky measurement by the wideband DESHIMA spectrometer and find good agreement in the noise power spectral density and sensitivity. We then use TiEMPO to simulate the detection of a line-emission spectrum of a high-redshift galaxy using the DESHIMA 2.0 spectrometer under development. The TiEMPO model is open source. Its modular and parametrized design enables users to adapt it to design and optimize the end-to-end performance of spectroscopic and photometric instruments on existing and future telescopes.
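TiEMPO itself is open source; the sketch below only illustrates the downstream "standard signal-processing" step the abstract mentions, estimating a noise power spectral density from a detected-power time-stream with scipy.signal.welch. The time-stream here is synthetic, and the sampling rate and noise model are assumptions.

```python
import numpy as np
from scipy.signal import welch

# Synthetic stand-in for a TiEMPO output: detected power sampled at 160 Hz
# with white noise plus a slow, atmosphere-like drift.
fs = 160.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(0)
power = 1e-12 * (1.0 + 0.01 * rng.standard_normal(t.size)
                 + 0.05 * np.cumsum(rng.standard_normal(t.size)) / t.size)

# Welch periodogram: the noise PSD a sensitivity analysis would start from.
freq, psd = welch(power, fs=fs, nperseg=4096)
print(f"white-noise floor ~ {np.median(psd[freq > 10]):.3e} W^2/Hz")
```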
