
A Data-driven Approach to X-ray Spectral Fitting: Quasi-Deconvolution

Posted by Carter Rhea
Publication date: 2021
Research field: Physics
Paper language: English





X-ray spectral fitting of astronomical sources requires convolving the intrinsic spectrum or model with the instrumental response. Standard forward-modeling techniques have proven successful at recovering the underlying physical parameters in moderate to high signal-to-noise regimes; however, they struggle to achieve the same level of accuracy in low signal-to-noise regimes. Additionally, the use of machine learning techniques on X-ray spectra requires access to the intrinsic spectrum. Therefore, the measured spectrum must be effectively deconvolved from the instrumental response. In this note, we explore numerical methods for inverting the matrix equation describing X-ray spectral convolution. We demonstrate that traditional methods are insufficient to recover the intrinsic X-ray spectrum and argue that a novel approach is required.
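A minimal sketch of the problem, assuming the usual discretized form in which the observed counts D are the response matrix R applied to the intrinsic model spectrum M, i.e. D = R M: recovering M then amounts to inverting an ill-conditioned linear system. The toy Python example below uses an invented Gaussian response and power-law spectrum to illustrate why direct inversion fails while even simple Tikhonov regularization helps; none of the names or values come from the paper.

```python
# Toy sketch of the convolution D = R @ M and two inversion attempts; the response
# matrix R, grids, and regularization below are illustrative, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
n = 100                                            # energy bins = detector channels (toy)
E = np.linspace(0.5, 8.0, n)                       # keV grid

# Smooth, ill-conditioned toy response (Gaussian redistribution), rows normalized.
R = np.exp(-0.5 * ((E[:, None] - E[None, :]) / 0.3) ** 2)
R /= R.sum(axis=1, keepdims=True)

M_true = E ** -1.7                                 # intrinsic power-law spectrum
D = R @ M_true + rng.normal(0.0, 1e-3, n)          # observed spectrum with mild noise

M_naive = np.linalg.solve(R, D)                    # direct inversion amplifies the noise

lam = 1e-3                                         # Tikhonov (ridge) regularization strength
M_reg = np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ D)

print(f"cond(R) = {np.linalg.cond(R):.2e}")
print(f"max |error| naive = {np.abs(M_naive - M_true).max():.2e}, "
      f"regularized = {np.abs(M_reg - M_true).max():.2e}")
```

The instability comes from the small singular values of R, which amplify any noise in D; this is the behaviour the note refers to when it argues that traditional inversion methods are insufficient.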


Read also

Astronomical data generally consist of two or more high-resolution axes, e.g., X,Y position on the sky, or wavelength and position along one axis (long-slit spectrometer). Analyzing these multi-dimensional observations requires combining 3D source models (including velocity effects), instrument models, and multi-dimensional data comparison and fitting. A prototype of such a Beyond-XSPEC (Noble & Nowak, 2008) system is presented here using Chandra imaging and dispersed HETG grating data. Techniques used include: Monte Carlo event generation, chi-squared comparison, conjugate gradient fitting adapted to the Monte Carlo characteristics, and informative visualizations at each step. These simple baby steps of progress only scratch the surface of the computational potential that is available these days for astronomical analysis.
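As a rough illustration of the Monte Carlo event generation and chi-squared comparison mentioned above (a sketch only; the power-law model, exposure, and binning are invented placeholders, not the Beyond-XSPEC prototype):

```python
# Minimal sketch of the "Monte Carlo events + chi-squared comparison" idea;
# the model, exposure, and binning are made-up placeholders.
import numpy as np

rng = np.random.default_rng(1)
E = np.linspace(1.0, 9.0, 64)                      # toy energy bins (keV)
exposure = 5.0e3                                   # toy exposure scaling

def model_rate(E, norm, index):
    """Toy power-law count rate per bin."""
    return norm * E ** -index

observed = rng.poisson(model_rate(E, 2.0e-2, 1.8) * exposure)

def chi2(params):
    """Chi-squared between observed counts and one Monte Carlo realization."""
    norm, index = params
    simulated = rng.poisson(model_rate(E, norm, index) * exposure)
    var = np.maximum(observed, 1)                  # crude variance floor for low counts
    return np.sum((observed - simulated) ** 2 / var)

print(chi2((2.0e-2, 1.8)), chi2((1.0e-2, 2.5)))    # good vs. poor parameter guesses
```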
We develop a general data-driven and template-free method for the extraction of event waveforms in the presence of background noise. Recent gravitational-wave observations provide one of the significant scientific areas requiring data analysis and waveform extraction capability. We use our method to find the waveforms for the reported events from the first, second, and third LIGO observation runs (O1, O2, and O3). Using the instantaneous frequencies derived from the Hilbert transform of the extracted waveforms, we provide the physical time delays between the arrivals of gravitational waves at the detectors.
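For concreteness, a small sketch of the instantaneous-frequency step via the Hilbert transform, using a synthetic chirp rather than an extracted LIGO waveform:

```python
# Instantaneous frequency from the analytic signal; the chirp below is a toy stand-in.
import numpy as np
from scipy.signal import hilbert

fs = 4096.0                                        # sample rate (Hz), illustrative
t = np.arange(0.0, 1.0, 1.0 / fs)
waveform = np.sin(2 * np.pi * (30.0 * t + 100.0 * t ** 2))   # toy chirp, 30 -> 230 Hz

analytic = hilbert(waveform)                       # analytic signal x + i H[x]
phase = np.unwrap(np.angle(analytic))              # instantaneous phase
inst_freq = np.diff(phase) * fs / (2.0 * np.pi)    # instantaneous frequency (Hz)

print(inst_freq[100], inst_freq[-100])             # ~30 Hz near the start, ~230 Hz near the end
```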
C. Tasse, B. Hugo, M. Mirmont (2017)
The new generation of radio interferometers is characterized by high sensitivity, wide fields of view, and large fractional bandwidth. To synthesize the deepest images enabled by the high dynamic range of these instruments requires us to take into account the direction-dependent Jones matrices, while estimating the spectral properties of the sky in the imaging and deconvolution algorithms. In this paper we discuss and implement a wide-band wide-field spectral deconvolution framework (DDFacet) based on image-plane faceting that takes into account generic direction-dependent effects. Specifically, we present a wide-field co-planar faceting scheme and discuss the various effects that need to be taken into account to solve the deconvolution problem (image-plane normalization, position-dependent PSF, etc.). We discuss two wide-band spectral deconvolution algorithms based on hybrid matching pursuit and sub-space optimisation, respectively. A few interesting technical features incorporated in our imager are discussed, including baseline-dependent averaging, which has the effect of improving computing efficiency. The version of DDFacet presented here can account for any externally defined Jones matrices and/or beam patterns.
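A one-dimensional toy of a matching-pursuit (CLEAN-style) deconvolution loop, in the spirit of the algorithms mentioned above; the PSF, sky, and loop gain are invented for illustration and are not DDFacet internals:

```python
# Greedy matching-pursuit deconvolution on a toy 1-D "dirty image".
import numpy as np

n = 256
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)   # toy PSF centred at n//2

true_sky = {60: 1.0, 130: 0.6, 200: 0.3}                    # sparse point sources
dirty = sum(amp * np.roll(psf, pos - n // 2) for pos, amp in true_sky.items())

residual = dirty.copy()
model = np.zeros(n)
gain = 0.1
for _ in range(1000):                          # greedy loop: find peak, subtract scaled PSF
    peak = int(np.argmax(np.abs(residual)))
    step = gain * residual[peak]
    model[peak] += step
    residual -= step * np.roll(psf, peak - n // 2)
    if np.abs(residual).max() < 0.01:
        break

print({p: round(model[p], 2) for p in true_sky}, np.abs(residual).max())
```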
Michael Ben-Or, Lior Eldar (2015)
Inspired by the quantum computing algorithms for linear algebra problems [HHL, Ta-Shma], we study how a classical simulation of this type of phase estimation algorithm performs when applied to the eigenproblem of Hermitian matrices. The result is a completely new, efficient and stable parallel algorithm to compute an approximate spectral decomposition of any Hermitian matrix. The algorithm can be implemented by Boolean circuits in $O(\log^2 n)$ parallel time with a total cost of $O(n^{\omega+1})$ Boolean operations. This Boolean complexity matches the best known rigorous $O(\log^2 n)$ parallel-time algorithms, but unlike those algorithms ours is (logarithmically) stable, so further improvements may lead to practical implementations. All previous efficient and rigorous approaches to the eigenproblem use randomization to avoid bad conditioning, as we do too. Our algorithm makes further use of randomization in a completely new way, taking random powers of a unitary matrix to randomize the phases of its eigenvalues. Proving that a tiny Gaussian perturbation and a random polynomial power are sufficient to ensure almost pairwise independence of the phases $\pmod{2\pi}$ is the main technical contribution of this work. This randomization enables us, given a Hermitian matrix with well-separated eigenvalues, to sample a random eigenvalue and produce an approximate eigenvector in $O(\log^2 n)$ parallel time and $O(n^{\omega})$ Boolean complexity. We conjecture that further improvements of our method can provide a stable solution to the full approximate spectral decomposition problem with complexity similar (up to a logarithmic factor) to that of sampling a single eigenvector.
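A toy numerical illustration (not the paper's algorithm) of the key idea that a random power k of the unitary U = exp(iA) scrambles the eigenvalue phases k*lambda mod 2*pi:

```python
# Random powers of exp(iA) spread the eigenvalue phases over [0, 2*pi); toy only.
import numpy as np

rng = np.random.default_rng(2)
n = 6
A = rng.normal(size=(n, n))
A = (A + A.T) / 2.0                               # Hermitian (real symmetric) test matrix
lam = np.linalg.eigvalsh(A)                       # eigenvalues of A = phases of U = exp(iA)

k = int(rng.integers(1, 10**6))                   # random polynomial power, as in the abstract
print(np.round(lam, 3))                           # raw eigenvalues of A
print(np.round(np.mod(k * lam, 2 * np.pi), 3))    # phases of U**k, spread over [0, 2*pi)
```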
We propose a novel data-driven approach for analyzing synchrotron Laue X-ray microdiffraction scans based on machine learning algorithms. The basic architecture and major components of the method are formulated mathematically. We demonstrate it through typical examples including polycrystalline BaTiO$_3$, multiphase transforming alloys, and finely twinned martensite. The computational pipeline is implemented for beamline 12.3.2 at the Advanced Light Source, Lawrence Berkeley National Lab. The conventional analytical pathway for X-ray diffraction scans is based on a slow pattern-by-pattern crystal indexing process. This work provides a new way of analyzing X-ray diffraction 2D patterns, independent of the indexing process, and motivates further studies of X-ray diffraction patterns from the machine learning perspective for the development of suitable feature extraction, clustering, and labeling algorithms.
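A rough sketch of an indexing-free clustering step in the spirit described above; the patterns, features, and cluster count are synthetic placeholders:

```python
# Cluster toy "diffraction patterns" from crude features, without crystal indexing.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_points, size = 300, 32                           # toy scan points, toy pattern size
patterns = rng.random((n_points, size, size))      # stand-in for Laue 2D patterns

# Crude per-pattern features: total intensity plus brightest row/column indices.
flat = patterns.reshape(n_points, -1)
features = np.column_stack([
    flat.sum(axis=1),
    patterns.mean(axis=2).argmax(axis=1),          # brightest row index
    patterns.mean(axis=1).argmax(axis=1),          # brightest column index
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
print(np.bincount(labels))                         # scan points per (toy) cluster
```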