Inverse problems defined on the sphere arise in many fields, and are generally high-dimensional and computationally very complex. As a result, sampling the posterior of spherical inverse problems is a challenging task. In this work, we describe a framework that leverages a proximal Markov chain Monte Carlo algorithm to efficiently sample the high-dimensional space of spherical inverse problems with a sparsity-promoting wavelet prior. We detail the modifications needed for the algorithm to be applied to spherical problems, and give special consideration to the crucial forward modelling step, which contains spherical harmonic transforms that are computationally expensive. By sampling the posterior, our framework allows for full and flexible uncertainty quantification, something which is not possible with other methods based on, for example, convex optimisation. We demonstrate our framework in practice on a common problem in global seismic tomography. We find that our approach is potentially useful for a wide range of applications at moderate resolutions.
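To make the sampling strategy concrete, the sketch below shows a generic Moreau-Yosida regularised unadjusted Langevin (MYULA-style) update, a common proximal MCMC kernel, with soft thresholding standing in for the proximal operator of a sparsity-promoting (L1) wavelet prior. It is an illustration under these assumptions, not the exact algorithm of the paper; the names grad_loglike, prox_logprior, delta, lam and mu are hypothetical placeholders.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||x||_1 (soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def myula_step(x, grad_loglike, prox_logprior, delta, lam, rng):
    """One MYULA-style proximal Langevin update targeting a posterior
    proportional to exp(loglike(x) + logprior(x)), where the non-smooth
    prior is handled through its Moreau-Yosida envelope.

    x             : current sample (e.g. flattened wavelet coefficients)
    grad_loglike  : callable returning the gradient of the log-likelihood
    prox_logprior : callable prox of the negative log-prior, prox(x, lam)
    delta         : Langevin step size
    lam           : Moreau-Yosida smoothing parameter
    """
    drift = delta * grad_loglike(x) - (delta / lam) * (x - prox_logprior(x, lam))
    return x + drift + np.sqrt(2.0 * delta) * rng.standard_normal(x.shape)

# Hypothetical usage with an L1 wavelet prior of strength mu:
# rng = np.random.default_rng(0)
# prox = lambda x, lam: soft_threshold(x, lam * mu)
# for _ in range(n_samples):
#     x = myula_step(x, grad_loglike, prox, delta, lam, rng)
```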
Imaging methods often rely on Bayesian statistical inference strategies to solve difficult imaging problems. Applying Bayesian methodology to imaging requires the specification of a likelihood function and a prior distribution, which define the Bayesian statistical model from which the posterior distribution of the image is derived. Specifying a suitable model for a specific application can be very challenging, particularly when there is no reliable ground truth data available. Bayesian model selection provides a framework for selecting the most appropriate model directly from the observed data, without reference to ground truth data. However, Bayesian model selection requires the computation of the marginal likelihood (Bayesian evidence), which is computationally challenging, prohibiting its use in high-dimensional imaging problems. In this work we present the proximal nested sampling methodology to objectively compare alternative Bayesian imaging models, without reference to ground truth data. The methodology is based on nested sampling, a Monte Carlo approach specialised for model comparison, and exploits proximal Markov chain Monte Carlo techniques to scale efficiently to large problems and to tackle models that are log-concave and not necessarily smooth (e.g., involving L1 or total-variation priors). The proposed approach can be applied computationally to problems of dimension O(10^6) and beyond, making it suitable for high-dimensional inverse imaging problems. It is validated on large Gaussian models, for which the likelihood is available analytically, and subsequently illustrated on a range of imaging problems where it is used to analyse different choices for the sparsifying dictionary and measurement model.
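As a reminder of the quantity being estimated, the sketch below outlines the standard nested-sampling accumulation of the log-evidence. In proximal nested sampling the hard likelihood constraint on each new draw is enforced with a proximal MCMC kernel; that constrained draw is treated here as a given callable. The function names (loglike, sample_prior, sample_above) and the fixed iteration count are illustrative placeholders, not the paper's implementation.

```python
import numpy as np

def nested_sampling_logz(loglike, sample_prior, sample_above, n_live=100, n_iter=2000):
    """Skeleton nested-sampling estimate of the log marginal likelihood (log Z).

    sample_above(logl_min) must return a prior draw with loglike > logl_min;
    in proximal nested sampling this hard-constrained draw is produced by a
    proximal MCMC kernel (assumed provided here).
    """
    live = [sample_prior() for _ in range(n_live)]
    live_logl = np.array([loglike(p) for p in live])
    logz = -np.inf
    log_x = 0.0                                    # log prior volume, starts at 1
    log_shrink = np.log1p(-np.exp(-1.0 / n_live))  # log(X_{i-1} - X_i) = log X_{i-1} + log_shrink
    for _ in range(n_iter):
        worst = int(np.argmin(live_logl))          # lowest-likelihood live point
        logz = np.logaddexp(logz, live_logl[worst] + log_x + log_shrink)
        threshold = live_logl[worst]
        live[worst] = sample_above(threshold)      # replace subject to L > threshold
        live_logl[worst] = loglike(live[worst])
        log_x -= 1.0 / n_live                      # expected shrinkage of prior volume
    # contribution of the remaining live points
    logz = np.logaddexp(logz, log_x - np.log(n_live) + np.logaddexp.reduce(live_logl))
    return logz
```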
This work presents the construction of a novel spherical wavelet basis designed for incomplete spherical datasets, i.e. datasets with missing data over a particular region of the sphere. The eigenfunctions of the Slepian spatial-spectral concentration problem (the Slepian functions) are a set of orthogonal basis functions which exist within a defined region. Slepian functions allow one to compute a convolution on the incomplete sphere by leveraging the recently proposed sifting convolution and extending it to any set of basis functions. Through a tiling of the Slepian harmonic line one may construct scale-discretised wavelets. An illustration is presented based on an example region on the sphere defined by the topographic map of the Earth. The Slepian wavelets and corresponding wavelet coefficients are constructed from this region, and are used in a straightforward denoising example.
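Schematically, and up to normalisation conventions that may differ from those adopted in the paper, the tiling of the Slepian harmonic line amounts to choosing a scaling function $\Phi$ and wavelets $\Psi^{(j)}$ whose squared profiles partition unity over the Slepian index $p$ within the band limit $P$:

$$
\bigl|\Phi_{p}\bigr|^{2} + \sum_{j} \bigl|\Psi^{(j)}_{p}\bigr|^{2} = 1, \qquad 0 \le p < P,
$$

so that a band-limited function on the region can be synthesised exactly from its scaling and wavelet coefficients.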
Future surveys such as the Legacy Survey of Space and Time (LSST) of the Vera C. Rubin Observatory will observe an order of magnitude more astrophysical transient events than any previous survey. With this deluge of photometric data, it will be impossible for all such events to be classified by humans alone. Recent efforts have sought to leverage machine learning methods to tackle the challenge of astronomical transient classification, with ever improving success. The transformer is a recently developed deep learning architecture, first proposed for natural language processing, that has shown a great deal of recent success. In this work we develop a new transformer architecture, which uses multi-head self-attention at its core, for general multivariate time-series data. Furthermore, the proposed time-series transformer architecture supports the inclusion of an arbitrary number of additional features, while also offering interpretability. We apply the time-series transformer to the task of photometric classification, minimising the reliance on expert domain knowledge for feature selection, while achieving results comparable to state-of-the-art photometric classification methods. We achieve a weighted logarithmic-loss of 0.507 on imbalanced data in a representative setting using data from the Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC). Moreover, we achieve a micro-averaged receiver operating characteristic area under curve of 0.98 and a micro-averaged precision-recall area under curve of 0.87.
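By way of illustration, the sketch below shows a minimal multi-head self-attention block for multivariate time series in PyTorch. It is a generic toy model under stated assumptions, not the architecture proposed in the paper; the class name, dimensions, pooling choice and the way additional features are concatenated before the classification head are all assumptions.

```python
import torch
import torch.nn as nn

class TimeSeriesTransformer(nn.Module):
    """Minimal multi-head self-attention classifier for multivariate time series."""

    def __init__(self, n_features, n_classes, d_model=64, n_heads=4, n_extra=0):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        self.head = nn.Linear(d_model + n_extra, n_classes)

    def forward(self, x, extra=None):
        # x: (batch, time, n_features); extra: (batch, n_extra) optional features
        h = self.embed(x)
        a, _ = self.attn(h, h, h)     # multi-head self-attention over the time axis
        h = self.norm(h + a)          # residual connection + layer normalisation
        pooled = h.mean(dim=1)        # pool over the time dimension
        if extra is not None:
            pooled = torch.cat([pooled, extra], dim=-1)
        return self.head(pooled)      # class logits

# Hypothetical usage: 8 light curves, 100 time steps, 6 features, 2 extra features.
# model = TimeSeriesTransformer(n_features=6, n_classes=14, n_extra=2)
# logits = model(torch.randn(8, 100, 6), extra=torch.randn(8, 2))
```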
We develop variational regularization methods which leverage sparsity-promoting priors to solve severely ill-posed inverse problems defined on the 3D ball (i.e. the solid sphere). Our method solves the problem natively on the ball and thus does not suffer from discontinuities that plague alternative approaches where each spherical shell is considered independently. Additionally, we leverage advances in probability density theory to produce Bayesian variational methods which benefit from the computational efficiency of advanced convex optimization algorithms, whilst supporting principled uncertainty quantification. We showcase these variational regularization and uncertainty quantification techniques on an illustrative example. The C++ code discussed throughout is provided under the GNU General Public License.
Inverse problems defined naturally on the sphere are of increasing interest. In this article we provide a general framework for the evaluation of inverse problems on the sphere, with a strong emphasis on flexibility and scalability. We consider flexibility with respect to the prior selection (regularization), the problem definition, specifically the problem formulation (constrained/unconstrained) and problem setting (analysis/synthesis), and the optimization algorithm adopted to solve the problem. We discuss and quantify the trade-offs between problem formulation and setting. Crucially, we consider the Bayesian interpretation of the unconstrained problem which, combined with recent developments in probability density theory, permits rapid, statistically principled uncertainty quantification (UQ) in the spherical setting. Linearity is exploited to significantly increase the computational efficiency of such UQ techniques, which in some cases are shown to permit analytic solutions. We showcase this reconstruction framework and UQ techniques on a variety of spherical inverse problems. The code discussed throughout is provided under the GNU General Public License, in both C++ and Python.
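For concreteness, in illustrative notation not fixed by the abstract, where $y$ denotes the data, $\Phi$ the measurement operator, $\Psi$ a wavelet dictionary, $\epsilon$ a noise-dependent bound, and $\mu$, $\sigma$ regularisation and noise parameters, the constrained analysis and synthesis formulations and the unconstrained analysis problem typically take the forms

$$
\begin{aligned}
\text{analysis:}\quad &\min_{x}\ \|\Psi^{\dagger}x\|_{1}\ \ \text{s.t.}\ \ \|y-\Phi x\|_{2}\le\epsilon,\\
\text{synthesis:}\quad &\min_{\alpha}\ \|\alpha\|_{1}\ \ \text{s.t.}\ \ \|y-\Phi\Psi\alpha\|_{2}\le\epsilon,\\
\text{unconstrained:}\quad &\min_{x}\ \mu\,\|\Psi^{\dagger}x\|_{1}+\tfrac{1}{2\sigma^{2}}\|y-\Phi x\|_{2}^{2},
\end{aligned}
$$

where it is the unconstrained problem that admits the Bayesian (maximum a posteriori) interpretation underlying the uncertainty quantification described above.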
Convolutional neural networks (CNNs) constructed natively on the sphere have been developed recently and shown to be highly effective for the analysis of spherical data. While an efficient framework has been formulated, spherical CNNs are nevertheless highly computationally demanding; typically they cannot scale beyond spherical signals of thousands of pixels. We develop scattering networks constructed natively on the sphere that provide a powerful representational space for spherical data. Spherical scattering networks are computationally scalable and exhibit rotational equivariance, while their representational space is invariant to isometries and provides efficient and stable signal representations. By integrating scattering networks as an additional type of layer in the generalized spherical CNN framework, we show how they can be leveraged to scale spherical CNNs to the high-resolution data typical of many practical applications, with spherical signals of many tens of megapixels and beyond.
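As a schematic reminder of the construction, in notation that is illustrative rather than the paper's, a scattering representation cascades spherical wavelet transforms $\Psi^{(j)}$ with modulus non-linearities and a final projection onto a scaling function $\Phi$:

$$
\begin{aligned}
S^{0}f &= f \star \Phi,\\
S^{1}_{j_{1}}f &= \bigl|f \star \Psi^{(j_{1})}\bigr| \star \Phi,\\
S^{2}_{j_{1},j_{2}}f &= \bigl|\,|f \star \Psi^{(j_{1})}| \star \Psi^{(j_{2})}\,\bigr| \star \Phi,
\end{aligned}
$$

with deeper orders defined analogously; the modulus and averaging steps are what yield representations that are stable and approximately invariant to isometries while remaining computationally scalable.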
We present a framework for the optimal filtering of spherical signals contaminated by realizations of an additive, zero-mean, uncorrelated and anisotropic noise process on the sphere. Filtering is performed in the wavelet domain given by the scale-discretized wavelet transform on the sphere. The proposed filter is optimal in the sense that it minimizes the mean square error between the filtered wavelet representation and the wavelet representation of the noise-free signal. We also present a simplified formulation of the filter for the case when azimuthally symmetric wavelet functions are used. We demonstrate the use of the proposed optimal filter for denoising of an Earth topography map in the presence of additive, zero-mean, uncorrelated and white Gaussian noise, and show that the proposed filter performs better than the hard thresholding method and the weighted spherical harmonic (weighted-SPHARM) signal estimation framework.
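As an illustration of the kind of filter involved (a simplified special case, not necessarily the exact estimator derived in the paper), if the signal and noise wavelet coefficients at scale $j$ and position $k$ are zero-mean and uncorrelated with variances $\sigma^{2}_{s,jk}$ and $\sigma^{2}_{n,jk}$, the scalar gain that minimises the mean square error of each filtered coefficient is the Wiener-like weighting

$$
\hat{W}_{jk} \;=\; \frac{\sigma^{2}_{s,jk}}{\sigma^{2}_{s,jk}+\sigma^{2}_{n,jk}}\, W_{jk},
$$

where $W_{jk}$ is the noisy wavelet coefficient; anisotropy of the noise enters through the position dependence of $\sigma^{2}_{n,jk}$.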
To date, weak gravitational lensing surveys have typically been restricted to small fields of view, such that the $\textit{flat-sky approximation}$ has been sufficiently satisfied. However, with Stage IV surveys ($\textit{e.g. LSST}$ and $\textit{Euclid}$) imminent, extending mass-mapping techniques to the sphere is a fundamental necessity. As such, we extend the sparse hierarchical Bayesian mass-mapping formalism presented in previous work to the spherical sky. For the first time, this allows us to construct $\textit{maximum a posteriori}$ spherical weak lensing dark-matter mass-maps, with principled Bayesian uncertainties, without imposing or assuming Gaussianity. We solve the spherical mass-mapping inverse problem in the analysis setting adopting a sparsity-promoting Laplace-type wavelet prior, though this theoretical framework supports all log-concave posteriors. Our spherical mass-mapping formalism facilitates principled statistical interpretation of reconstructions. We apply our framework to convergence reconstruction on high-resolution N-body simulations with pseudo-Euclid masking, polluted with a variety of realistic noise levels, and show a significant increase in reconstruction fidelity compared to standard approaches. Furthermore we perform the largest joint reconstruction to date of the majority of publicly available shear observational datasets (combining DESY1, KiDS450 and CFHTLens) and find that our formalism recovers a convergence map with significantly enhanced small-scale detail. Within our Bayesian framework we validate, in a statistically rigorous manner, the community's intuition regarding the need to smooth spherical Kaiser-Squires estimates to provide physically meaningful convergence maps. Such approaches cannot reveal the small-scale physical structures that we recover within our framework.
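In illustrative notation not fixed by the abstract, with shear data $\gamma$, forward lensing operator $\Phi$ mapping convergence to shear, spherical wavelet analysis operator $\Psi^{\dagger}$, noise covariance $\Sigma$ and regularisation strength $\mu$, a sparsity-promoting analysis MAP estimate of this kind takes the form

$$
\hat{\kappa}_{\mathrm{MAP}} \;=\; \operatorname*{arg\,min}_{\kappa}\ \mu\,\|\Psi^{\dagger}\kappa\|_{1} \;+\; \tfrac{1}{2}\,(\gamma-\Phi\kappa)^{\dagger}\Sigma^{-1}(\gamma-\Phi\kappa),
$$

i.e. maximisation of a log-concave posterior with a Laplace-type wavelet prior and Gaussian noise, which can be solved with convex optimisation and admits the principled Bayesian uncertainties referred to above.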
Weak lensing convergence maps - upon which higher-order statistics can be calculated - can be recovered from observations of the shear field by solving the lensing inverse problem. For typical surveys this inverse problem is ill-posed (often seriously), leading to substantial uncertainty on the recovered convergence maps. In this paper we propose novel methods for quantifying the Bayesian uncertainty in the location of recovered features and the uncertainty in the cumulative peak statistic - the peak count as a function of signal-to-noise ratio (SNR). We adopt the sparse hierarchical Bayesian mass-mapping framework developed in previous work, which provides robust reconstructions and principled statistical interpretation of reconstructed convergence maps without the need to assume or impose Gaussianity. We demonstrate our uncertainty quantification techniques on both Bolshoi N-body (cluster scale) and Buzzard V-1.6 (large scale structure) N-body simulations. For the first time, this methodology allows one to recover approximate Bayesian upper and lower limits on the cumulative peak statistic at well-defined confidence levels.