
Markovian models for one dimensional structure estimation on heavily noisy imagery

Posted by Ana Georgina Flesia MS
Publication date: 2013
Paper language: English





Synthetic aperture radar (SAR) images often exhibit profound appearance variations due to a variety of factors, including the clutter noise produced by the coherent nature of the illumination. Ultrasound and infrared images have a similarly cluttered appearance, which makes one-dimensional structures, such as edges and object boundaries, difficult to locate. Structure information is usually extracted in two steps: first, building an edge strength mask that classifies pixels as edge points by hypothesis testing, and second, estimating pixel-wide connected edges from that mask. With constant false alarm rate (CFAR) edge strength detectors for speckle clutter, the image must be scanned by a sliding window composed of several differently oriented splitting sub-windows, and the accuracy of edge location for these ratio detectors depends strongly on the orientation of the sub-windows. In this work we propose to transform the edge strength detection problem into a binary segmentation problem in the undecimated wavelet domain, solvable using parallel 1D hidden Markov models. For general dependency models, exact estimation of the state map becomes computationally complex, but in our model exact MAP estimation is feasible. The effectiveness of our approach is demonstrated on simulated noisy versions of real-life natural images with available ground truth, and the quality of the output edge map is measured with Pratt's, Baddeley's and Kappa proficiency measures. Finally, analysis and experiments on three different types of SAR images, with different polarizations, resolutions and textures, illustrate that the proposed method detects structure in SAR images effectively, providing a very good starting point for active contour methods.
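For illustration only, here is a minimal sketch (not the authors' implementation) of the exact MAP step on a single chain: a Viterbi recursion that labels one row of wavelet-detail magnitudes as edge or background under a binary hidden Markov model. The emission and transition parameters, and the simulated row, are invented for the example.

```python
# Minimal sketch: exact MAP labelling of one image row as edge / non-edge
# states with a binary 1D hidden Markov model. Emission parameters, the
# transition matrix and the toy observation row are illustrative assumptions.
import numpy as np

def viterbi_binary(obs, log_emission, log_trans, log_prior):
    """Exact MAP state path for a 2-state chain (0 = background, 1 = edge)."""
    n = len(obs)
    delta = np.zeros((n, 2))            # best log-probability ending in each state
    back = np.zeros((n, 2), dtype=int)  # backpointers
    delta[0] = log_prior + log_emission(obs[0])
    for t in range(1, n):
        for s in (0, 1):
            cand = delta[t - 1] + log_trans[:, s]
            back[t, s] = np.argmax(cand)
            delta[t, s] = cand[back[t, s]] + log_emission(obs[t])[s]
    # Backtrack the optimal path.
    path = np.zeros(n, dtype=int)
    path[-1] = np.argmax(delta[-1])
    for t in range(n - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Toy Gaussian emissions: edge pixels have larger wavelet-detail energy on average.
def log_emission(x, mu=(0.5, 3.0), sigma=(1.0, 1.0)):
    mu, sigma = np.asarray(mu), np.asarray(sigma)
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

log_trans = np.log(np.array([[0.95, 0.05],    # background tends to persist
                             [0.40, 0.60]]))  # edges are short runs
log_prior = np.log(np.array([0.9, 0.1]))

# Simulated row of detail magnitudes with a short high-energy segment.
row = np.abs(np.random.randn(256)) + np.r_[np.zeros(120), 3 * np.ones(6), np.zeros(130)]
edge_mask = viterbi_binary(row, log_emission, log_trans, log_prior)
```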




Read also

Ground-penetrating radar on planes and satellites now makes it practical to collect 3D observations of the subsurface structure of the polar ice sheets, providing crucial data for understanding and tracking global climate change. But converting these noisy readings into useful observations is generally done by hand, which is impractical at a continental scale. In this paper, we propose a computer vision-based technique for extracting 3D ice-bottom surfaces by viewing the task as an inference problem on a probabilistic graphical model. We first generate a seed surface subject to a set of constraints, and then incorporate additional sources of evidence to refine it via discrete energy minimization. We evaluate the performance of the tracking algorithm on 7 topographic sequences (each with over 3000 radar images) collected from the Canadian Arctic Archipelago with respect to human-labeled ground truth.
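As a rough illustration of this kind of discrete energy minimization (not the paper's graphical model), the sketch below tracks a single boundary through a cost image by dynamic programming over columns, trading a per-pixel data cost against a smoothness penalty between neighbouring columns; the cost image and the smoothness weight are illustrative assumptions.

```python
# Minimal sketch of boundary tracking by discrete energy minimization:
# choose one row per column so that unary cost plus a quadratic smoothness
# penalty between neighbouring columns is minimal.
import numpy as np

def track_boundary(cost, smooth=1.0):
    """Dynamic programming over columns; cost has shape (rows, cols)."""
    rows, cols = cost.shape
    r = np.arange(rows)
    pair = smooth * (r[:, None] - r[None, :]) ** 2   # pairwise smoothness term
    acc = cost[:, 0].copy()
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):
        total = acc[None, :] + pair                  # total[i, j]: arrive at row i from row j
        back[:, c] = np.argmin(total, axis=1)
        acc = total[np.arange(rows), back[:, c]] + cost[:, c]
    # Backtrack the minimum-energy surface.
    surface = np.zeros(cols, dtype=int)
    surface[-1] = np.argmin(acc)
    for c in range(cols - 1, 0, -1):
        surface[c - 1] = back[surface[c], c]
    return surface

cost = np.random.rand(64, 128)        # stand-in for negative ice-bottom evidence
surface = track_boundary(cost, smooth=0.05)
```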
Digital Surface Model generation from satellite imagery is a difficult task that has been largely overlooked by the deep learning community. Stereo reconstruction techniques developed for terrestrial systems, including self-driving cars, do not translate well to satellite imagery, where image pairs vary considerably. In this work we present a neural network tailored for Digital Surface Model generation, a ground-truthing and training scheme which maximizes available hardware, and a comparison to existing methods. The resulting models are smooth, preserve boundaries, and enable further processing. This represents one of the first attempts at leveraging deep learning in this domain.
In this paper we propose two efficient techniques which allow one to compute the price of American basket options. In particular, we consider a basket of assets that follow a multi-dimensional Black-Scholes dynamics. The proposed techniques, called GPR Tree (GPR-Tree) and GPR Exact Integration (GPR-EI), are both based on Machine Learning, exploited together with binomial trees or with a closed formula for integration. Moreover, these two methods solve the backward dynamic programming problem considering a Bermudan approximation of the American option. On the exercise dates, the value of the option is first computed as the maximum between the exercise value and the continuation value and then approximated by means of Gaussian Process Regression. The two methods mainly differ in the approach used to compute the continuation value: a single step of binomial tree or integration according to the probability density of the process. Numerical results show that these two methods are accurate and reliable in handling American options on very large baskets of assets. Moreover, we also consider the rough Bergomi model, which provides stochastic volatility with memory. Although this model is only two-dimensional, the whole history of the process impacts the price, and handling all this information is far from trivial. To this aim, we present how to adapt the GPR-Tree and GPR-EI methods, and we focus on pricing American options in this non-Markovian framework.
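A rough sketch in the spirit of this GPR-based backward induction (not the GPR-Tree or GPR-EI code): simulate Black-Scholes paths for a small basket, then step backwards through the exercise dates, regressing continuation values on asset levels with Gaussian Process Regression. The payoff, model parameters and kernel choice are illustrative assumptions.

```python
# Minimal sketch of Bermudan pricing with a GPR-approximated continuation value.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
n_paths, n_steps, n_assets = 400, 10, 2
s0, r, sigma, T, K = 100.0, 0.03, 0.2, 1.0, 100.0
dt = T / n_steps

# Geometric Brownian motion paths for each asset (independent for simplicity).
z = rng.standard_normal((n_paths, n_steps, n_assets))
log_s = np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
paths = s0 * np.exp(log_s)                                   # (paths, steps, assets)

payoff = lambda s: np.maximum(K - s.mean(axis=-1), 0.0)      # put on the basket mean

value = payoff(paths[:, -1, :])                              # option value at maturity
for t in range(n_steps - 2, -1, -1):
    x, y = paths[:, t, :], value * np.exp(-r * dt)           # discount one step back
    gpr = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(1e-2),
                                   normalize_y=True).fit(x, y)
    continuation = gpr.predict(x)
    exercise = payoff(paths[:, t, :])
    value = np.where(exercise > continuation, exercise, y)   # exercise or continue

price = np.exp(-r * dt) * value.mean()                       # discount the first step
print(f"GPR Bermudan estimate: {price:.3f}")
```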
We present a joint copula-based model for insurance claims and sizes. It uses bivariate copulae to accommodate the dependence between these quantities. We derive the general distribution of the policy loss without the restrictive assumption of independence. We illustrate that this distribution tends to be skewed and multi-modal, and that an independence assumption can lead to substantial bias in the estimation of the policy loss. Further, we extend our framework to regression models by combining marginal generalized linear models with a copula. We show that this approach leads to a flexible class of models, and that the parameters can be estimated efficiently using maximum likelihood. We propose a test procedure for the selection of the optimal copula family. The usefulness of our approach is illustrated in a simulation study and in an analysis of car insurance policies.
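As a minimal illustration of a copula-based joint model (not the paper's specification), the sketch below couples a Poisson claim-count margin with a Gamma claim-size margin through a Gaussian copula and simulates the resulting policy loss; all parameter values, and the simplification of multiplying count by a single dependent size, are illustrative assumptions.

```python
# Minimal sketch: Gaussian copula coupling claim counts and claim sizes,
# then a simulated policy-loss distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, rho = 100_000, 0.6

# Correlated uniforms from a bivariate Gaussian copula.
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = stats.norm.cdf(z)

counts = stats.poisson.ppf(u[:, 0], mu=1.2).astype(int)     # claim-count margin
size = stats.gamma.ppf(u[:, 1], a=2.0, scale=500.0)         # claim-size margin

# Policy loss: count times (dependent) size; zero when no claim occurs.
loss = counts * size
print("P(loss = 0):", (loss == 0).mean())
print("mean loss:", loss.mean(), " 99% quantile:", np.quantile(loss, 0.99))
```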
As a potential window on transitions out of the ergodic, many-body-delocalized phase, we study the dephasing of weakly disordered, quasi-one-dimensional fermion systems due to a diffusive, non-Markovian noise bath. Such a bath is self-generated by the fermions, via inelastic scattering mediated by short-ranged interactions. We calculate the dephasing of weak localization perturbatively through second order in the bath coupling. However, the expansion breaks down at long times, and is not stabilized by including a mean-field decay rate, signaling a failure of the self-consistent Born approximation. We also consider a many-channel quantum wire where short-ranged, spin-exchange interactions coexist with screened Coulomb interactions. We calculate the dephasing rate, treating the short-ranged interactions perturbatively and the Coulomb interaction exactly. The latter provides a physical infrared regularization that stabilizes perturbation theory at long times, giving the first controlled calculation of quasi-1D dephasing due to diffusive noise. At first order in the diffusive bath coupling, we find an enhancement of the dephasing rate, but at second order we find a rephasing contribution. Our results differ qualitatively from those obtained via self-consistent calculations and are relevant in two different contexts. First, in the search for precursors to many-body localization in the ergodic phase. Second, our results provide a mechanism for the enhancement of dephasing at low temperatures in spin SU(2)-symmetric quantum wires, beyond the Altshuler-Aronov-Khmelnitsky result. The enhancement is possible due to the amplification of the triplet-channel interaction strength, and provides an additional mechanism that could contribute to the experimentally observed low-temperature saturation of the dephasing time.