
Deconvolution under Poisson noise using exact data fidelity and synthesis or analysis sparsity priors

Submitted by Francois-Xavier Dupe
Publication date: 2011
Research field: Mathematical Statistics
Paper language: English





In this paper, we propose a Bayesian MAP estimator for solving deconvolution problems when the observations are corrupted by Poisson noise. Towards this goal, a proper data fidelity term (log-likelihood) is introduced to reflect the Poisson statistics of the noise. As a prior, the images to restore are assumed to be positive and sparsely represented in a dictionary of waveforms such as wavelets or curvelets. Both analysis- and synthesis-type sparsity priors are considered. Piecing together the data fidelity and the prior terms, the deconvolution problem boils down to the minimization of a non-smooth convex functional (one for each prior). We establish the well-posedness of each optimization problem, characterize the corresponding minimizers, and solve them by means of proximal splitting algorithms originating from the realm of non-smooth convex optimization theory. Experimental results are reported to demonstrate the potential applicability of the proposed algorithms to astronomical imaging datasets.
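To make the variational formulation concrete, here is a sketch of the two MAP objectives in our own notation (it need not match the paper's exact symbols): y is the Poisson-count observation, H the convolution operator, \Phi the dictionary synthesis operator, \lambda > 0 a regularization weight, and \imath_{+} the indicator function enforcing positivity.

Synthesis prior (estimate coefficients \alpha, image x = \Phi\alpha):
\min_{\alpha} \; \sum_i \big[ (H\Phi\alpha)_i - y_i \log (H\Phi\alpha)_i \big] \;+\; \lambda \|\alpha\|_1 \;+\; \imath_{+}(\Phi\alpha)

Analysis prior (estimate the image x directly):
\min_{x} \; \sum_i \big[ (Hx)_i - y_i \log (Hx)_i \big] \;+\; \lambda \|\Phi^{\mathrm{T}} x\|_1 \;+\; \imath_{+}(x)

The first term is the Poisson negative log-likelihood up to an additive constant; both objectives are convex but non-smooth, which is why proximal splitting methods are needed.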




Read also

Group therapy is a central treatment modality for behavioral health disorders such as alcohol and other drug use (AOD) and depression. Group therapy is often delivered under a rolling (or open) admissions policy, where new clients are continuously enrolled into a group as space permits. Rolling admissions policies result in a complex correlation structure among client outcomes. Despite the ubiquity of rolling admissions in practice, little guidance on the analysis of such data is available. We discuss the limitations of previously proposed approaches in the context of a study that delivered group cognitive behavioral therapy for depression to clients in residential substance abuse treatment. We improve upon previous rolling group analytic approaches by fully modeling the interrelatedness of client depressive symptom scores using a hierarchical Bayesian model that assumes a conditionally autoregressive prior for session-level random effects. We demonstrate improved performance using our method for estimating the variance of model parameters and the enhanced ability to learn about the complex correlation structure among participants in rolling therapy groups. Our approach broadly applies to any group therapy setting where groups have changing client composition. It will lead to more efficient analyses of client-level data and improve the group therapy research community's ability to understand how the dynamics of rolling groups lead to client outcomes.
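For orientation only, one standard form of a proper conditionally autoregressive prior of the kind mentioned above is sketched below in our own notation; the paper's exact specification may differ. Let b_s be the random effect of session s, W a symmetric session-adjacency matrix (sessions sharing clients or close in time are neighbours), w_{s+} its row sums, D = \mathrm{diag}(w_{s+}), \rho a dependence parameter, and \tau^2 a scale:

b_s \mid b_{-s} \sim \mathrm{N}\!\Big( \rho \sum_{s'} \tfrac{w_{ss'}}{w_{s+}} \, b_{s'},\; \tfrac{\tau^2}{w_{s+}} \Big),
\qquad\text{equivalently}\qquad
b \sim \mathrm{N}\!\big(0,\; \tau^2 (D - \rho W)^{-1}\big).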
In this paper, we propose two algorithms for solving linear inverse problems when the observations are corrupted by Poisson noise. A proper data fidelity term (log-likelihood) is introduced to reflect the Poisson statistics of the noise. On the other hand, as a prior, the images to restore are assumed to be positive and sparsely represented in a dictionary of waveforms. Piecing together the data fidelity and the prior terms, the solution to the inverse problem is cast as the minimization of a non-smooth convex functional. We establish the well-posedness of the optimization problem, characterize the corresponding minimizers, and solve it by means of primal and primal-dual proximal splitting algorithms originating from the field of non-smooth convex optimization theory. Experimental results on deconvolution and comparison to prior methods are also reported.
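As an illustration of the primal-dual route (not the authors' exact algorithm), the following minimal NumPy sketch runs a Chambolle-Pock-type iteration on an objective of the form sum_i[(Hx)_i - y_i log(Hx)_i] + lambda*||x||_1 with a positivity constraint, the blur H being supplied as an explicit matrix for simplicity; all function names are ours.

import numpy as np

def prox_poisson(v, y, t):
    # prox_{t*f}(v) for f(u) = u - y*log(u) (Poisson anti-log-likelihood):
    # the positive root of u**2 + (t - v)*u - t*y = 0, componentwise.
    return 0.5 * ((v - t) + np.sqrt((v - t) ** 2 + 4.0 * t * y))

def prox_l1_pos(v, t):
    # prox of t*||.||_1 plus a positivity constraint: positive soft-thresholding.
    return np.maximum(v - t, 0.0)

def chambolle_pock_poisson(H, y, lam, n_iter=500):
    # Minimize sum((Hx) - y*log(Hx)) + lam*||x||_1 subject to x >= 0
    # with a Chambolle-Pock primal-dual iteration.
    L = np.linalg.norm(H, 2)          # spectral norm of H
    tau = sigma = 0.99 / L            # step sizes satisfying tau*sigma*L**2 < 1
    x = np.ones(H.shape[1])
    x_bar = x.copy()
    u = np.zeros(H.shape[0])          # dual variable
    for _ in range(n_iter):
        # dual step: prox of the conjugate data term via Moreau's identity
        v = u + sigma * (H @ x_bar)
        u = v - sigma * prox_poisson(v / sigma, y, 1.0 / sigma)
        # primal step: positive soft-thresholding
        x_new = prox_l1_pos(x - tau * (H.T @ u), tau * lam)
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

For a synthesis-type prior one would replace H by the composed operator H @ Phi, iterate on the coefficients, and return Phi @ alpha as the restored image.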
Dohyun Chun, Donggyu Kim (2021)
Recently, to account for low-frequency market dynamics, several volatility models, employing high-frequency financial data, have been developed. However, in financial markets, we often observe that financial volatility processes depend on economic states, so they have a state heterogeneous structure. In this paper, to study state heterogeneous market dynamics based on high-frequency data, we introduce a novel volatility model based on a continuous Ito diffusion process whose intraday instantaneous volatility process evolves depending on the exogenous state variable, as well as its integrated volatility. We call it the state heterogeneous GARCH-Ito (SG-Ito) model. We suggest a quasi-likelihood estimation procedure with the realized volatility proxy and establish its asymptotic behaviors. Moreover, to test the low-frequency state heterogeneity, we develop a Wald test-type hypothesis testing procedure. The results of empirical studies suggest the existence of leverage, investor attention, market illiquidity, stock market comovement, and post-holiday effect in S&P 500 index volatility.
Suppose we have a Bayesian model which combines evidence from several different sources. We want to know which model parameters most affect the estimate or decision from the model, or which of the parameter uncertainties drive the decision uncertainty. Furthermore, we want to prioritise what further data should be collected. These questions can be addressed by Value of Information (VoI) analysis, in which we estimate expected reductions in loss from learning specific parameters or collecting data of a given design. We describe the theory and practice of VoI for Bayesian evidence synthesis, using and extending ideas from health economics, computer modelling and Bayesian design. The methods are general to a range of decision problems including point estimation and choices between discrete actions. We apply them to a model for estimating prevalence of HIV infection, combining indirect information from several surveys, registers and expert beliefs. This analysis shows which parameters contribute most of the uncertainty about each prevalence estimate, and provides the expected improvements in precision from collecting specific amounts of additional data.
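For reference, the standard quantities behind "expected reductions in loss" read as follows in our notation (the paper may parameterize them differently). With decision d, unknown parameters \theta, loss L(d,\theta), and a focal parameter \phi:

\mathrm{EVPI} = \min_d \mathbb{E}_{\theta}[L(d,\theta)] - \mathbb{E}_{\theta}\big[\min_d L(d,\theta)\big],
\qquad
\mathrm{EVPPI}(\phi) = \min_d \mathbb{E}_{\theta}[L(d,\theta)] - \mathbb{E}_{\phi}\big[\min_d \mathbb{E}_{\theta\mid\phi} L(d,\theta)\big].

The expected value of sample information for a proposed study design is obtained analogously, conditioning on the hypothetical future data instead of \phi.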
Superconducting microwave resonators are reliable circuits widely used for detection and as test devices for material research. A reliable determination of their external and internal quality factors is crucial for many modern applications, which either require fast measurements or operate in the single photon regime with small signal-to-noise ratios. Here, we use the circle fit technique with diameter correction and provide a step-by-step guide for implementing an algorithm for robust fitting and calibration of complex resonator scattering data in the presence of noise. The speedup and robustness of the analysis are achieved by employing an algebraic rather than an iterative fit technique for the resonance circle.
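The "algebraic rather than iterative" step refers to fitting a circle to the complex scattering data by linear least squares. A minimal Python sketch of such a Kasa-type algebraic circle fit (a simplification of the full circle-fit-with-diameter-correction procedure, with a hypothetical helper name) is:

import numpy as np

def algebraic_circle_fit(z):
    # Kasa-type algebraic circle fit to complex data z = x + i*y:
    # solve x**2 + y**2 + A*x + B*y + C ~= 0 by linear least squares,
    # then recover the circle centre (xc, yc) and radius r.
    x, y = z.real, z.imag
    M = np.column_stack([x, y, np.ones_like(x)])
    A, B, C = np.linalg.lstsq(M, -(x**2 + y**2), rcond=None)[0]
    xc, yc = -A / 2.0, -B / 2.0
    r = np.sqrt(xc**2 + yc**2 - C)
    return xc, yc, r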