We propose a general framework to recover underlying images from noisy phaseless diffraction measurements, based on the alternating direction method of multipliers (ADMM) and the plug-and-play technique. The algorithm consists of three-step iterations: (i) solving a generalized least squares problem with the maximum a posteriori (MAP) estimate of the noise, (ii) Gaussian denoising, and (iii) updating the multipliers. The denoising step utilizes higher-order filters such as total generalized variation, as well as nonlocal sparsity based filters including the nonlocal means (NLM) and block-matching and 3D filtering (BM3D) filters. The multipliers are updated by a symmetric technique to increase the convergence speed. The proposed method has low computational complexity and comes with a theoretical convergence guarantee; it recovers images with sharp edges, clean backgrounds, and repetitive features from noisy phaseless measurements. Numerous numerical experiments on Fourier phase retrieval (PR), as well as coded diffraction and ptychographic patterns, are performed to verify the convergence and efficiency, showing that the proposed method outperforms state-of-the-art PR algorithms both without any regularization and with total variation regularization.
Phaseless diffraction measurements recorded by a CCD detector are often affected by Poisson noise. In this paper, we propose a dictionary learning model that employs patch-based sparsity to denoise Poisson phaseless measurements. The model consists
For years, there has been interest in approximation methods for solving dynamic programming problems, because of the inherent complexity in computing optimal solutions characterized by Bellman's principle of optimality. A wide range of approximate dyn
One of the challenges in analyzing a learning algorithm is the circular entanglement between the objective value and the stochastic noise. This is also known as the chicken-and-egg phenomenon. Traditionally, people tackle this issue with the special
This paper studies a strategy for data-driven algorithm design for large-scale combinatorial optimization problems that can leverage existing state-of-the-art solvers in general-purpose ways. The goal is to arrive at new approaches that can reliably
Benchmarks in the utility function have various interpretations, including performance guarantees and risk constraints in fund contracts, and reference levels in cumulative prospect theory. In most of the literature, benchmarks are a deterministic constant o