This paper describes a new efficient conjugate subgradient algorithm that minimizes a convex function containing a least squares fidelity term and an absolute value regularization term. The method is successfully applied to the inversion of ill-conditioned linear problems, in particular to computed tomography with the dictionary learning method. A comparison with other state-of-the-art methods shows a significant reduction in the number of iterations, which makes this algorithm appealing for practical use.
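The objective named in this abstract, a least squares fidelity term plus an absolute value (l1) regularization term, can be written as f(x) = 0.5*||Ax - b||^2 + lam*||x||_1. The sketch below applies a plain subgradient method with a diminishing step size to that objective; it only illustrates the problem being solved, not the paper's conjugate subgradient algorithm, and the function and parameter names are assumptions.

```python
import numpy as np

def subgradient_l1_ls(A, b, lam, step0=1.0, iters=200):
    """Plain subgradient descent on f(x) = 0.5*||Ax - b||^2 + lam*||x||_1.

    Illustrative sketch only -- not the paper's conjugate subgradient
    algorithm; names and the step-size rule are assumptions.
    """
    x = np.zeros(A.shape[1])
    for k in range(1, iters + 1):
        grad_ls = A.T @ (A @ x - b)       # gradient of the least squares fidelity term
        sub_l1 = lam * np.sign(x)         # one valid subgradient of lam*||x||_1
        x = x - (step0 / np.sqrt(k)) * (grad_ls + sub_l1)  # diminishing step
    return x

# Toy usage on a small underdetermined system
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = subgradient_l1_ls(A, b, lam=0.1)
```

The 1/sqrt(k) step size is the textbook choice that makes the plain subgradient method converge; the abstract's point is that the proposed conjugate subgradient scheme reduces the iteration count relative to such generic schemes and other existing methods.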
Bayesian inference is a widely used and powerful analytical technique in fields such as astronomy and particle physics but has historically been underutilized in some other disciplines including semiconductor devices. In this work, we introduce Bayes
A simple computer-based algorithm has been developed to identify pre-modern coins minted from the same dies; it mainly targets coins struck from hand-made dies and is designed to be applicable to images taken from auction websites or catalogs. Though the metho
We consider minimization of functions that are compositions of convex or prox-regular functions (possibly extended-valued) with smooth vector functions. A wide variety of important optimization problems fall into this framework. We describe an algori
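Written out, the composite framework described here takes the form below; the robust-fitting instance is a standard illustration of a problem that falls into this framework, not an example taken from the (truncated) abstract.

```latex
% Composite minimization: h convex or prox-regular (possibly extended-valued), c smooth.
\[
  \min_{x \in \mathbb{R}^n} \; h\bigl(c(x)\bigr),
  \qquad
  h : \mathbb{R}^m \to \mathbb{R} \cup \{+\infty\},
  \quad
  c : \mathbb{R}^n \to \mathbb{R}^m .
\]
% Example instance: h(y) = ||y||_1 and c(x) = F(x) - d, with F a smooth model and d data,
% gives robust nonlinear data fitting:
\[
  \min_{x} \; \bigl\| F(x) - d \bigr\|_1 .
\]
```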
A stochastic incremental subgradient algorithm for the minimization of a sum of convex functions is introduced. The method sequentially uses partial subgradient information and the sequence of partial subgradients is determined by a general Markov ch
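The visible part of this abstract already fixes the structure of the iteration: minimize a sum of convex functions by stepping along the subgradient of one component at a time, with the component index driven by a Markov chain. The sketch below implements that structure with a diminishing step size; it illustrates the idea, not the paper's exact algorithm, and all names, the step-size rule, and the uniform chain in the usage example are assumptions.

```python
import numpy as np

def markov_incremental_subgradient(subgrads, P, x0, step0=1.0, iters=500, i0=0, seed=0):
    """Incremental subgradient sketch for min_x sum_i f_i(x).

    subgrads[i](x) returns a subgradient of f_i at x; the component index
    follows a Markov chain with transition matrix P.  Illustrative only.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    i = i0
    for k in range(1, iters + 1):
        g = subgrads[i](x)                       # partial subgradient of the current component
        x = x - (step0 / np.sqrt(k)) * g         # diminishing step size
        i = rng.choice(len(subgrads), p=P[i])    # next component drawn from the Markov chain
    return x

# Toy usage: minimize |x| + |x - 1| + |x - 3| (minimized at the median, x = 1)
a = [0.0, 1.0, 3.0]
subgrads = [lambda x, ai=ai: np.sign(x - ai) for ai in a]
P = np.full((3, 3), 1.0 / 3.0)    # uniform chain, chosen only for the demo
x_star = markov_incremental_subgradient(subgrads, P, x0=[10.0])
```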
Deep learning is a rapidly evolving technology with the potential to significantly improve the physics reach of collider experiments. In this study, we developed a novel vertex-finding algorithm for future lepton colliders such as the International Linea