Motivated by the recent work [He-Yuan, Balanced Augmented Lagrangian Method for Convex Programming, arXiv: 2108.08554v1, (2021)], a novel Augmented Lagrangian Method (ALM) is proposed for solving a family of convex optimization problems subject to equality or inequality constraints. The new method is then extended to multi-block separable convex optimization problems, and two related primal-dual hybrid gradient algorithms are also discussed. Preliminary and new convergence results are established with the aid of variational analysis, addressing both the saddle point of the problem and the first-order optimality conditions of the involved subproblems.
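For context, a minimal sketch of the classical augmented Lagrangian setup for an equality-constrained convex program is recalled below; this is generic background rather than the balanced variant of He-Yuan, and the symbols $f$, $A$, $b$, $\beta$, $\lambda$ are generic notation, not necessarily those of the cited paper.
% Classical ALM recap (generic notation; not the balanced ALM of He-Yuan).
% Problem: minimize f(x) subject to Ax = b, with f convex.
\begin{align*}
  \mathcal{L}_\beta(x,\lambda) &= f(x) - \lambda^{\top}(Ax - b) + \tfrac{\beta}{2}\,\|Ax - b\|^{2}, \\
  x^{k+1} &= \operatorname*{arg\,min}_{x}\; \mathcal{L}_\beta\bigl(x,\lambda^{k}\bigr), \\
  \lambda^{k+1} &= \lambda^{k} - \beta\,\bigl(Ax^{k+1} - b\bigr).
\end{align*}
A saddle point $(x^{*},\lambda^{*})$ of the ordinary Lagrangian certifies optimality, which is the viewpoint underlying the variational analysis mentioned above.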
This paper is concerned with a novel deep learning method for variational problems with essential boundary conditions. To this end, we first reformulate the original problem into a minimax problem corresponding to a feasible augmented Lagrangian, whi
The magnetohydrodynamics (MHD) equations are generally known to be difficult to solve numerically, due to their highly nonlinear structure and the strong coupling between the electromagnetic and hydrodynamic variables, especially for high Reynolds an
In the current work, the less familiar shifted Lucas polynomials are introduced. We construct a computational wavelet technique for the solution of initial/boundary value second-order differential equations. For this numerical scheme, we have developed wei
Subspace recycling iterative methods and other subspace augmentation schemes are a successful extension to Krylov subspace methods in which a Krylov subspace is augmented with a fixed subspace spanned by vectors deemed to be helpful in accelerating c
We propose an accurate algorithm for a novel sum-of-exponentials (SOE) approximation of kernel functions, and develop a fast algorithm for convolution quadrature based on the SOE, which allows an order $N$ calculation for $N$ time steps of approximat