Let $z=(x,y)$ be coordinates for the product space $\mathbb{R}^{m_1}\times \mathbb{R}^{m_2}$. Let $f:\mathbb{R}^{m_1}\times \mathbb{R}^{m_2}\rightarrow \mathbb{R}$ be a $C^1$ function, and $\nabla f=(\partial _xf,\partial _yf)$ its gradient. Fix $0<\alpha <1$. For a point $(x,y)\in \mathbb{R}^{m_1}\times \mathbb{R}^{m_2}$, a number $\delta >0$ satisfies Armijo's condition at $(x,y)$ if the following inequality holds: \begin{eqnarray*} f(x-\delta \partial _xf,y-\delta \partial _yf)-f(x,y)\leq -\alpha \delta (||\partial _xf||^2+||\partial _yf||^2). \end{eqnarray*} In a previous paper, we proposed the following {\bf coordinate-wise} Armijo condition. Fix again $0<\alpha <1$. A pair of positive numbers $\delta _1,\delta _2>0$ satisfies the coordinate-wise variant of Armijo's condition at $(x,y)$ if the following inequality holds: \begin{eqnarray*} f(x-\delta _1\partial _xf(x,y), y-\delta _2\partial _y f(x,y))-f(x,y)\leq -\alpha (\delta _1||\partial _xf(x,y)||^2+\delta _2||\partial _yf(x,y)||^2). \end{eqnarray*} Previously we applied this condition to functions of the form $f(x,y)=f(x)+g(y)$, and proved various convergence results for them. For a general function, it is crucial for practical computation to have a systematic algorithm for obtaining $\delta _1$ and $\delta _2$ satisfying the coordinate-wise version of Armijo's condition, much like Backtracking for the usual Armijo's condition. In this paper we propose such an algorithm and prove corresponding convergence results. We then analyse and present experimental results for some functions such as $f(x,y)=a|x|+y$ (given by Asl and Overton in connection with Wolfe's method), $f(x,y)=x^3 \sin (1/x) + y^3 \sin(1/y)$ and Rosenbrock's function.
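The abstract does not spell out the paper's actual algorithm for producing $\delta_1$ and $\delta_2$, so the following is only a naive illustrative sketch of the coordinate-wise condition itself: starting from trial step sizes, it shrinks both by a factor $\beta$ (as in classical Backtracking) until the coordinate-wise Armijo inequality holds. All names and parameter choices here (`coordinate_wise_backtracking`, `alpha`, `beta`, `delta0_1`, `delta0_2`) are hypothetical, not the paper's.

```python
import numpy as np

def coordinate_wise_backtracking(f, grad_x, grad_y, x, y,
                                 alpha=0.5, beta=0.5,
                                 delta0_1=1.0, delta0_2=1.0,
                                 max_iter=50):
    """Illustrative sketch (not the paper's algorithm): shrink two
    trial step sizes until the coordinate-wise Armijo condition
      f(x - d1*gx, y - d2*gy) - f(x, y)
        <= -alpha * (d1*||gx||^2 + d2*||gy||^2)
    is satisfied."""
    gx, gy = grad_x(x, y), grad_y(x, y)
    f0 = f(x, y)
    d1, d2 = delta0_1, delta0_2
    for _ in range(max_iter):
        lhs = f(x - d1 * gx, y - d2 * gy) - f0
        rhs = -alpha * (d1 * np.dot(gx, gx) + d2 * np.dot(gy, gy))
        if lhs <= rhs:
            break
        # Naive rule: shrink both step sizes by the same factor,
        # preserving the ratio delta0_1 / delta0_2.
        d1 *= beta
        d2 *= beta
    return d1, d2
```

For example, on the badly scaled quadratic $f(x,y)=x^2+10y^2$ the routine returns a pair $(\delta_1,\delta_2)$ for which the coordinate-wise inequality can be checked directly.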
Fix a constant $0<\alpha <1$. For a $C^1$ function $f:\mathbb{R}^k\rightarrow \mathbb{R}$, a point $x$ and a positive number $\delta >0$, we say that Armijo's condition is satisfied if $f(x-\delta \nabla f(x))-f(x)\leq -\alpha \delta ||\nabla f(x)||^2$. It is a
The method of block coordinate gradient descent (BCD) has been a powerful method for large-scale optimization. This paper considers the BCD method that successively updates a series of blocks selected according to a Markov chain. This kind of block s
In this work, we analyze the global convergence property of coordinate gradient descent with random choice of coordinates and stepsizes for non-convex optimization problems. Under generic assumptions, we prove that the algorithm iterate will almost s
Difference-of-Convex (DC) minimization, referring to the problem of minimizing the difference of two convex functions, has found rich applications in statistical learning and has been studied extensively for decades. However, existing methods are primari