In this paper, we present some new perspectives on the classical gradient method (GM) and recall the previously proposed fractional order gradient method (FOGM). It is proven that the proposed FOGM possesses a super convergence capability and a faster convergence rate around the extreme point than the conventional GM. The asymptotic convergence properties of the conventional GM and the FOGM are also discussed. To achieve both a super convergence capability and an even faster convergence rate, a novel switching FOGM is proposed. Moreover, we extend the obtained conclusions to a more general case by introducing the concepts of p-order Lipschitz continuous gradient and p-order strong convexity. Numerous simulation examples are provided to validate the effectiveness of the proposed methods.
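As a rough illustration of the methods this abstract compares, the sketch below contrasts the classical gradient update with one commonly cited simplified fractional-order update (a Caputo-based variant that scales the gradient by a power of the last step size; the exact update rules and the switching logic differ across the papers summarized here, so the formula, step size `mu`, and order `alpha` are assumptions for illustration only):

```python
import math

def grad_descent(f_grad, x0, mu=0.1, steps=50):
    # Classical gradient method: x_{k+1} = x_k - mu * f'(x_k)
    x = x0
    for _ in range(steps):
        x = x - mu * f_grad(x)
    return x

def frac_grad_descent(f_grad, x0, alpha=0.9, mu=0.1, steps=50, eps=1e-8):
    # Simplified fractional-order update (one common Caputo-style variant):
    #   x_{k+1} = x_k - mu * f'(x_k) * |x_k - x_{k-1}|^{1-alpha} / Gamma(2-alpha)
    # Using the distance to the previous iterate (rather than to a fixed
    # initial point) is one of the fixes that lets the iteration approach
    # the true extreme point; eps avoids a zero-length step stalling.
    x_prev, x = x0, x0 - mu * f_grad(x0)  # bootstrap with one classical step
    c = 1.0 / math.gamma(2.0 - alpha)
    for _ in range(steps - 1):
        step = mu * f_grad(x) * c * (abs(x - x_prev) + eps) ** (1.0 - alpha)
        x_prev, x = x, x - step
    return x
```

For a quadratic such as f(x) = (x - 3)^2, with f'(x) = 2(x - 3), both iterations approach the minimizer x* = 3; as alpha → 1 the fractional update recovers the classical one, since the power factor and 1/Γ(2-α) both tend to 1.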
This paper proposes a fractional order gradient method for the backward propagation of convolutional neural networks. To overcome the problem that the fractional order gradient method cannot converge to the real extreme point, a simplified fractional order g
Safety and automatic control are extremely important when operating manipulators. For large engineering manipulators, the main challenge is to accurately recognize the posture of all arm segments. In classical sensing methods, the accuracy of an incl
In this paper, we propose a new approach, based on the so-called modulating functions to estimate the average velocity, the dispersion coefficient and the differentiation order in a space fractional advection dispersion equation. First, the average v
This paper focuses on the convergence problem of the emerging fractional order gradient descent method, and proposes three solutions to overcome the problem. In fact, the general fractional gradient method cannot converge to the real extreme point of
Nonuniformities in the imaging characteristics of modern image sensors are a primary factor in the push to develop a pixel-level generalization of the photon transfer characterization method. In this paper, we seek to develop a body of theoretical re