A Sinc-Nyström method for Volterra integro-differential equations was developed by Zarebnia in 2010. The method is quite efficient in the sense that exponential convergence can be obtained even if the given problem has an endpoint singularity. However, its exponential convergence has not been proved theoretically. In addition, implementing the method requires knowledge of the regularity of the solution, although the solution is unknown in practice. This paper reinforces the method by presenting two theoretical results: 1) the regularity of the solution is analyzed, and 2) the convergence rate of the method is rigorously analyzed. Moreover, this paper improves the method so that a much higher convergence rate can be attained, and theoretical results similar to those listed above are provided. Numerical comparisons are also presented.
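The abstract does not reproduce the scheme itself, so the following is only a minimal sketch of its basic building block: single-exponential (tanh-type) Sinc quadrature on (0, 1), applied to an integrand with an endpoint singularity. The step-size rule h = pi/sqrt(N) and the test integrand are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def se_sinc_quadrature(f, N, h):
    """Approximate int_0^1 f(t) dt with single-exponential (tanh-type) Sinc
    quadrature: nodes t_k = 1/(1 + exp(-k h)), weights h * t_k * (1 - t_k)."""
    k = np.arange(-N, N + 1)
    t = 1.0 / (1.0 + np.exp(-k * h))   # SE-transformed Sinc points in (0, 1)
    w = h * t * (1.0 - t)              # h * phi'(k h)
    return np.sum(w * f(t))

# Integrand with an endpoint singularity at t = 0; the exact integral is 2.
f = lambda t: 1.0 / np.sqrt(t)

for N in (8, 16, 32, 64):
    h = np.pi / np.sqrt(N)             # a typical step-size rule (assumption)
    err = abs(se_sinc_quadrature(f, N, h) - 2.0)
    print(f"N = {N:3d}   error = {err:.2e}")
```

With a rule of this type the error typically decays like exp(-c*sqrt(N)), which is the kind of exponential convergence referred to above.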
The aim of the present paper is to introduce a new numerical method for solving nonlinear Volterra integro-differential equations involving delay. We apply the trapezium rule to the integral involved in the equation. Further, Daftardar-Gejji and Jafari m
In this paper we introduce a numerical method for solving nonlinear Volterra integro-differential equations. In the first step, we apply the implicit trapezium rule to discretize the integral in the given equation. Further, the Daftardar-Gejji and Jafari tec
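Both abstracts above are cut off before the iteration is described, so the sketch below only illustrates their shared first step: the integral term of a nonlinear Volterra integro-differential equation is discretized with the implicit trapezium rule, and the resulting implicit relation at each time step is resolved by plain fixed-point iteration (standing in for the Daftardar-Gejji and Jafari technique, whose details are not given here). The test problem, with exact solution y(t) = exp(t), is an assumption made for illustration.

```python
import numpy as np

# Illustrative test problem (assumption, not from the papers above):
#   y'(t) = g(t) + int_0^t y(s)^2 ds,   y(0) = 1,   exact solution y(t) = exp(t),
#   with g(t) = exp(t) - (exp(2 t) - 1) / 2.
g = lambda t: np.exp(t) - (np.exp(2.0 * t) - 1.0) / 2.0

def trap(vals, h):
    """Trapezium rule over equally spaced grid values with spacing h."""
    if len(vals) < 2:
        return 0.0
    return h * (0.5 * vals[0] + vals[1:-1].sum() + 0.5 * vals[-1])

def solve_vide(T=1.0, n=100, fp_iters=10):
    """Implicit trapezium discretization of the VIDE; the nonlinear equation
    for y_{i+1} at each step is resolved by plain fixed-point iteration."""
    h = T / n
    t = np.linspace(0.0, T, n + 1)
    y = np.empty(n + 1)
    y[0] = 1.0
    for i in range(n):
        F_i = g(t[i]) + trap(y[: i + 1] ** 2, h)        # known right-hand side at t_i
        y_next = y[i]                                   # initial guess for y_{i+1}
        for _ in range(fp_iters):
            integ = trap(np.append(y[: i + 1], y_next) ** 2, h)
            F_next = g(t[i + 1]) + integ
            y_next = y[i] + 0.5 * h * (F_i + F_next)    # implicit trapezium step
        y[i + 1] = y_next
    return t, y

t, y = solve_vide()
print("max error vs exp(t):", np.abs(y - np.exp(t)).max())
```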
A Sinc-collocation method has been proposed by Stenger, and he also gave a theoretical analysis of the method in the case of a scalar equation. This paper extends the theoretical results to the case of a system of equations. Furthermore, this paper p
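As a reminder of the ingredient shared by Sinc-collocation schemes of this kind, the sketch below approximates a function on (0, 1) in the shifted Sinc basis S(k, h)(x) = sinc((x - kh)/h) composed with a single-exponential transformation; a collocation method expands the unknown solution in the same basis and enforces the equation at the Sinc points. The transformation, step size, and test function here are assumptions for illustration, not details from the paper.

```python
import numpy as np

def sinc_interpolant(f, N, h):
    """Approximate f on (0, 1) by Sinc (cardinal) interpolation combined with
    the single-exponential transformation t = 1/(1 + exp(-x))."""
    k = np.arange(-N, N + 1)
    t_k = 1.0 / (1.0 + np.exp(-k * h))     # Sinc points mapped into (0, 1)
    c_k = f(t_k)                           # cardinal-interpolation coefficients

    def approx(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        x = np.log(t / (1.0 - t))          # inverse of the SE transformation
        # np.sinc is the normalized sinc sin(pi u)/(pi u), i.e. exactly S(k, h).
        basis = np.sinc((x[:, None] - k * h) / h)
        return basis @ c_k

    return approx

f = lambda t: np.sqrt(t) * (1.0 - t)       # vanishes at both endpoints

for N in (8, 16, 32):
    h = np.pi / np.sqrt(N)                 # typical step-size rule (assumption)
    u = sinc_interpolant(f, N, h)
    t = np.linspace(0.001, 0.999, 500)
    print(f"N = {N:2d}   max interpolation error = {np.abs(u(t) - f(t)).max():.2e}")
```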
In this paper we propose and analyze a fractional Jacobi-collocation spectral method for second-kind Volterra integral equations (VIEs) with the weakly singular kernel $(x-s)^{-\mu}$, $0<\mu<1$. First we develop a family of fractional Jacobi polynomials,
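The abstract is truncated before the construction is given; the sketch below illustrates only the standard ingredient that Jacobi-collocation schemes for such kernels rely on: after the substitution s = x(1 + theta)/2, the weakly singular integral $\int_0^x (x-s)^{-\mu} u(s)\,ds$ becomes a Jacobi-weighted integral with weight $(1-\theta)^{-\mu}$, which Gauss-Jacobi quadrature handles without evaluating the singularity. The value $\mu = 0.5$ and the test functions are assumptions.

```python
import numpy as np
from scipy.special import roots_jacobi

def weakly_singular_integral(u, x, mu, n):
    """Approximate int_0^x (x - s)^(-mu) u(s) ds, 0 < mu < 1, by the substitution
    s = x (1 + theta) / 2 followed by n-point Gauss-Jacobi quadrature with
    weight (1 - theta)^(-mu) (1 + theta)^0."""
    theta, w = roots_jacobi(n, -mu, 0.0)
    s = x * (1.0 + theta) / 2.0
    return (x / 2.0) ** (1.0 - mu) * np.dot(w, u(s))

mu, x = 0.5, 0.8

# Closed-form check for u(s) = 1: the integral equals x^(1 - mu) / (1 - mu).
exact = x ** (1.0 - mu) / (1.0 - mu)
approx = weakly_singular_integral(lambda s: np.ones_like(s), x, mu, n=5)
print("u = 1   error:", abs(approx - exact))

# For a smooth u such as exp, a handful of nodes already gives a stable value.
for n in (4, 8, 16):
    print(f"u = exp, n = {n:2d}:", weakly_singular_integral(np.exp, x, mu, n))
```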
At present, deep-learning-based methods are being employed to resolve the computational challenges of high-dimensional partial differential equations (PDEs). However, the computation of high-order derivatives of neural networks is costly, and high ord