
Gram quadrature: numerical integration with Gram polynomials

Added by Irfan Muhammad
Publication date: 2021
Language: English





The numerical integration of an analytical function $f(x)$ using a finite set of equidistant points can be performed by quadrature formulas such as the Newton-Cotes formulas. Unlike Gaussian quadrature formulas, however, higher-order Newton-Cotes formulas are not stable, which limits the usable order of such formulas. Existing work has shown that stable high-order quadrature formulas with equidistant points can be developed by using orthogonal polynomials. We improve upon such work by making use of (orthogonal) Gram polynomials and by deriving an iterative algorithm; together these allow us to reduce the space complexity of the original algorithm significantly.
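
The core construction, integrating a discrete least-squares fit in a basis that is orthogonal over the equidistant nodes, can be sketched in a few lines of Python. The sketch below obtains the discretely orthogonal basis numerically from a QR factorization rather than from the Gram-polynomial recurrence, and it does not reproduce the paper's iterative, low-memory algorithm; the node count, degree, and the interval $[-1, 1]$ are arbitrary illustrative choices.

```python
import numpy as np

def equidistant_ls_quadrature(n_points, degree):
    """Quadrature weights on equidistant nodes in [-1, 1], obtained by
    integrating the discrete least-squares fit of the given degree.
    A sketch of the construction, not the paper's algorithm."""
    x = np.linspace(-1.0, 1.0, n_points)
    V = np.vander(x, degree + 1, increasing=True)   # monomial Vandermonde matrix
    Q, R = np.linalg.qr(V)                          # columns of Q: orthonormal over the nodes
    # Exact integrals of the monomials x**j over [-1, 1].
    m = np.array([(1.0 - (-1.0) ** (j + 1)) / (j + 1) for j in range(degree + 1)])
    # Integrals of the orthonormal basis polynomials give the weights.
    w = Q @ np.linalg.solve(R.T, m)
    return x, w

# Example: integrate exp(x) over [-1, 1]; the exact value is e - 1/e.
x, w = equidistant_ls_quadrature(n_points=33, degree=12)
print(w @ np.exp(x), np.e - 1.0 / np.e)
```

The resulting rule integrates polynomials up to the chosen degree exactly; keeping the degree well below the number of nodes is what gives such rules better stability than interpolatory Newton-Cotes rules on the same nodes.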



Related research

The parallel strong-scaling of Krylov iterative methods is largely determined by the number of global reductions required at each iteration. The GMRES and Krylov-Schur algorithms employ the Arnoldi algorithm for nonsymmetric matrices. The underlying orthogonalization scheme is left-looking and processes one column at a time, so at least one global reduction is required per iteration. The traditional algorithm for generating the orthogonal Krylov basis vectors for the Krylov-Schur algorithm is classical Gram-Schmidt applied twice with reorthogonalization (CGS2), requiring three global reductions per step. A new variant of CGS2 that requires only one reduction per iteration is applied to the Arnoldi-QR iteration. Strong-scaling results are presented for finding eigenvalue pairs of nonsymmetric matrices. A preliminary attempt to derive a similar algorithm (one reduction per Arnoldi iteration with a robust orthogonalization scheme) was presented by Hernandez et al. (2007). Unlike our approach, their method is not forward stable for eigenvalues.
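
For reference, a plain serial version of the CGS2 Arnoldi step described above looks roughly as follows; the two projections and the final normalization each correspond to a global reduction in a distributed setting, which is what the one-reduction variant avoids. This is a textbook sketch, not the authors' low-synchronization algorithm.

```python
import numpy as np

def cgs2_arnoldi_step(A, Q):
    """One Arnoldi step using classical Gram-Schmidt applied twice (CGS2).
    Q holds the current orthonormal Krylov basis as columns; returns the
    next basis vector and the new column of the Hessenberg matrix.
    Serial sketch only; not the low-synchronization variant."""
    w = A @ Q[:, -1]
    h1 = Q.T @ w              # first projection      (global reduction 1)
    w = w - Q @ h1
    h2 = Q.T @ w              # reorthogonalization   (global reduction 2)
    w = w - Q @ h2
    beta = np.linalg.norm(w)  # normalization         (global reduction 3)
    return w / beta, np.append(h1 + h2, beta)
```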
Tomoaki Okayama (2013)
The Sinc quadrature and the Sinc indefinite integration are approximation formulas for definite integration and indefinite integration, respectively, which can be applied on any interval by using an appropriate variable transformation. Their convergence rates have been analyzed for typical cases including finite, semi-infinite, and infinite intervals. In addition, for verified automatic integration, more explicit error bounds that are computable have been recently given on a finite interval. In this paper, such explicit error bounds are given in the remaining cases on semi-infinite and infinite intervals.
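
A minimal sketch of single-exponential (tanh-rule) Sinc quadrature on a finite interval illustrates the variable-transformation idea: the interval $(a, b)$ is mapped to the real line and the transformed integrand is summed with the truncated trapezoidal rule $h \sum_k f(\psi(kh))\,\psi'(kh)$. The step-size rule and truncation level below are rule-of-thumb choices, not the optimized parameters or the computable error bounds derived in the paper.

```python
import numpy as np

def sinc_quadrature(f, a, b, N):
    """Sinc (tanh-rule) quadrature on (a, b): single-exponential transform
    plus a truncated trapezoidal sum.  Step-size rule is an assumption."""
    h = np.pi / np.sqrt(N)                                     # assumed step-size rule
    t = np.arange(-N, N + 1) * h
    psi = 0.5 * (b - a) * np.tanh(0.5 * t) + 0.5 * (b + a)     # maps R onto (a, b)
    dpsi = 0.25 * (b - a) / np.cosh(0.5 * t) ** 2              # Jacobian psi'(t)
    return h * np.sum(f(psi) * dpsi)

# Example: the endpoint singularity in the integral of x**(-1/2) over (0, 1)
# (exact value 2) is handled without any special treatment.
print(sinc_quadrature(lambda x: 1.0 / np.sqrt(x), 0.0, 1.0, N=64))
```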
Background: The advent of next-generation sequencing technologies has exponentially increased the volume of biological sequence data. Protein sequences, often described as the 'language of life', have been analyzed for a multitude of applications and inferences. Motivation: Owing to the rapid development of deep learning, in recent years there have been a number of breakthroughs in the domain of Natural Language Processing. Since these methods are capable of performing different tasks when trained with a sufficient amount of data, off-the-shelf models are used to perform various biological applications. In this study, we investigated the applicability of the popular Skip-gram model for protein sequence analysis and made an attempt to incorporate some biological insights into it. Results: We propose a novel $k$-mer embedding scheme, Align-gram, which is capable of mapping similar $k$-mers close to each other in a vector space. Furthermore, we experiment with other sequence-based protein representations and observe that the embeddings derived from Align-gram aid in modeling and training deep learning models better. Our experiments with a simple baseline LSTM model and the much more complex CNN model of DeepGoPlus show the potential of Align-gram for performing different types of deep learning applications in protein sequence analysis.
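
The basic pipeline the study builds on, splitting a protein sequence into overlapping $k$-mers and feeding them to a Skip-gram model, can be sketched with an off-the-shelf word2vec implementation. This is not the proposed Align-gram scheme; gensim is assumed to be available, and the toy sequences and hyperparameters are placeholders.

```python
from gensim.models import Word2Vec

def to_kmers(sequence, k=3):
    """Split a protein sequence into overlapping k-mers (the 'words')."""
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

# Toy corpus: each protein sequence becomes one 'sentence' of k-mers.
sequences = ["MKVLAAGIVKQ", "MKVLSAGIVRQ", "GAVLKHWTTQE"]
corpus = [to_kmers(s, k=3) for s in sequences]

# Skip-gram (sg=1) embeddings over the k-mer vocabulary; placeholder settings.
model = Word2Vec(corpus, vector_size=64, window=5, min_count=1, sg=1, epochs=50)
print(model.wv["MKV"][:5])             # embedding vector of one k-mer
print(model.wv.most_similar("MKV"))    # nearest k-mers in embedding space
```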
A t by n random matrix A is formed by sampling n independent random column vectors, each containing t components. The random Gram matrix of size n, G_n, contains the dot products between all pairs of column vectors in the randomly generated matrix A; that is, G_n = transpose(A) A. The matrix G_n has characteristic roots coinciding with the singular values of A. Furthermore, the sequences det(G_i) and per(G_i) (for i = 0, 1, ..., n) are factors that comprise the expected coefficients of the characteristic and permanental polynomials of G_n. We prove theorems that relate the generating functions and recursions for the traces of matrix powers, expected characteristic coefficients, expected determinants E(det(G_n)), and expected permanents E(per(G_n)) in terms of each other. Using the derived recursions, we exhibit the efficient computation of the expected determinant and expected permanent of a random Gram matrix G_n, formed according to any underlying distribution. These theoretical results may be used both to speed up numerical algorithms and to investigate the numerical properties of the expected characteristic and permanental coefficients of any matrix comprised of independently sampled columns.
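
As a brute-force counterpart to the recursions described above, the expected determinant of a random Gram matrix can also be estimated by direct sampling. The Monte Carlo check below is far slower than a recursive computation and is shown only to make the objects concrete; the Gaussian column distribution and sample count are arbitrary choices.

```python
import numpy as np

def expected_gram_determinant(t, n, n_samples=20000, seed=0):
    """Monte Carlo estimate of E[det(G_n)] for G_n = A^T A, where A is a
    t-by-n matrix with i.i.d. standard normal entries.  Brute-force check,
    not the recursive computation derived in the paper."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_samples):
        A = rng.standard_normal((t, n))
        total += np.linalg.det(A.T @ A)
    return total / n_samples

# For i.i.d. N(0, 1) columns the exact value is t! / (t - n)!  (a classical
# identity via Cauchy-Binet), so t = 5, n = 3 should give roughly 60.
print(expected_gram_determinant(5, 3))
```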
Zernike polynomials are a basis of orthogonal polynomials on the unit disk that are a natural basis for representing smooth functions. They arise in a number of applications including optics and atmospheric sciences. In this paper, we provide a self-contained reference on Zernike polynomials, algorithms for evaluating them, and what appear to be new numerical schemes for quadrature and interpolation. We also introduce new properties of Zernike polynomials in higher dimensions. The quadrature rule and interpolation scheme use a tensor product of equispaced nodes in the angular direction and roots of certain Jacobi polynomials in the radial direction. An algorithm for finding the roots of these Jacobi polynomials is also described. The performance of the interpolation and quadrature schemes is illustrated through numerical experiments. Discussions of higher dimensional Zernike polynomials are included in appendices.
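
For readers who want to experiment with such schemes, the radial part of a Zernike polynomial can be evaluated directly from its standard explicit sum; the quadrature and interpolation algorithms themselves (roots of Jacobi polynomials in the radial direction, equispaced nodes in the angular direction) are not reproduced in this sketch.

```python
import numpy as np
from math import factorial

def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho) from the standard explicit
    formula; requires n >= |m| with n - |m| even (otherwise R is zero)."""
    m = abs(m)
    if (n - m) % 2:
        return np.zeros_like(rho, dtype=float)
    R = np.zeros_like(rho, dtype=float)
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k * factorial(n - k) /
             (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k)))
        R += c * rho ** (n - 2 * k)
    return R

# Example: R_2^0(rho) = 2*rho**2 - 1 (the defocus term).
rho = np.linspace(0.0, 1.0, 5)
print(zernike_radial(2, 0, rho))
print(2 * rho**2 - 1)
```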
