
Analytical singular value decomposition for a class of stoichiometry matrices

 Added by David Bortz
 Publication date 2021
Fields: Biology
Language: English





We present the analytical singular value decomposition of the stoichiometry matrix for a spatially discrete reaction-diffusion system on a one-dimensional domain. The domain has two subregions which share a single common boundary. Each subregion is further partitioned into a finite number of compartments. Chemical reactions can occur within a compartment, whereas diffusion is represented as movement between adjacent compartments. Inspired by biology, we study both 1) the case where the reactions on each side of the boundary are different and only certain species diffuse across the boundary, and 2) the case with spatially homogeneous reactions and diffusion. We write the stoichiometry matrix for these two classes of systems using a Kronecker product formulation. For the first scenario, we apply linear perturbation theory to derive an approximate singular value decomposition in the limit as diffusion becomes much faster than reactions. For the second scenario, we derive an exact analytical singular value decomposition for all relative diffusion and reaction time scales. By writing the stoichiometry matrix using Kronecker products, we show that the singular vectors and values can also be written concisely using Kronecker products. Ultimately, we find that the singular value decomposition of the reaction-diffusion stoichiometry matrix depends on the singular value decompositions of smaller matrices. These smaller matrices represent modified …
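The Kronecker-product structure exploited in the abstract rests on a standard identity: the SVD of $A \otimes B$ is the Kronecker product of the SVDs of $A$ and $B$, with singular values given by all pairwise products (up to reordering). A minimal numerical sketch with generic small matrices standing in for the reaction and diffusion blocks (the specific stoichiometry matrices of the paper are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))   # stand-in for a small "reaction" block
B = rng.standard_normal((4, 4))   # stand-in for a small "diffusion" block

# SVDs of the two factors
Ua, sa, Vat = np.linalg.svd(A, full_matrices=False)
Ub, sb, Vbt = np.linalg.svd(B, full_matrices=False)

# Singular values of kron(A, B) are all pairwise products sa[i] * sb[j]
s_kron = np.linalg.svd(np.kron(A, B), compute_uv=False)
s_pred = np.sort(np.outer(sa, sb).ravel())[::-1]

print(np.allclose(s_kron, s_pred))  # True
```

The same identity extends to the singular vectors (columns of `np.kron(Ua, Ub)` and `np.kron(Vat.T, Vbt.T)`, after matching the reordering of singular values), which is what lets the paper's factors be written concisely in Kronecker form.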



Related research


Our goal here is to see the space of matrices of a given size from a geometric and topological perspective, with emphasis on the families of various ranks and how they fit together. We pay special attention to the nearest orthogonal neighbor and nearest singular neighbor of a given matrix, both of which play central roles in matrix decompositions, and then against this visual backdrop examine the polar and singular value decompositions and some of their applications.
108 - Christian Bongiorno 2020
I show analytically that the average of $k$ bootstrapped correlation matrices rapidly becomes positive-definite as $k$ increases, which provides a simple approach to regularize singular Pearson correlation matrices. If $n$ is the number of objects and $t$ the number of features, the averaged correlation matrix is almost surely positive-definite if $k > \frac{e}{e-1}\frac{n}{t} \simeq 1.58\,\frac{n}{t}$ in the limit of large $t$ and $n$. The probability of obtaining a positive-definite correlation matrix with $k$ bootstraps is also derived for finite $n$ and $t$. Finally, I demonstrate that the number of required bootstraps is always smaller than $n$. This method is particularly relevant in fields where $n$ is orders of magnitude larger than the number of data points $t$, e.g., in finance, genetics, social science, or image processing.
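A hedged numerical sketch of the averaging idea (sizes and the feature-resampling scheme are illustrative assumptions, not taken from the paper): with $n$ objects and $t < n$ features the Pearson matrix has rank at most $t$ and is singular, while the average over $k$ bootstraps of the feature axis is positive-definite with high probability once $k$ is large enough.

```python
import numpy as np

rng = np.random.default_rng(2)
n, t, k = 50, 20, 100          # n objects, t < n features -> singular correlation
X = rng.standard_normal((n, t))

C = np.corrcoef(X)             # rank <= t < n, so not positive-definite
avg = np.zeros((n, n))
for _ in range(k):
    idx = rng.integers(0, t, size=t)     # resample the feature axis with replacement
    avg += np.corrcoef(X[:, idx])
avg /= k

print(np.linalg.eigvalsh(C).min())    # ~0: numerically singular
print(np.linalg.eigvalsh(avg).min())  # strictly positive: positive-definite
```

Here $k = 100$ is far above the asymptotic threshold $1.58\,n/t \simeq 4$, so positive definiteness of the average is essentially certain; the threshold itself is the paper's result, not something this toy run verifies.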
The hierarchical SVD provides a quasi-best low rank approximation of high dimensional data in the hierarchical Tucker framework. Similar to the SVD for matrices, it provides a fundamental but expensive tool for tensor computations. In the present work we examine generalizations of randomized matrix decomposition methods to higher order tensors in the framework of the hierarchical tensors representation. In particular we present and analyze a randomized algorithm for the calculation of the hierarchical SVD (HSVD) for the tensor train (TT) format.
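The matrix analogue of the randomized decomposition methods generalized above is the randomized range finder (Halko–Martinsson–Tropp style): sketch the column space with a Gaussian test matrix, orthonormalize, then take a small dense SVD of the projected matrix. A minimal sketch, not the tensor-train algorithm of the paper itself:

```python
import numpy as np

def randomized_svd(A, rank, oversample=10, rng=None):
    """Approximate truncated SVD via a randomized range finder."""
    rng = rng or np.random.default_rng(0)
    # 1) Sketch the column space with a Gaussian test matrix
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)       # orthonormal basis for range(A @ Omega)
    # 2) Project A onto the basis and take a small dense SVD
    Ub, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
    return Q @ Ub[:, :rank], s[:rank], Vt[:rank]

# Exact recovery on an exactly low-rank matrix
rng = np.random.default_rng(3)
A = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 150))  # rank 5
U, s, Vt = randomized_svd(A, rank=5, rng=rng)
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True
```

The hierarchical/TT version applies this same sketch-then-factor step to the matricizations arising at each node of the tensor tree rather than to one flat matrix.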
82 - Ehsan Rohani , Gwan Choi , Mi Lu 2017
In this paper we introduce the algorithm and the fixed-point hardware to calculate the normalized singular value decomposition of non-symmetric matrices using fast (approximate) Givens rotations. This algorithm uses only basic combinational logic modules such as adders, multiplexers, encoders, Barrel shifters (B-shifters), and comparators, and does not use any lookup table. This method in fact combines the iterative properties of the singular value decomposition method and the CORDIC method in a single iteration. The introduced architecture is a systolic architecture that uses two different types of processors, diagonal and non-diagonal. The diagonal processor calculates, transmits, and applies the horizontal and vertical rotations, while the non-diagonal processor uses a fully combinational architecture to receive and apply the rotations. The diagonal processor uses priority encoders, Barrel shifters, and comparators to calculate the rotation angles. Both processors use a series of adders to apply the rotation angles. The design presented in this work provides $2.83\sim 649$ times better energy-per-matrix performance compared to state-of-the-art designs. This performance is achieved without pipelining; a further performance advantage is expected from employing pipelining.
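For context, a sketch of the exact Givens rotation that such hardware approximates (the specific fast-rotation scheme of the paper is not reproduced here): an exact rotation zeroes one entry of a 2-vector, while a "fast" rotation quantizes the angle to $\arctan 2^{-k}$ so it can be applied with shifts and adds only.

```python
import numpy as np

def givens(a, b):
    """Rotation G = [[c, s], [-s, c]] with G @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

a, b = 3.0, 4.0
c, s = givens(a, b)
G = np.array([[c, s], [-s, c]])
print(G @ np.array([a, b]))   # [5., 0.]

# A fast (approximate) rotation in the CORDIC spirit: s ~ 2^-k, c ~ 1,
# implementable with Barrel shifters and adders, no multipliers or tables.
k = 3
x, y = 5.0, 0.7
x_new, y_new = x + y / 2**k, y - x / 2**k   # unnormalized rotation by ~atan(2^-k)
```

Zeroing off-diagonal entries with such rotations, applied from the left and right, is the building block of the two-sided (Kogbetliantz-style) SVD iteration the systolic array carries out.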
This paper introduces the functional tensor singular value decomposition (FTSVD), a novel dimension reduction framework for tensors with one functional mode and several tabular modes. The problem is motivated by high-order longitudinal data analysis. Our model assumes the observed data to be a random realization of an approximate CP low-rank functional tensor measured on a discrete time grid. Incorporating tensor algebra and the theory of Reproducing Kernel Hilbert Space (RKHS), we propose a novel RKHS-based constrained power iteration with spectral initialization. Our method can successfully estimate both singular vectors and functions of the low-rank structure in the observed data. With mild assumptions, we establish the non-asymptotic contractive error bounds for the proposed algorithm. The superiority of the proposed framework is demonstrated via extensive experiments on both simulated and real data.
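The discrete core that FTSVD-style methods build on is the alternating power iteration for a low-rank tensor; a minimal rank-1 sketch without the RKHS smoothing or spectral initialization of the paper (the plain version below is an assumption-laden stand-in, not their algorithm):

```python
import numpy as np

def rank1_power_iteration(T, iters=30, rng=None):
    """Alternating power iteration for a rank-1 CP term of a 3-way tensor."""
    rng = rng or np.random.default_rng(0)
    u = rng.standard_normal(T.shape[0]); u /= np.linalg.norm(u)
    v = rng.standard_normal(T.shape[1]); v /= np.linalg.norm(v)
    w = rng.standard_normal(T.shape[2]); w /= np.linalg.norm(w)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w

# Recover the factors of an exactly rank-1 tensor
rng = np.random.default_rng(4)
u0 = rng.standard_normal(6); u0 /= np.linalg.norm(u0)
v0 = rng.standard_normal(5); v0 /= np.linalg.norm(v0)
w0 = rng.standard_normal(7); w0 /= np.linalg.norm(w0)
T = 2.5 * np.einsum('i,j,k->ijk', u0, v0, w0)
lam, u, v, w = rank1_power_iteration(T, rng=rng)
print(abs(lam), abs(u @ u0))   # ~2.5, ~1.0
```

In the FTSVD setting, the update along the functional mode is additionally constrained to an RKHS ball, which replaces the plain normalization step above with a kernel-regularized least-squares solve.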
