
Automatic Differentiation for Complex Valued SVD

Added by Shi-Xin Zhang
Publication date: 2019
Field: Physics
Language: English





In this note, we report the backpropagation formula for the complex-valued singular value decomposition (SVD). This formula is an important ingredient of a complete automatic differentiation (AD) infrastructure for complex numbers, and it is also key to understanding and utilizing AD in tensor networks.
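
As an illustration of where such a formula matters, here is a minimal sketch (not the paper's code) that differentiates through a complex-valued SVD with PyTorch, whose complex autograd relies on a backward rule of this kind. The loss below depends only on the singular values, so it is insensitive to the phase (gauge) freedom of the complex SVD, and its gradient is known in closed form for checking.

```python
# Minimal sketch: reverse-mode AD through a complex-valued SVD.
import torch

torch.manual_seed(0)

# Random complex matrix with gradient tracking.
A = torch.randn(4, 3, dtype=torch.complex128, requires_grad=True)

# A real-valued loss built from the singular values: sum(S^2) = ||A||_F^2.
U, S, Vh = torch.linalg.svd(A, full_matrices=False)
loss = (S**2).sum()
loss.backward()

# For L = ||A||_F^2 the packed (real, imaginary) gradient is 2*A,
# so the AD result through the SVD can be checked exactly.
print(torch.allclose(A.grad, 2 * A.detach(), atol=1e-8))  # expect True
```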



Related Research

As an intrinsically unbiased method, quantum Monte Carlo (QMC) is of unique importance in simulating interacting quantum systems. Unfortunately, QMC often suffers from the notorious sign problem. Although curing the sign problem generically is known to be NP-hard, the sign problem of a given quantum model may be mitigated (sometimes even cured) by a better choice of simulation scheme. A universal framework for identifying optimal QMC schemes has long been desired. Here, we propose a general framework that uses automatic differentiation (AD) to automatically search for the best continuously parameterized QMC scheme, which we call automatic differentiable sign mitigation (ADSM). We further apply the ADSM framework to the honeycomb-lattice Hubbard model with Rashba spin-orbit coupling and demonstrate ADSM's effectiveness in mitigating its sign problem. For the model under study, ADSM leads to a significant power-law acceleration in computation time (the computation time is reduced from $M$ to the order of $M^{\nu}$ with $\nu \approx 2/3$).
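
A toy sketch of the idea only, with an entirely made-up stand-in for the average sign (not the paper's model or code): the scheme is parameterized continuously by $\theta$, a differentiable proxy for $\langle \mathrm{sign} \rangle$ is evaluated, and AD drives gradient ascent toward the scheme with the mildest sign problem.

```python
# Toy ADSM-style loop: maximize a differentiable average-sign proxy via AD.
import torch

theta = torch.zeros(3, requires_grad=True)  # continuous scheme parameters

def average_sign_proxy(theta):
    # Hypothetical stand-in for a Monte Carlo estimate of <sign>;
    # in a real calculation this would come from the actual QMC weights.
    return torch.exp(-(theta - torch.tensor([0.3, -0.1, 0.7])).pow(2).sum())

opt = torch.optim.Adam([theta], lr=0.1)
for step in range(200):
    opt.zero_grad()
    loss = -average_sign_proxy(theta)  # maximize <sign> = minimize its negative
    loss.backward()
    opt.step()

print(theta.detach())  # approaches the parameters with the largest average sign
```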
Keqin Liu (2020)
Based on a class of associative algebras with zero-divisors, which we call real-like algebras, we introduce a way of defining automatic differentiation and present several ways of performing it to compute the first, second, and third derivatives of a function exactly and simultaneously.
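
A minimal sketch of the flavor of such constructions, assuming a truncated polynomial algebra with $\epsilon^4 = 0$ (an algebra with zero-divisors); this illustrates derivatives-via-zero-divisors in general, not the paper's specific real-like algebras. Arithmetic on coefficient tuples carries $f$, $f'$, $f''$, and $f'''$ through the computation at once.

```python
# Truncated Taylor ("jet") arithmetic: a0 + a1*e + a2*e^2 + a3*e^3 with e^4 = 0.
class Jet3:
    def __init__(self, a0, a1=0.0, a2=0.0, a3=0.0):
        self.c = (a0, a1, a2, a3)

    def __add__(self, other):
        other = other if isinstance(other, Jet3) else Jet3(other)
        return Jet3(*(x + y for x, y in zip(self.c, other.c)))

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Jet3) else Jet3(other)
        a, b = self.c, other.c
        # Cauchy product, truncated at order 3 (all e^4 terms vanish).
        return Jet3(
            a[0]*b[0],
            a[0]*b[1] + a[1]*b[0],
            a[0]*b[2] + a[1]*b[1] + a[2]*b[0],
            a[0]*b[3] + a[1]*b[2] + a[2]*b[1] + a[3]*b[0],
        )

    __rmul__ = __mul__

def derivatives(f, x):
    """Return f(x), f'(x), f''(x), f'''(x) from a single evaluation of f."""
    y = f(Jet3(x, 1.0))  # seed the infinitesimal part with dx/dx = 1
    a0, a1, a2, a3 = y.c
    return a0, a1, 2.0 * a2, 6.0 * a3

# Example: f(x) = x^3 + 2x at x = 2 gives (12, 14, 12, 6).
f = lambda x: x * x * x + 2.0 * x
print(derivatives(f, 2.0))
```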
Many engineering problems involve learning hidden dynamics from indirect observations, where the physical processes are described by systems of partial differential equations (PDE). Gradient-based optimization methods are considered scalable and efficient for learning hidden dynamics. However, one of the most time-consuming and error-prone tasks is to derive and implement the gradients, especially in systems of PDEs where gradients from different systems must be correctly integrated together. To this end, we present a novel technique, called intelligent automatic differentiation (IAD), to leverage the modern machine learning tool $\texttt{TensorFlow}$ for computing gradients automatically and conducting optimization efficiently. Moreover, IAD allows us to integrate specially designed state adjoint method codes to achieve better performance. Numerical tests demonstrate the feasibility of IAD for learning hidden dynamics in complicated systems of PDEs; additionally, by incorporating custom-built state adjoint method codes in IAD, we significantly accelerate the forward and inverse simulation.
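
A minimal sketch of this pattern, under assumed toy names (not the paper's code): TensorFlow's tape differentiates the surrounding computation automatically, while a hand-written adjoint rule is attached to one solver step through tf.custom_gradient. Here a scalar implicit solve stands in for a PDE solver.

```python
# Hybrid AD: automatic gradients everywhere, a custom adjoint for one solve.
import tensorflow as tf

@tf.custom_gradient
def implicit_solve(theta):
    # Forward pass: solve u^3 + u = theta for u by Newton iteration
    # (a stand-in for an expensive PDE solve).
    u = tf.zeros_like(theta)
    for _ in range(50):
        u = u - (u**3 + u - theta) / (3.0 * u**2 + 1.0)

    def grad(dL_du):
        # Adjoint / implicit-function rule: du/dtheta = 1 / (3u^2 + 1),
        # supplied by hand instead of unrolling the Newton iterations.
        return dL_du / (3.0 * u**2 + 1.0)

    return u, grad

theta = tf.constant(2.0)
with tf.GradientTape() as tape:
    tape.watch(theta)
    u = implicit_solve(theta)   # u = 1 solves u^3 + u = 2
    loss = (u - 0.5)**2         # downstream loss handled by ordinary AD
print(tape.gradient(loss, theta))  # 2*(u-0.5) * du/dtheta = 0.25
```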
Chu Guo, Dario Poletti (2020)
For real functions, automatic differentiation is such a standard algorithm for efficiently computing gradients that it is integrated into various neural-network frameworks. However, despite recent advances in using complex functions in machine learning and the well-established usefulness of automatic differentiation, support for automatic differentiation of complex functions is not as well established or widespread as it is for real functions. In this work we propose an efficient and seamless scheme to implement automatic differentiation for complex functions, which is a compatible generalization of the current scheme for real functions. This scheme can significantly simplify the implementation of neural networks that use complex numbers.
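
The paper defines its own scheme; as a sketch of the general setting only, the example below shows the convention used by one existing framework (PyTorch): for a real-valued loss $L$ and $z = a + ib$, the returned gradient packs $\partial L/\partial a + i\,\partial L/\partial b$, i.e. twice the conjugate Wirtinger derivative $\partial L/\partial \bar z$, which is exactly what gradient-descent updates on the real and imaginary parts need.

```python
# Complex AD convention check on a non-holomorphic, real-valued loss.
import torch

z = torch.tensor(1.0 + 2.0j, requires_grad=True)

# L = |z|^2 = z * conj(z); Wirtinger calculus gives dL/dz* = z.
L = (z * z.conj()).real
L.backward()

# Packed gradient dL/da + i*dL/db = 2 * dL/dz* = 2z.
print(z.grad, 2 * z)  # both equal 2+4j
```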
Alberto Ramos (2020)
We present ADerrors.jl, a software package for linear error propagation and analysis of Monte Carlo data. Although the focus is on data analysis in lattice QCD, where estimates of observables have to be computed from Monte Carlo samples, the software also handles variables with uncertainties, either correlated or uncorrelated. Thanks to automatic differentiation techniques, linear error propagation is performed exactly, even through iterative algorithms (e.g., errors in the parameters of non-linear fits). In this contribution we present an overview of the capabilities of the software, including access to uncertainties in fit parameters and the handling of correlated data. The software, written in Julia, is available for download and use at https://gitlab.ift.uam-csic.es/alberto/aderrors.jl.
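
ADerrors.jl itself is written in Julia; as a language-agnostic sketch of the core technique only (linear error propagation with the derivative supplied exactly by AD rather than derived by hand), here is the one-variable case, $\sigma_f = |f'(x)|\,\sigma_x$, in Python:

```python
# Linear error propagation with the derivative obtained exactly via AD.
import torch

x_mean, x_err = 2.0, 0.05   # a measured quantity with its uncertainty

x = torch.tensor(x_mean, requires_grad=True)
f = torch.exp(-x) * torch.sin(x)   # any differentiable derived observable
f.backward()

f_err = abs(x.grad.item()) * x_err  # first-order (linear) propagation
print(f.item(), "+/-", f_err)
```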
