Tensor networks are efficient representations of high-dimensional tensors with widespread applications in quantum many-body physics. Recently, they have been adapted to machine learning, giving rise to an emergent research frontier that has attracted considerable attention. Here, we study the trainability of tensor-network based machine learning models by exploring the landscapes of different loss functions, with a focus on the matrix product state (also called tensor train) architecture. In particular, we rigorously prove that barren plateaus (i.e., exponentially vanishing gradients) prevail in the training of machine learning algorithms with global loss functions. In contrast, for local loss functions the gradients with respect to variational parameters near the local observables do not vanish as the system size increases; barren plateaus are therefore absent in this case, and the corresponding models can be trained efficiently. Our results rigorously reveal a crucial aspect of tensor-network based machine learning, providing a valuable guide for both practical applications and future theoretical studies.
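As a rough, self-contained illustration of the global-versus-local distinction (our own toy construction, not the paper's proof setup), the sketch below draws random Gaussian MPS tensors, takes a "global" loss to be the infidelity with the all-zeros product state and a "local" loss to be the expectation of a single-site Pauli-Z at the left boundary, and estimates the variance of a finite-difference gradient with respect to an entry of the leftmost tensor as the number of sites grows. The specific losses, the Gaussian initialization, and the finite-difference estimator are all assumptions made here for brevity; the paper's rigorous statements concern analytically computed gradients under its own initialization ensemble.

```python
# Minimal numerical sketch (an illustration, not the paper's exact setup):
# compare gradient variances of a "global" loss and a "local" loss for
# randomly initialized matrix product states, contracted to dense vectors.

import numpy as np


def random_mps(n, chi, d=2, rng=None):
    """Return n random MPS tensors with shapes (left_bond, d, right_bond)."""
    rng = np.random.default_rng() if rng is None else rng
    tensors = []
    for i in range(n):
        dl = 1 if i == 0 else chi
        dr = 1 if i == n - 1 else chi
        tensors.append(rng.normal(size=(dl, d, dr)))
    return tensors


def to_state(tensors):
    """Contract the MPS into a dense, normalized state vector (small n only)."""
    psi = tensors[0]
    for t in tensors[1:]:
        psi = np.tensordot(psi, t, axes=([-1], [0]))  # contract shared bond
    psi = psi.reshape(-1)
    return psi / np.linalg.norm(psi)


def global_loss(tensors):
    """1 - |<0...0|psi>|^2: a loss that involves every site at once."""
    psi = to_state(tensors)
    return 1.0 - abs(psi[0]) ** 2


def local_loss(tensors):
    """Expectation of Pauli-Z on the first site: a local observable."""
    psi = to_state(tensors)
    n = int(np.log2(psi.size))
    z_first = np.array([1.0 if ((k >> (n - 1)) & 1) == 0 else -1.0
                        for k in range(psi.size)])
    return float(np.dot(np.abs(psi) ** 2, z_first))


def grad_entry(loss, tensors, eps=1e-5):
    """Central finite difference w.r.t. one entry of the leftmost tensor."""
    plus = [t.copy() for t in tensors]
    minus = [t.copy() for t in tensors]
    plus[0][0, 0, 0] += eps
    minus[0][0, 0, 0] -= eps
    return (loss(plus) - loss(minus)) / (2 * eps)


rng = np.random.default_rng(0)
for n in (4, 6, 8, 10):
    g_glob = [grad_entry(global_loss, random_mps(n, chi=4, rng=rng))
              for _ in range(200)]
    g_loc = [grad_entry(local_loss, random_mps(n, chi=4, rng=rng))
             for _ in range(200)]
    print(f"n={n:2d}  Var[grad, global]={np.var(g_glob):.3e}"
          f"  Var[grad, local]={np.var(g_loc):.3e}")
```

The dense contraction in to_state restricts the sketch to small system sizes; its only purpose is to make the distinction concrete, with the global-loss gradient being the one expected to be suppressed as n grows while the local-loss gradient near the observable is not.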
Quantum neural networks (QNNs) have generated excitement around the possibility of efficiently analyzing quantum data. But this excitement has been tempered by the existence of exponentially vanishing gradients, known as barren plateau landscapes, …
We argue that an excess in entanglement between the visible and hidden units in a Quantum Neural Network can hinder learning. In particular, we show that quantum neural networks that satisfy a volume-law in the entanglement entropy will give rise to …
We show how to learn structures of generic, non-Markovian, quantum stochastic processes using a tensor network based machine learning algorithm. We do this by representing the process as a matrix product operator (MPO) and train it with a database of …
Barren plateau landscapes correspond to gradients that vanish exponentially in the number of qubits. Such landscapes have been demonstrated for variational quantum algorithms and quantum neural networks with either deep circuits or global cost functions …
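For reference, the barren plateau condition mentioned above is conventionally stated as an exponential suppression of the gradient variance; the formulation below uses notation chosen here (a cost $C(\boldsymbol{\theta})$ on $n$ qubits with trainable parameter $\theta_k$) rather than anything quoted from these abstracts.

```latex
% Barren plateau: the variance of each partial derivative of the cost
% is exponentially small in the number of qubits n.
\[
  \operatorname{Var}_{\boldsymbol{\theta}}\!\bigl[\partial_{\theta_k} C(\boldsymbol{\theta})\bigr]
  \;\le\; F(n),
  \qquad F(n) \in \mathcal{O}\!\bigl(b^{-n}\bigr), \quad b > 1 .
\]
```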
Machine learning (ML) techniques applied to quantum many-body physics have emerged as a new research field. While the numerical power of this approach is undeniable, the most expressive ML algorithms, such as neural networks, are black boxes: …