Learning the structure of the entanglement Hamiltonian (EH) is central to characterizing quantum many-body states in analog quantum simulation. We describe a protocol in which spatial deformations of the many-body Hamiltonian, physically realized on the quantum device, serve as an efficient variational ansatz for a local EH. Optimal variational parameters are determined in a feedback loop, involving quench dynamics with the deformed Hamiltonian as a quantum processing step, and classical optimization. We simulate the protocol for the ground state of Fermi-Hubbard models in quasi-1D geometries, finding excellent agreement of the EH with Bisognano-Wichmann predictions. Subsequent on-device spectroscopy enables a direct measurement of the entanglement spectrum, which we illustrate for a Fermi-Hubbard model in a topological phase.
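As a concrete illustration of the feedback loop, the following minimal numpy sketch emulates the protocol classically: a transverse-field Ising chain stands in for the Fermi-Hubbard model, and a commutator-norm cost on the reduced density matrix stands in for the on-device quench-stationarity signal. All parameter choices and helper names (embed, tfim, cost) are illustrative assumptions, not the authors' code.

import numpy as np
from scipy.optimize import minimize

# Pauli matrices and an embedding helper
I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def embed(P, site, n):
    """Single-site operator P acting on `site` of an n-site chain."""
    out = np.array([[1.]])
    for j in range(n):
        out = np.kron(out, P if j == site else I2)
    return out

def tfim(n, h=1.05):
    """Transverse-field Ising Hamiltonian, a stand-in for Fermi-Hubbard."""
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):
        H -= embed(Z, i, n) @ embed(Z, i + 1, n)
    for i in range(n):
        H -= h * embed(X, i, n)
    return H

n, nA = 8, 4
_, V = np.linalg.eigh(tfim(n))
M = V[:, 0].reshape(2**nA, 2**(n - nA))     # ground state, split A|B
rhoA = M @ M.T                              # reduced density matrix of subsystem A

# local operator content of the deformed-Hamiltonian ansatz on A
terms = ([-embed(Z, i, nA) @ embed(Z, i + 1, nA) for i in range(nA - 1)]
         + [-1.05 * embed(X, i, nA) for i in range(nA)])

def cost(g_rest):
    """Frobenius norm of [rho_A, H_A(g)]; zero iff H_A(g) commutes with rho_A.
    The first coupling is pinned to 1 to exclude the trivial solution g = 0."""
    g = np.concatenate(([1.0], g_rest))
    HA = sum(gi * t for gi, t in zip(g, terms))
    return np.linalg.norm(rhoA @ HA - HA @ rhoA)**2

res = minimize(cost, np.ones(len(terms) - 1), method='Nelder-Mead',
               options={'maxiter': 5000, 'xatol': 1e-8, 'fatol': 1e-12})
print("fitted deformation profile:", np.round(np.concatenate(([1.0], res.x)), 3))

On the device, the commutator cost is not directly accessible; its role is played by the stationarity of measured observables under a quench with the deformed Hamiltonian, optimized in the same classical loop.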
Efficient characterization of quantum devices is a significant challenge critical for the development of large-scale quantum computers. We consider an experimentally motivated situation, in which we have a reasonable estimate of the Hamiltonian and its parameters must be characterized and fine-tuned frequently to counteract drifting experimental variables. We use a machine-learning technique known as meta-learning to learn a more efficient optimizer for this task. We train on the nearest-neighbor Ising model and study the trained model's generalizability to other Hamiltonian models and larger system sizes. We observe that the meta-optimizer outperforms other optimization methods in average loss over test samples. This advantage stems from the meta-optimizer being less likely to get stuck in local minima, which heavily skew the distribution of final losses for the other optimizers. In general, meta-learning decreases the number of calls to the experiment and reduces the required classical computational resources.
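The following numpy sketch illustrates the meta-learning setup under strong simplifications: the "experiment" is an exactly diagonalized 4-site Ising chain, and the learned optimizer is reduced to two meta-parameters (a log learning rate and a momentum coefficient) trained by finite differences over sampled calibration tasks. The paper's meta-optimizer is a trained neural network, so everything below (spectrum, run_inner, sample_task) is an illustrative toy.

import numpy as np

rng = np.random.default_rng(0)
I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.array([[1., 0.], [0., -1.]])

def embed(P, site, n):
    out = np.array([[1.]])
    for j in range(n):
        out = np.kron(out, P if j == site else I2)
    return out

def spectrum(J, h=0.5, n=4):
    """Exact spectrum of a small nearest-neighbor Ising chain (the 'experiment')."""
    H = sum(-J[i] * embed(Z, i, n) @ embed(Z, i + 1, n) for i in range(n - 1))
    H = H + sum(-h * embed(X, i, n) for i in range(n))
    return np.linalg.eigvalsh(H)

def loss(theta, target):
    """Calibration loss: mismatch between model spectrum and measured spectrum."""
    return np.sum((spectrum(theta) - target)**2)

def grad(theta, target, eps=1e-5):
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta); e[i] = eps
        g[i] = (loss(theta + e, target) - loss(theta - e, target)) / (2 * eps)
    return g

def run_inner(meta, target, theta0, steps=25):
    """The optimizer being learned: normalized momentum with two meta-parameters."""
    log_lr, beta = meta
    theta, v = theta0.copy(), np.zeros_like(theta0)
    for _ in range(steps):
        g = grad(theta, target)
        v = beta * v + g / (np.linalg.norm(g) + 1e-8)  # normalization bounds the updates
        theta = theta - np.exp(log_lr) * v
    return loss(theta, target)

def sample_task(n=4):
    """A drifted device: true couplings near, but not at, the prior estimate."""
    J_true = rng.uniform(0.5, 1.5, n - 1)
    return spectrum(J_true), np.ones(n - 1)

meta = np.array([-3.0, 0.5])
for it in range(30):                       # meta-training loop over task samples
    tasks = [sample_task() for _ in range(4)]
    def meta_loss(m):
        return np.mean([run_inner(m, t, th0) for t, th0 in tasks])
    gm = np.zeros(2)
    for i in range(2):                     # finite-difference meta-gradient
        e = np.zeros(2); e[i] = 1e-2
        gm[i] = (meta_loss(meta + e) - meta_loss(meta - e)) / 2e-2
    meta = meta - 0.1 * gm
print("learned (log_lr, beta):", np.round(meta, 3))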
The precision required to perform quantum simulations beyond the capabilities of classical computers imposes major experimental and theoretical challenges. Here, we develop a characterization technique to benchmark the implementation precision of a specific quantum simulation task. We infer all parameters of the bosonic Hamiltonian that governs the dynamics of excitations in a two-dimensional grid of nearest-neighbour coupled superconducting qubits. We devise a robust algorithm for identification of Hamiltonian parameters from measured time series of the expectation values of single-mode canonical coordinates. Using super-resolution and denoising methods, we first extract the eigenfrequencies of the governing Hamiltonian from the complex time-domain measurements; next, we recover the eigenvectors of the Hamiltonian via constrained manifold optimization over the orthogonal group. For five and six coupled qubits, we identify Hamiltonian parameters with sub-MHz precision and construct a spatial implementation error map for a grid of 27 qubits. Our approach enables us to distinguish and quantify the effects of state preparation and measurement errors, and to show that they are the dominant sources of errors in the implementation. Our results quantify the implementation accuracy of analog dynamics and introduce a diagnostic toolkit for understanding, calibrating, and improving analog quantum processors.
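A minimal numpy sketch of the two-stage pipeline, under simplifying assumptions: a disordered 1D hopping Hamiltonian stands in for the 2D grid, FFT peak picking stands in for the super-resolution and denoising step, and a one-shot polar (SVD) projection onto the orthogonal group stands in for the constrained manifold optimization. All names and noise levels are illustrative.

import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(1)

# Ground truth: a disordered hopping Hamiltonian on 5 modes
n = 5
J = 1.0 + 0.1 * rng.standard_normal(n - 1)
H = np.diag(J, 1) + np.diag(J, -1)
lam, O = np.linalg.eigh(H)                     # H = O diag(lam) O^T

# 'Measured' signals: single-excitation amplitudes <j| e^{-iHt} |j0>, plus noise
T, dt, j0 = 4096, 0.05, 0
t = np.arange(T) * dt
phases = np.exp(-1j * np.outer(lam, t))        # (modes, times)
signals = (O * O[j0]) @ phases
signals += 0.01 * (rng.standard_normal(signals.shape)
                   + 1j * rng.standard_normal(signals.shape))

# Step 1: eigenfrequencies from the spectrum (peak picking as a stand-in
# for super-resolution and denoising)
spec = np.fft.fftshift(np.abs(np.fft.fft(signals, axis=1)).sum(axis=0))
omega = np.fft.fftshift(2 * np.pi * np.fft.fftfreq(T, dt))
peaks, _ = find_peaks(spec)
peaks = peaks[np.argsort(spec[peaks])[-n:]]    # n strongest peaks
lam_hat = np.sort(-omega[peaks])               # e^{-i lam t} peaks at omega = -lam

# Step 2: mode amplitudes by least squares, then projection onto the
# orthogonal group (polar projection via SVD)
E = np.exp(-1j * np.outer(lam_hat, t))
A = np.linalg.lstsq(E.T, signals.T, rcond=None)[0].T.real  # A[j,k] ~ O[j,k]*O[j0,k]
V = A / np.sqrt(np.abs(A[j0]))
u, _, wt = np.linalg.svd(V)
O_hat = u @ wt                                 # nearest orthogonal matrix to V

H_hat = O_hat @ np.diag(lam_hat) @ O_hat.T
print("reconstruction error:", np.linalg.norm(H_hat - H))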
Hybrid classical-quantum algorithms aim to solve optimisation problems variationally, using a feedback loop between a classical computer and a quantum co-processor, while benefitting from quantum resources. Here we present experiments demonstrating self-verifying, hybrid, variational quantum simulation of lattice models in condensed matter and high-energy physics. Contrary to analog quantum simulation, this approach forgoes the requirement of realising the targeted Hamiltonian directly in the laboratory, thus allowing the study of a wide variety of previously intractable target models. We focus on the lattice Schwinger model, a gauge theory of 1D quantum electrodynamics. Our quantum co-processor is a programmable, trapped-ion analog quantum simulator with up to 20 qubits, capable of generating families of entangled trial states respecting the symmetries of the target Hamiltonian. We determine ground states and energy gaps, and, by measuring variances of the Schwinger Hamiltonian, we provide algorithmic error bars for energies, thus addressing the long-standing challenge of verifying quantum simulation.
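The self-verification idea can be emulated classically, as in the sketch below, where a small Heisenberg chain stands in for the lattice Schwinger model: a layered variational ansatz is optimized, and the variance of the Hamiltonian supplies the algorithmic error bar, since at least one eigenvalue of $H$ lies within $\sqrt{\mathrm{Var}(H)}$ of the measured energy. The ansatz and model are illustrative assumptions, not the experimental circuit.

import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(3)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(P, site, n):
    out = np.array([[1.]], dtype=complex)
    for j in range(n):
        out = np.kron(out, P if j == site else I2)
    return out

n = 4
bonds = [(i, i + 1) for i in range(n - 1)]
H = sum(embed(P, i, n) @ embed(P, j, n) for i, j in bonds for P in (X, Y, Z))

# layered ansatz: single-qubit Y rotations and ZZ entanglers
gens = ([embed(Y, i, n) for i in range(n)]
        + [embed(Z, i, n) @ embed(Z, j, n) for i, j in bonds]
        + [embed(Y, i, n) for i in range(n)])

psi0 = np.zeros(2**n, dtype=complex)
psi0[int('0101', 2)] = 1.0                      # Neel-like reference state

def state(theta):
    psi = psi0
    for th, G in zip(theta, gens):
        psi = expm(-1j * th * G) @ psi
    return psi

def energy(theta):
    psi = state(theta)
    return np.real(psi.conj() @ H @ psi)

res = minimize(energy, 0.1 * rng.standard_normal(len(gens)), method='BFGS')
psi = state(res.x)
E = np.real(psi.conj() @ H @ psi)
var = np.real(psi.conj() @ H @ H @ psi) - E**2  # measured on-device in the experiment
print(f"E = {E:.4f} +/- {np.sqrt(max(var, 0)):.4f} (algorithmic error bar)")
print("exact ground energy:", np.linalg.eigvalsh(H)[0])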
We add quantum fluctuations to a classical Hamiltonian model with synchronized period doubling in the thermodynamic limit, replacing the $N$ classical interacting angular momenta with quantum spins of size $l$. The full permutation symmetry of the Hamiltonian allows a mapping to a bosonic model and the application of exact diagonalization for quite large system sizes. In the thermodynamic limit $N \to \infty$ the model is described by a system of Gross-Pitaevskii equations whose classical-chaos properties closely mirror the finite-$N$ quantum chaos. For $N \to \infty$ and finite $l$, Rabi oscillations mark the absence of persistent period doubling, which is recovered for $l \to \infty$, with the Rabi-oscillation frequency tending exponentially to zero. For the chosen initial conditions, we can represent this model in terms of Pauli matrices and apply the discrete truncated Wigner approximation. For finite $l$ this approximation does not reproduce the Rabi oscillations but correctly predicts the absence of persistent period doubling. Quantitative agreement is recovered in the classical limit $l \to \infty$.
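For reference, the discrete truncated Wigner approximation itself is compact enough to sketch. The numpy example below applies it to a generic spin-1/2 model (a transverse-field Ising chain, chosen for brevity rather than the paper's collective model): transverse spin components are sampled from the discrete Wigner distribution of the initial product state and propagated along classical mean-field trajectories, whose average approximates the quantum expectation values. All parameters are illustrative.

import numpy as np

rng = np.random.default_rng(2)
n, ntraj, J, h = 10, 2000, 1.0, 0.7
dt, steps = 0.02, 500

def derivs(s):
    """Mean-field Bloch equations ds_i/dt = 2 s_i x B_i for
    H = -J sum s^z_i s^z_{i+1} - h sum s^x_i (periodic chain via np.roll)."""
    Bz = J * (np.roll(s[..., 2], 1, axis=-1) + np.roll(s[..., 2], -1, axis=-1))
    B = np.stack([np.full_like(Bz, h), np.zeros_like(Bz), Bz], axis=-1)
    return 2.0 * np.cross(s, B)

# discrete Wigner sampling of the initial product state polarized along +x:
# transverse components drawn from {-1, +1} with equal probability
s = np.empty((ntraj, n, 3))
s[..., 0] = 1.0
s[..., 1] = rng.choice([-1.0, 1.0], size=(ntraj, n))
s[..., 2] = rng.choice([-1.0, 1.0], size=(ntraj, n))

mx = []
for _ in range(steps):                      # classical RK4 trajectories
    mx.append(s[..., 0].mean())
    k1 = derivs(s); k2 = derivs(s + 0.5 * dt * k1)
    k3 = derivs(s + 0.5 * dt * k2); k4 = derivs(s + dt * k3)
    s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
print("<s^x>(t=0) =", mx[0], " <s^x>(t=10) =", round(mx[-1], 4))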
Computing finite-temperature properties of quantum many-body systems is generally challenging for classical computers because of the high computational complexity involved. In this article, we present experiments demonstrating a hybrid quantum-classical simulation of thermal quantum states. By combining a classical probabilistic model and a 5-qubit programmable superconducting quantum processor, we prepare Gibbs states and excited states of Heisenberg XY and XXZ models with high fidelity and compute thermal properties, including the variational free energy, energy, and entropy, with small statistical errors. Our approach combines the advantages of classical probabilistic models for sampling with those of quantum co-processors for unitary transformations. We show that the approach is scalable in the number of qubits and has a self-verifying feature, revealing its potential for solving large-scale quantum statistical mechanics problems on near-term intermediate-scale quantum computers.
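The variational structure can be emulated classically in a few lines. In the sketch below, a factorized Bernoulli distribution plays the role of the classical probabilistic model and a layered unitary plays the role of the quantum co-processor; the variational free energy $F = \mathrm{Tr}(\rho H) - T S(\rho)$ is minimized for $\rho = \sum_x p(x)\, U|x\rangle\langle x|U^\dagger$ and compared against the exact free energy $-T \log Z$. Model, ansatz, and parameters are illustrative assumptions, not the 5-qubit experiment.

import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(4)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def embed(P, site, n):
    out = np.array([[1.]], dtype=complex)
    for j in range(n):
        out = np.kron(out, P if j == site else I2)
    return out

n, T = 3, 1.0
bonds = [(i, i + 1) for i in range(n - 1)]
H = sum(embed(P, i, n) @ embed(P, j, n) for i, j in bonds for P in (X, Y))  # XY chain

gens = ([embed(Y, i, n) for i in range(n)]
        + [embed(Z, i, n) @ embed(Z, j, n) for i, j in bonds]
        + [embed(Y, i, n) for i in range(n)])

def probs(b):
    """Classical model: factorized Bernoulli distribution over bitstrings."""
    p = np.array([1.0])
    for bi in b:
        q = 1.0 / (1.0 + np.exp(-bi))
        p = np.kron(p, [1.0 - q, q])
    return p

def unitary(theta):
    U = np.eye(2**n, dtype=complex)
    for th, G in zip(theta, gens):
        U = expm(-1j * th * G) @ U
    return U

def free_energy(params):
    """F = sum_x p(x) <x|U^dag H U|x> - T S(p); S(rho) = S(p) since U|x> are orthonormal."""
    p, U = probs(params[:n]), unitary(params[n:])
    E = np.real(np.diag(U.conj().T @ H @ U))   # energies of the rotated basis states
    S = -np.sum(p * np.log(p + 1e-12))         # entropy carried by the classical model
    return p @ E - T * S

res = minimize(free_energy, 0.1 * rng.standard_normal(n + len(gens)), method='BFGS')
w = np.linalg.eigvalsh(H)
print("variational F:", round(res.fun, 4),
      " exact F:", round(-T * np.log(np.sum(np.exp(-w / T))), 4))

By the Gibbs variational principle the optimized value upper-bounds the exact free energy, which is the self-verifying feature mentioned above: the gap between the two printed numbers directly certifies the quality of the prepared thermal state.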