There have been multiple attempts to demonstrate that quantum annealing, and in particular quantum annealing on dedicated annealing machines, has the potential to outperform current classical optimization algorithms implemented on CMOS technologies. The benchmarking of these devices has been controversial. Initially, random spin-glass problems were used; however, these were quickly shown to be ill suited to detecting any quantum speedup. Benchmarking subsequently shifted to carefully crafted synthetic problems designed to highlight the quantum nature of the hardware while (often) ensuring that classical optimization techniques perform poorly on them. Even so, to date a true sign of improved scaling with the number of problem variables remains elusive when compared to classical optimization techniques. Here, we analyze the readiness of quantum annealing machines for real-world application problems. These are typically not random and have an underlying structure that is hard to capture in synthetic benchmarks, thus posing unexpected challenges for classical and quantum optimization techniques alike. We present a comprehensive computational scaling analysis of fault diagnosis in digital circuits, considering architectures beyond D-Wave quantum annealers. We find that the instances generated from real data in multiplier circuits are harder than other representative random spin-glass benchmarks with a comparable number of variables. Although our results show that transverse-field quantum annealing is outperformed by state-of-the-art classical optimization algorithms, these benchmark instances are hard yet small in input size, and therefore represent the first industrial application ideally suited for testing near-term quantum annealers and other quantum algorithmic strategies for optimization problems.
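For intuition about how circuit fault diagnosis becomes an optimization problem of the kind described above, the following is a minimal sketch, not the paper's actual encoding: a toy two-gate circuit (w = a AND b, out = w AND c) with fault indicator variables, where a diagnosis is a minimum-energy assignment. The energy counts the number of faults plus a large penalty M for any healthy gate whose truth table is violated. Here the minimum is found by classical brute force; an annealer would search the same energy landscape. All names and the penalty form are illustrative assumptions.

```python
import itertools

def and_viol(x, y, z):
    """1 if z disagrees with x AND y (gate function violated), else 0."""
    return int(z != (x & y))

def diagnose(a, b, c, out, M=10):
    """Exhaustively minimize the fault-diagnosis energy for a toy
    two-gate circuit: w = a AND b, then out = w AND c.
    Decision variables: internal wire w and fault flags f1, f2.
    Energy = (# of faults) + M * (violations at healthy gates);
    a faulty gate's output is unconstrained."""
    best = None
    for w, f1, f2 in itertools.product([0, 1], repeat=3):
        e = (f1 + f2
             + M * (1 - f1) * and_viol(a, b, w)
             + M * (1 - f2) * and_viol(w, c, out))
        if best is None or e < best[0]:
            best = (e, w, f1, f2)
    return best

# Observed: inputs all 1 but output 0, so at least one gate must be faulty.
print(diagnose(1, 1, 1, 0))   # -> (1, 0, 1, 0): energy 1, a single-fault diagnosis
```

Real instances replace the brute-force loop with annealing over thousands of such gate penalties; the hardness comes from the structured way these local penalties interlock, which random spin glasses do not reproduce.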
Although the current information revolution is still unfolding, the next industrial revolution is already rearing its head. A second quantum revolution based on quantum technology will power this new industrial revolution, with quantum computers as its engines. The development of quantum computing will turn quantum theory into quantum technology, thereby releasing the power of quantum phenomena and accelerating the progress of science and technology. Building a large-scale quantum computer sits at the juncture of science and engineering. Even if large-scale quantum computers become reality, they will not make conventional computers obsolete any time soon. Building a large-scale quantum computer is a dauntingly complex engineering problem, requiring the integration of ultra-low-temperature and room-temperature components and the bridging of the microscopic and macroscopic worlds. We have already built hundreds of physical qubits but are still working toward logical and topological qubits. Since physical qubits cannot tolerate errors, they cannot yet be used to perform the long, precise calculations needed to solve practically useful problems.
Recent years have seen an overwhelming interest in quantum thermodynamics, a field of research aimed at understanding thermodynamic tasks performed in the quantum regime. Further progress, however, seems to be obstructed by the lack of experimental implementations of thermal machines in which quantum effects play a decisive role. In this work, we introduce a blueprint of quantum field machines (QFMs), which, once experimentally realized, would fill this gap. Although the concept of the QFM presented here is very general and can be implemented in any many-body quantum system that can be described by a quantum field theory, we provide a detailed proposal for how to realize such a machine in one-dimensional ultra-cold atomic gases, built from a set of modular operational primitives. These can be coupled sequentially to thermal baths, with the innovation that a quantum field takes up the role of the working fluid. In particular, we propose models for compressing the system so that it acts as a piston, and for coupling it to a bath via a valve controlling heat flow. These models are derived within Bogoliubov theory, which allows us to study the operational primitives numerically in an efficient way. By composing the numerically modelled operational primitives, we design complete quantum thermodynamic cycles that are shown to enable cooling, hence giving rise to a quantum field refrigerator. The active cooling achieved in this way can operate in regimes where existing cooling methods become ineffective. We describe the consequences of operating the machine at the quantum level and give an outlook on how this work serves as a road map for exploring open questions in quantum information, quantum thermodynamics, and the study of non-Markovian quantum dynamics.
This article provides an overview of the current state of digital rock technology, with emphasis on industrial applications. We show how imaging and image analysis can be applied for rock typing and modeling of end-point saturations. Different methods to obtain a digital model of the pore space from pore-scale images are presented, and the strengths and weaknesses of each method are discussed. We also show how imaging bridges geology, petrophysics, and reservoir simulation by serving as a common denominator for results across these disciplines. Network modeling is compared to direct simulations on grid models, and their respective strengths are discussed. Finally, we present an example of digital rock technology applied to a sandstone oil reservoir. Results from digital rock modeling are compared to results from traditional laboratory experiments, and we highlight the mutual benefits of conducting both.
The seminal work by Sadi Carnot in the early nineteenth century provided the blueprint of a reversible heat engine, and the celebrated second law of thermodynamics eventually followed. Almost two centuries later, the quest to formulate a quantum theory of the thermodynamic laws has thus unsurprisingly motivated physicists to visualise what are known as "quantum thermal machines" (QTMs). In this article, we review the prominent developments achieved in the theoretical construction as well as understanding of QTMs, beginning from the formulation of their earliest prototypes to recent models. We also present a detailed introduction and highlight recent progress in the rapidly developing field of "quantum batteries".
Quantum annealing (QA) is a hardware-based heuristic optimization and sampling method applicable to discrete undirected graphical models. While similar to simulated annealing, QA relies on quantum, rather than thermal, effects to explore complex search spaces. For many classes of problems, QA is known to offer computational advantages over simulated annealing. Here we report on the ability of recent QA hardware to accelerate training of fully visible Boltzmann machines. We characterize the sampling distribution of QA hardware and show that in many cases the quantum distributions differ significantly from classical Boltzmann distributions. In spite of this difference, training (which seeks to match data and model statistics) using standard classical gradient updates is still effective. We investigate the use of QA for seeding Markov chains as an alternative to contrastive divergence (CD) and persistent contrastive divergence (PCD). Using $k=50$ Gibbs steps, we show that for problems with high-energy barriers between modes, QA-based seeds can improve upon chains with CD and PCD initializations. For these hard problems, QA gradient estimates are more accurate and allow for faster learning. Interestingly, even raw QA samples (that is, $k=0$) achieved similar improvements. We argue that this is because, in this case, we are training a quantum rather than a classical Boltzmann distribution. The learned parameters give rise to hardware QA distributions closely approximating classical Boltzmann distributions that are hard to train with CD/PCD.
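The training scheme described above, matching data and model statistics with Markov chains whose seeds distinguish CD, PCD, and QA-based variants, can be sketched for a small fully visible Boltzmann machine in plain NumPy. This is our own illustrative sketch, not the authors' code: seeding the chains with the data gives CD-k, reusing the returned chains gives PCD, and hardware annealer samples (not available here) would be dropped into the same slot.

```python
import numpy as np

def gibbs_sweep(s, W, b, rng):
    """One full sweep of Gibbs sampling over the +/-1 spins of a fully
    visible Boltzmann machine with symmetric zero-diagonal couplings W
    and biases b. s has shape (n_chains, n_spins)."""
    for i in range(s.shape[1]):
        field = s @ W[:, i] + b[i]               # local field on spin i
        p_up = 1.0 / (1.0 + np.exp(-2.0 * field))
        s[:, i] = np.where(rng.random(s.shape[0]) < p_up, 1.0, -1.0)
    return s

def gradient(data, W, b, k, rng, seeds):
    """Gradient estimate matching data and model statistics. The chains
    in `seeds` are run for k Gibbs sweeps to estimate model expectations:
    seeds = data gives CD-k, persisted chains give PCD, and annealer
    samples would be used directly (k = 0 corresponds to raw samples)."""
    model = seeds.copy()
    for _ in range(k):
        model = gibbs_sweep(model, W, b, rng)
    dW = (data.T @ data - model.T @ model) / len(data)
    np.fill_diagonal(dW, 0.0)
    db = data.mean(axis=0) - model.mean(axis=0)
    return dW, db, model

# Toy training loop: 4 spins, random +/-1 "data", PCD-style persisted chains.
rng = np.random.default_rng(0)
n = 4
W, b = np.zeros((n, n)), np.zeros(n)
data = rng.choice([-1.0, 1.0], size=(64, n))
chains = data.copy()
for _ in range(100):
    dW, db, chains = gradient(data, W, b, k=5, rng=rng, seeds=chains)
    W += 0.05 * dW
    b += 0.05 * db
```

The abstract's point about high-energy barriers shows up here as slow mixing of the Gibbs chains: when modes are well separated, short chains seeded near the data underestimate the model statistics, which is exactly where hardware samples that already populate multiple modes would help.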