
Quantum computing for energy systems optimization: Challenges and opportunities

Published by: Fengqi You
Publication date: 2020
Research field: Physics
Paper language: English





The purpose of this paper is to explore applications of quantum computing to energy systems optimization problems and to discuss some of the challenges faced by quantum computers, along with techniques to overcome them. The basic concepts underlying quantum computation and their distinctive characteristics in comparison to their classical counterparts are also discussed. Along with a description of the hardware architectures of two commercially available quantum systems, an example making use of open-source software tools is provided as a first step toward programming quantum computers to solve systems optimization problems. The trade-offs between these two quantum architectures are also discussed. The complex nature of energy systems, owing to their structure and large number of design and operational constraints, makes energy systems optimization a hard problem for most available algorithms. Problems such as facility location allocation for energy systems infrastructure development, unit commitment in electric power systems operations, and heat exchanger network synthesis, all of which fall under the category of energy systems optimization, are solved using both classical algorithms implemented on conventional CPU-based computers and quantum algorithms realized on quantum computing hardware; their designs, implementations, and results are presented. Additionally, this paper describes the limitations of state-of-the-art quantum computers and their great potential to impact the field of energy systems optimization.
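To make the abstract's pointer to open-source tooling concrete, here is a minimal sketch (not the paper's actual example) that casts a toy two-facility location problem as a QUBO and enumerates it with D-Wave's open-source dimod library; the build costs, the benefit term, and the choice of library are all illustrative assumptions.

```python
# A minimal sketch, assuming a toy two-facility location problem.
# Install with: pip install dimod
import dimod

# Binary variables: x_i = 1 if facility i is built.
build_cost = {"x0": 3.0, "x1": 2.0}   # fixed cost of opening each site
service_benefit = -5.0                # reward if at least one site opens

# Objective: sum_i c_i x_i + benefit * (x0 + x1 - x0*x1)
# The OR indicator x0 + x1 - x0*x1 keeps the benefit from double-counting.
Q = {
    ("x0", "x0"): build_cost["x0"] + service_benefit,
    ("x1", "x1"): build_cost["x1"] + service_benefit,
    ("x0", "x1"): -service_benefit,
}

bqm = dimod.BinaryQuadraticModel.from_qubo(Q)

# ExactSolver enumerates all 2^n states classically; on real hardware one
# would substitute an annealer-backed sampler, leaving the QUBO unchanged.
sampleset = dimod.ExactSolver().sample(bqm)
print(sampleset.first.sample, sampleset.first.energy)  # {'x0': 0, 'x1': 1}, -3.0
```

Swapping ExactSolver for a hardware sampler is the only change needed to run the same model on a quantum annealer; the problem formulation step is identical.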


Read also

The concept of quantum computing has inspired a whole new generation of scientists, including physicists, engineers, and computer scientists, to fundamentally change the landscape of information technology. With experimental demonstrations stretching back more than two decades, the quantum computing community has achieved a major milestone over the past few years: the ability to build systems that are stretching the limits of what can be classically simulated, and which enable cloud-based research for a wide range of scientists, thus increasing the pool of talent exploring early quantum systems. While such noisy near-term quantum computing systems fall far short of the requirements for fault-tolerant systems, they provide unique testbeds for exploring the opportunities for quantum applications. Here we highlight the facets associated with these systems, including quantum software, cloud access, benchmarking quantum systems, error correction and mitigation in such systems, and understanding the complexity of quantum circuits and how early quantum applications can run on near-term quantum computers.
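As a rough illustration of why such systems "stretch the limits of what can be classically simulated", the sketch below simulates a two-qubit Bell circuit with a dense statevector in plain NumPy; the 2^n growth of the state size is exactly what makes this approach infeasible beyond roughly 50 qubits. The circuit and gate set here are illustrative, not drawn from the paper.

```python
# A minimal sketch: dense statevector simulation of a 2-qubit Bell circuit.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0,
                 [0, 1, 0, 0],                 # target  = qubit 1,
                 [0, 0, 0, 1],                 # basis order |00>,|01>,|10>,|11>
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = np.kron(H, I) @ state                  # H on qubit 0
state = CNOT @ state                           # entangle: (|00> + |11>)/sqrt(2)
print(np.round(state, 3))                      # [0.707, 0, 0, 0.707]
```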
Computing has dramatically changed nearly every aspect of our lives, from business and agriculture to communication and entertainment. As a nation, we rely on computing in the design of systems for energy, transportation and defense; and computing fuels scientific discoveries that will improve our fundamental understanding of the world and help develop solutions to major challenges in health and the environment. Computing has changed our world, in part, because our innovations can run on computers whose performance and cost-performance have improved a million-fold over the last few decades. A driving force behind this has been a repeated doubling of the transistors per chip, dubbed Moore's Law. A concomitant enabler has been Dennard Scaling, which has permitted these performance doublings at roughly constant power, but, as we will see, both trends face challenges. Consider for a moment the impact of these two trends over the past 30 years. A 1980s supercomputer (e.g. a Cray 2) was rated at nearly 2 Gflops and consumed nearly 200 kW of power. At the time, it was used for high-performance and national-scale applications ranging from weather forecasting to nuclear weapons research. A computer of similar performance now fits in our pocket and consumes less than 10 watts. What would be the implications of a similar computing/power reduction over the next 30 years - that is, taking a petaflop-scale machine (e.g. the Cray XK7, which requires about 500 kW for 1 Pflop (= 10^15 operations/sec) performance) and repeating that process? What is possible with such a computer in your pocket? How would it change the landscape of high-capacity computing? In the remainder of this paper, we articulate some opportunities and challenges for dramatic performance improvements of both personal- to national-scale computing, and discuss some out-of-the-box possibilities for achieving computing at this scale.
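The arithmetic behind the Cray 2 comparison and the petaflop thought experiment can be checked directly; the figures below are the ones quoted in the passage, and the extrapolation simply repeats the same power reduction.

```python
# Figures as quoted in the passage above; a quick sanity check, not new data.
cray2_flops,  cray2_watts  = 2e9, 200e3   # ~2 Gflops at ~200 kW (1980s Cray 2)
pocket_flops, pocket_watts = 2e9, 10.0    # similar performance, <10 W today

power_reduction = cray2_watts / pocket_watts
print(power_reduction)                    # 20000x less power for equal speed

# Repeating that reduction for a petaflop machine (Cray XK7, ~500 kW):
xk7_watts = 500e3
print(xk7_watts / power_reduction)        # ~25 W: a petaflop "in your pocket"
```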
Addressing the world's climate emergency is an uphill battle and requires a multifaceted approach, including optimal deployment of green-energy alternatives. This often involves time-consuming optimisation of black-box models in a continuous parameter space. Despite recent advances in quantum computing, real-world applications have thus far been mostly confined to problems such as graph partitioning, traffic routing and task scheduling, where the parameter space is discrete and graph connectivity is sparse. Here we propose the quantum nonlinear programming (QNLP) framework for casting an NLP problem - in continuous space - as quadratic unconstrained binary optimisation (QUBO), which can subsequently be solved using special-purpose solvers such as quantum annealers (QAs) and coherent Ising machines (CIMs). QNLP consists of four steps: quadratic approximation of the cost function, discretisation of the parameter space, binarisation of the discrete space, and solving the resulting QUBO. Linear and nonlinear constraints are incorporated into the resulting QUBO using slack variables and quadratic penalty terms. We apply our QNLP framework to optimisation of the daily feed rate of various biomass types at Nature Energy, the largest biogas producer in Europe. Optimising biomass selection improves the profitability of biomethane production, thus contributing to sustainable carbon-neutral energy production. For solving the QUBO, we use D-Wave's quantum annealers. We observe good performance on the DW-2000Q QPU, and higher sensitivity of performance to the number of samples and annealing time for the Advantage QPU. We hope that our proposed QNLP framework provides a meaningful step towards overcoming the computational challenges posed by high-dimensional continuous-optimisation problems, especially those encountered in our battle against man-made climate change.
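A hedged sketch of the QNLP pipeline's middle steps, discretisation and binarisation of a continuous variable into a QUBO, shown on a toy one-dimensional quadratic cost; the 4-bit encoding, the target value 2.3, and the brute-force solve are all illustrative stand-ins for the paper's annealer-based workflow.

```python
# A minimal sketch, assuming a toy cost f(x) = (x - 2.3)**2 over [0, 4].
import itertools

lo, hi, n_bits = 0.0, 4.0, 4
step = (hi - lo) / (2**n_bits - 1)       # x = lo + step * sum_j 2^j * b_j
target = 2.3
weights = [step * 2**j for j in range(n_bits)]
a = lo - target

# Expand (a + sum_j w_j b_j)^2 into QUBO form, using b_j^2 == b_j and
# dropping the constant a^2, which does not affect the minimiser.
Q = {}
for j in range(n_bits):
    Q[(j, j)] = 2 * a * weights[j] + weights[j] ** 2
    for k in range(j + 1, n_bits):
        Q[(j, k)] = 2 * weights[j] * weights[k]

def qubo_energy(bits):
    return sum(c * bits[j] * bits[k] for (j, k), c in Q.items())

# Brute force stands in for a quantum annealer / CIM on this tiny instance.
best = min(itertools.product([0, 1], repeat=n_bits), key=qubo_energy)
x_best = lo + step * sum(2**j * b for j, b in enumerate(best))
print(best, round(x_best, 3))            # grid point closest to 2.3 (2.4)
```

Constraints would enter at this same stage as quadratic penalty terms added to Q, with slack bits encoded exactly like the variable bits above.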
With quantum computing technologies nearing the era of commercialization and quantum supremacy, machine learning (ML) appears as one of the promising killer applications. Despite significant effort, there has been a disconnect between most quantum ML proposals, the needs of ML practitioners, and the capabilities of near-term quantum devices to demonstrate quantum enhancement in the near future. In this contribution to the focus collection on "What would you do with 1000 qubits?", we provide concrete examples of intractable ML tasks that could be enhanced with near-term devices. We argue that to reach this target, the focus should be on areas where ML researchers are struggling, such as generative models in unsupervised and semi-supervised learning, instead of the popular and more tractable supervised learning techniques. We also highlight the case of classical datasets with potential quantum-like statistical correlations where quantum models could be more suitable. We focus on hybrid quantum-classical approaches and illustrate some of the key challenges we foresee for near-term implementations. Finally, we introduce the quantum-assisted Helmholtz machine (QAHM), an attempt to use near-term quantum devices to tackle high-dimensional datasets of continuous variables. Instead of using quantum computers to assist deep learning, as previous approaches do, the QAHM uses deep learning to extract a low-dimensional binary representation of data, suitable for relatively small quantum processors which can assist the training of an unsupervised generative model. Although we illustrate this concept on a quantum annealer, other quantum platforms could benefit as well from this hybrid quantum-classical framework.
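A loose, assumption-laden sketch of the division of labour the QAHM proposes: a classical model compresses continuous data to a small binary code, which a quantum sampler could then model as a generative prior. The snippet below substitutes PCA plus sign-thresholding for the authors' deep network, and the quantum step is only indicated in a comment; everything here is illustrative rather than the paper's method.

```python
# A minimal sketch, not the authors' QAHM code.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))     # stand-in "high-dimensional" data

# Classical compression step (PCA stands in for a deep encoder here):
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:4].T                  # 4 continuous latent factors

# Binarise the latent code; a small quantum annealer / QPU could now be
# trained as a generative prior over these 4-bit strings.
B = (Z > 0).astype(int)
codes, counts = np.unique(B, axis=0, return_counts=True)
print(len(codes), "distinct codes; most frequent:", codes[counts.argmax()])
```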
Services computing can offer a high-level abstraction to support diverse applications via encapsulating various computing infrastructures. Though services computing has greatly boosted the productivity of developers, it is faced with three main challenges: privacy and security risks, information silos, and pricing mechanisms and incentives. The recent advances of blockchain bring opportunities to address the challenges of services computing due to its built-in encryption and digital signature schemes, decentralized design, and intrinsic incentive mechanisms. In this paper, we present a survey to investigate the integration of blockchain with services computing. This integration mainly exhibits merits in two aspects: i) blockchain can potentially address key challenges of services computing, and ii) services computing can also promote blockchain development. In particular, we categorize the current literature of blockchain-based services computing into five types: services creation, services discovery, services recommendation, services composition, and services arbitration. Moreover, we generalize the Blockchain as a Service (BaaS) architecture and summarize representative BaaS platforms. In addition, we outline open issues of blockchain-based services computing and BaaS.