Quantum simulations of Fermi-Hubbard models have attracted considerable effort in optical lattice research, with the ultracold antiferromagnetic atomic phase reached at half filling in recent years. An unresolved challenge is doping the system away from half filling while maintaining low thermal entropy. Here we propose to reach the low-temperature phase of the doped Fermi-Hubbard model using incommensurate optical lattices through adiabatic quantum evolution. In this theoretical proposal, we find that a major obstacle to adiabatic doping is atomic localization in the incommensurate lattice, which can cause exponential slowing down of the adiabatic procedure. We study both one- and two-dimensional incommensurate optical lattices and find that, in both cases, localization prevents efficient adiabatic doping in the strong-lattice regime. With density matrix renormalization group calculations, we further show that the slowing-down problem in one dimension can be circumvented by exploiting interaction-induced many-body delocalization, which is experimentally feasible using Feshbach resonance techniques. This protocol is expected to be efficient in two dimensions as well, where the localization phenomenon is less stable.
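The single-particle localization referred to above can be illustrated with a minimal sketch. The abstract does not specify the lattice model, so as an assumption we take the standard 1D Aubry-André tight-binding model of an incommensurate lattice (hopping J, quasiperiodic on-site potential of strength λ with irrational wavenumber β), which exhibits a localization transition at λ = 2J. The inverse participation ratio (IPR) of the ground state distinguishes the two regimes: it scales as ~1/L for extended states and stays of order one for localized ones.

```python
import numpy as np

def aubry_andre_ipr(L=233, J=1.0, lam=1.0, phi=0.0):
    """Inverse participation ratio of the lowest eigenstate of the
    1D Aubry-Andre model (illustrative sketch, not the paper's code):
        H = -J * sum_n (|n><n+1| + h.c.)
            + lam * sum_n cos(2*pi*beta*n + phi) |n><n|
    with beta the inverse golden ratio (incommensurate with the lattice)."""
    beta = (np.sqrt(5.0) - 1.0) / 2.0
    n = np.arange(L)
    # Quasiperiodic on-site potential plus nearest-neighbor hopping.
    H = np.diag(lam * np.cos(2.0 * np.pi * beta * n + phi))
    H += np.diag(-J * np.ones(L - 1), 1) + np.diag(-J * np.ones(L - 1), -1)
    vals, vecs = np.linalg.eigh(H)
    psi = vecs[:, 0]                 # normalized ground state
    return float(np.sum(np.abs(psi) ** 4))

# Below the transition (lam < 2J) the IPR is small (state spread over ~L sites);
# above it (lam > 2J) the IPR is of order one (state pinned to a few sites).
print(aubry_andre_ipr(lam=0.5))
print(aubry_andre_ipr(lam=4.0))
```

In this sketch the localized-regime IPR follows the localization length ξ ≈ 1/ln(λ/2J); interactions, as discussed in the abstract, can restore transport where this single-particle picture predicts localization.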