Minimizing nonadiabaticities in optical-lattice loading


Abstract

In the quest to reach lower temperatures of ultracold gases in optical-lattice experiments, nonadiabaticities during lattice loading are one of the limiting factors that prevent the same low temperatures from being reached as in lattice-free experiments. Simulating the loading of a bosonic quantum gas into a one-dimensional optical lattice, both with and without a trap, we find that the redistribution of atomic density inside a global confining potential is by far the dominant source of heating. Based on these results, we propose adjusting the trapping potential during loading so as to minimize changes to the density distribution. Our simulations confirm that even a simple linear interpolation of the trapping potential during loading already significantly reduces the heating of the quantum gas, and we discuss how loading protocols that minimize density redistribution can be designed.
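The linear-interpolation protocol described in the abstract can be sketched as a pair of ramp schedules: the lattice depth is turned on while the trap is simultaneously interpolated between its initial and final values. The function names, the ramp time `T`, and all numerical parameters below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def lattice_ramp(t, T, V_final):
    """Lattice depth ramped linearly from 0 to V_final over loading time T
    (a common simple choice; real experiments often use smoother ramps)."""
    return V_final * t / T

def trap_ramp(t, T, omega_i, omega_f):
    """Trap frequency linearly interpolated from its initial value (chosen
    for the lattice-free gas) to a final value intended to keep the density
    distribution approximately unchanged once the lattice is on."""
    return omega_i + (omega_f - omega_i) * t / T

# Example schedule over 100 time steps of a loading ramp
T = 10.0                                          # total loading time (arbitrary units)
times = np.linspace(0.0, T, 100)
depths = lattice_ramp(times, T, V_final=8.0)      # lattice depth, e.g. in recoil energies
traps = trap_ramp(times, T, omega_i=1.0, omega_f=0.6)
```

Holding the trap fixed corresponds to `omega_f = omega_i`; the abstract's point is that co-ramping the trap (here linearly) suppresses the density redistribution that dominates the heating.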