Explaining the observed velocity dispersion of dwarf galaxies by baryonic mass loss during the first collapse


Abstract

In the widely adopted LambdaCDM scenario for galaxy formation, dwarf galaxies are the building blocks of larger galaxies. Since they formed at early epochs, when the background density was relatively high, they are expected to retain their integrity as satellite galaxies when they merge to form larger entities. Although many dwarf spheroidal galaxies (dSphs) are found in the galactic halo around the Milky Way, their phase-space density (or velocity dispersion) appears to be significantly smaller than that expected for satellite dwarf galaxies in the LambdaCDM scenario. To account for this discrepancy, we consider the possibility that they may have lost a significant fraction of their baryonic matter content during the first infall at the Hubble expansion turnaround. Such mass loss arises naturally from the feedback of relatively massive stars that formed in their centers shortly before maximum contraction. Through a series of N-body simulations, we show that the timely loss of a significant fraction of the dSphs' initial baryonic matter content can have profound effects on their asymptotic half-mass radius, velocity dispersion, phase-space density, and the mass ratio of residual baryonic to dark matter.
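To make the qualitative argument concrete, the following Python sketch (not taken from the paper) evaluates the standard impulsive mass-loss estimate: if a virialized system instantaneously loses a fraction eps of its mass, and the expelled baryons were distributed like the remaining matter, energy conservation between the initial and revirialized states implies that the residual system expands while its velocity dispersion and phase-space density drop.

# Minimal sketch (illustrative only, not the paper's N-body setup):
# impulsive loss of a mass fraction eps from a virialized system,
# assuming the expelled gas traced the remaining mass distribution.
# Energy conservation between the initial and revirialized states gives
#   r_f / r_i          = (1 - eps) / (1 - 2*eps)   (unbound for eps >= 0.5)
#   sigma_f / sigma_i  = sqrt(1 - 2*eps)
#   Q_f / Q_i          = (1 - 2*eps)**1.5 / (1 - eps)**2,  with Q ~ rho / sigma^3

import numpy as np

def impulsive_mass_loss(eps):
    """Return (radius, velocity-dispersion, phase-space-density) ratios
    after instantaneous loss of a mass fraction eps (0 <= eps < 0.5)."""
    if np.any(np.asarray(eps) >= 0.5):
        raise ValueError("system becomes unbound for eps >= 0.5")
    r_ratio = (1.0 - eps) / (1.0 - 2.0 * eps)
    sigma_ratio = np.sqrt(1.0 - 2.0 * eps)
    q_ratio = (1.0 - 2.0 * eps) ** 1.5 / (1.0 - eps) ** 2
    return r_ratio, sigma_ratio, q_ratio

for eps in (0.1, 0.2, 0.3, 0.4):
    r, s, q = impulsive_mass_loss(eps)
    print(f"eps={eps:.1f}: r_f/r_i={r:.2f}  sigma_f/sigma_i={s:.2f}  Q_f/Q_i={q:.2f}")

For example, losing 40 per cent of the mass (eps = 0.4) triples the radius and reduces the velocity dispersion by roughly half, which is the qualitative trend described above; the full N-body simulations in the paper treat the time-dependent, non-homologous case.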
