Testing for a large local void by investigating the Near-Infrared Galaxy Luminosity Function


Abstract in English

Recent cosmological modeling efforts have shown that a local underdensity on scales of a few hundred Mpc (out to z ~ 0.1) could produce the apparent acceleration of the expansion of the universe observed via type Ia supernovae. Several studies of galaxy counts in the near-infrared (NIR) have found that the local universe appears under-dense by ~25-50% compared with regions a few hundred Mpc distant. Galaxy counts at low redshifts sample primarily L ~ L* galaxies. Thus, if the local universe is under-dense, the normalization of the NIR galaxy luminosity function (LF) at z > 0.1 should be higher than that measured at z < 0.1. Here we present a highly complete (> 90%) spectroscopic sample of 1436 galaxies selected in the H-band to study the normalization of the NIR LF at 0.1 < z < 0.3 and to address the question of whether we reside in a large local underdensity. We find that, for the combination of our six fields, the product phi* L* at 0.1 < z < 0.3 is ~30% higher than that measured at lower redshifts. While the statistical errors in this measurement are at the ~10% level, the systematics due to cosmic variance may be larger still. We investigate the effects of cosmic variance on our measurement using the COSMOS cone mock catalogs from the Millennium simulation and recent empirical estimates. We find that our survey is subject to systematic uncertainties due to cosmic variance at the 15% level (1 sigma), an improvement by a factor of ~2 over previous studies in this redshift range. We conclude that observations cannot yet rule out the possibility that the local universe is under-dense at z < 0.1.
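Because low-redshift counts sample primarily L ~ L* galaxies, the quantity being compared across redshift bins is the product phi* L*, which for a Schechter LF with faint-end slope alpha fixes the total luminosity density up to a factor Gamma(alpha + 2). A minimal sketch of that relation follows; the numerical values are purely illustrative assumptions, not the measurements reported here:

```python
from math import gamma

def lum_density(phi_star, L_star, alpha):
    """Total luminosity density of a Schechter LF:
    j = integral of L * phi(L) dL = phi* * L* * Gamma(alpha + 2)."""
    return phi_star * L_star * gamma(alpha + 2.0)

# Hypothetical illustrative values (NOT the paper's fitted parameters):
# phi* in Mpc^-3, L* in solar luminosities, alpha dimensionless.
j_local = lum_density(5.0e-3, 1.0e10, -1.0)  # z < 0.1 bin
j_far = 1.30 * j_local                       # ~30% higher phi* L* at 0.1 < z < 0.3

# A 30% excess in phi* L* at higher z implies the local bin is
# under-dense by 1 - 1/1.3, i.e. about 23%, relative to the distant bin.
print(f"implied local underdensity: {1.0 - j_local / j_far:.0%}")
```

Note that for alpha = -1 the gamma factor is Gamma(1) = 1, so phi* L* and the luminosity density coincide; for other slopes the ratio between redshift bins is unchanged as long as alpha is held fixed.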