The distributed absorption of photons in photodiodes induces excess noise in continuous-wave photodetection above the transit-time roll-off frequency. We show that this noise can be treated as a frequency-dependent excess optical loss in homodyne detection. This places a limit on the bandwidth of high-accuracy homodyne detection, even if an ideal photodetector circuit is available. We experimentally verify the excess loss in two ways: a comparison of the signal gain and shot-noise gain of one-port homodyne detection, and balanced homodyne detection of squeezed light at a 500 MHz sideband. These results agree with an analytic expression we develop, in which the randomness of the photoabsorption is directly modeled as the intrusion of a vacuum field. At 500 MHz, we estimate the excess loss at 14% for a Si-PIN photodiode with 860 nm incident light, while numerical simulation predicts a much smaller excess loss in InGaAs photodiodes with 1550 nm light.
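As a minimal illustrative sketch (not the paper's analytic expression), the effect of treating the excess noise as optical loss can be shown with the standard beam-splitter loss model, in which the signal is mixed with a vacuum field of unit variance in shot-noise units. The 14% figure is the excess loss quoted above for a Si-PIN photodiode at 500 MHz; the 3 dB input squeezing is a hypothetical value chosen only for the example.

```python
import numpy as np

def measured_variance(v_in, eta):
    """Quadrature variance after a beam-splitter loss model.

    Loss is modeled as mixing the signal with vacuum (variance 1 in
    shot-noise units): V_out = eta * V_in + (1 - eta).
    """
    return eta * v_in + (1.0 - eta)

# Hypothetical input: 3 dB of squeezing reaching the photodiode.
v_squeezed = 10 ** (-3.0 / 10)   # -3 dB relative to shot noise

# Transmission corresponding to the 14% excess loss estimated
# at the 500 MHz sideband for a Si-PIN photodiode (860 nm light).
eta_excess = 1.0 - 0.14

v_meas = measured_variance(v_squeezed, eta_excess)
print(f"measured squeezing: {10 * np.log10(v_meas):.2f} dB")
# With these numbers, the observable squeezing degrades from
# -3.00 dB to about -2.43 dB from this loss channel alone.
```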