We investigate the impact of pulse interleaving and optical amplification on the spectral purity of microwave signals generated by photodetecting the pulsed output of an Er:fiber-based optical frequency comb. We show that the microwave phase noise floor can be extremely sensitive to delay-length errors in the interleaver, and that the contribution of quantum noise from optical amplification to the phase noise can be reduced by ~10 dB when short pulses are detected. By exploiting optical amplification in conjunction with high-power-handling modified uni-traveling-carrier photodetectors, we generate a phase noise floor on a 10 GHz carrier of -175 dBc/Hz, the lowest ever demonstrated in the photodetection of a mode-locked fiber laser. At all offset frequencies, the photodetected 10 GHz phase noise performance is comparable to or better than the lowest phase noise results yet demonstrated with stabilized Ti:sapphire frequency combs.
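As a rough illustration of the scale of the reported floor (a sketch, not a figure taken from this work), consider the standard shot-noise estimate for short-pulse photodetection, assuming an average photocurrent $I_{\mathrm{avg}}$ of roughly 25 mA; this current value is an assumption made here only for illustration and is not stated in the abstract. For pulses much shorter than the 10 GHz period, each photocurrent harmonic approaches an amplitude of $2 I_{\mathrm{avg}}$, so the carrier power delivered to a load $R$ is $P_c = 2 I_{\mathrm{avg}}^2 R$, while the single-sided shot-noise power spectral density is $2 e I_{\mathrm{avg}} R$, with $e$ the electron charge. Taking half of the noise to lie in the phase quadrature, the single-sideband phase noise floor is approximately
\[
\mathcal{L}(f) \approx \frac{e I_{\mathrm{avg}} R}{2 I_{\mathrm{avg}}^2 R} = \frac{e}{2 I_{\mathrm{avg}}},
\]
which for the assumed 25 mA evaluates to $10\log_{10}\!\left[e/(2 \times 25\ \mathrm{mA})\right] \approx -175$ dBc/Hz, consistent in magnitude with the floor quoted above. Amplifier noise and its ~10 dB reduction for short-pulse detection are not captured by this simple estimate.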