A thorough spectral study of the intrinsic single-photon detection efficiency of superconducting TaN and NbN nanowires of different widths shows that the experimental cut-off in the efficiency at near-infrared wavelengths is most likely caused by a local deficiency of Cooper pairs available for current transport. For both materials the reciprocal cut-off wavelength scales with the wire width, and the scaling factor quantitatively agrees with hot-spot detection models. Comparison of the experimental data with vortex-assisted detection scenarios shows that those models predict a stronger dependence of the cut-off wavelength on the wire width than is observed.
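The width scaling stated above can be illustrated schematically. In a simple hot-spot picture (a generic sketch, not the authors' exact expression), an absorbed photon of energy $hc/\lambda$ depletes Cooper pairs over a region of the wire cross-section; detection occurs when the bias current, expelled from that region, exceeds the depairing current in the remaining width. Energy balance then gives, approximately,

```latex
% Schematic hot-spot cut-off condition (illustrative form only):
% the minimum detectable photon energy grows linearly with wire width w
% at a fixed reduced bias current I_b / I_dep.
\frac{hc}{\lambda_c} \;\propto\; w \left( 1 - \frac{I_b}{I_{\mathrm{dep}}} \right)
\quad\Longrightarrow\quad
\frac{1}{\lambda_c} \;\propto\; w ,
```

so the reciprocal cut-off wavelength $1/\lambda_c$ is linear in the width $w$, consistent with the scaling reported here; the symbols $I_b$ (bias current) and $I_{\mathrm{dep}}$ (depairing current) are assumptions of this sketch rather than quantities defined in the abstract.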