We apply a fundamental definition of the time delay, namely the difference between the time a particle spends within a finite region of a potential and the time a free particle spends in the same region, to obtain time delays for the photoionization of an electron by an extreme ultraviolet (XUV) laser field using numerical simulations on a grid. For the short-range Yukawa potential, our numerical results agree well with the Wigner-Smith time delay, obtained as the derivative of the phase shift of the scattering wave packet with respect to its energy. In the case of the Coulomb potential we obtain time delays for any finite region, while, as expected, the results do not converge as the size of the region increases towards infinity. The impact of an ultrashort near-infrared probe pulse on the time delay is analyzed for both the Yukawa and the Coulomb potential and is found to be small for intensities below $10^{13}$ W/cm$^2$.
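The two notions of time delay compared above can be written compactly as follows; the notation here ($\tau_d$, $P_R$, $\delta_\ell$) is illustrative and not necessarily that of the paper:
\begin{align}
% Dwell-time-based delay: time the photoelectron wave packet spends
% inside a sphere of radius R, minus the corresponding free-particle time.
\tau_d(R) &= \int_0^\infty \mathrm{d}t\,\bigl[P_R(t) - P_R^{\mathrm{free}}(t)\bigr],
\qquad
P_R(t) = \int_{|\mathbf{r}|<R} |\Psi(\mathbf{r},t)|^2\,\mathrm{d}^3 r, \\
% Wigner-Smith delay: energy derivative of the scattering phase shift
% in partial wave l, evaluated at the wave packet's mean energy.
\tau_{\mathrm{WS}} &= \hbar\,\frac{\mathrm{d}\delta_\ell(E)}{\mathrm{d}E}.
\end{align}
For a short-range potential such as the Yukawa potential, $\tau_d(R)$ converges to $\tau_{\mathrm{WS}}$ as $R \to \infty$; for the long-range Coulomb potential, the logarithmic phase distortion prevents this limit from existing, consistent with the non-convergence noted above.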