Let $f$ be a band-limited function in $L^2(\mathbb{R})$. Fix $T>0$ and suppose $f'$ exists and is integrable on $[-T,T]$. This paper gives a concrete estimate of the error incurred when approximating $f$ in the root mean square by a partial sum of its Hermite series. Specifically, we show, for $K=2n$, $n\in\mathbb{Z}_+$,
$$
\left[\frac{1}{2T}\int_{-T}^T[f(t)-(S_Kf)(t)]^2\,dt\right]^{1/2}
\leq \left(1+\frac 1K\right)\left(\left[\frac{1}{2T}\int_{|t|>T}f(t)^2\,dt\right]^{1/2}
+\left[\frac{1}{2T}\int_{|\omega|>N}|\hat f(\omega)|^2\,d\omega\right]^{1/2}\right)
+\frac{1}{K}\left[\frac{1}{2T}\int_{|t|\leq T}f_N(t)^2\,dt\right]^{1/2}
+\frac{1}{\pi}\left(1+\frac{1}{2K}\right)S_a(K,T),
$$
in which $S_Kf$ is the $K$-th partial sum of the Hermite series of $f$, $\hat f$ is the Fourier transform of $f$, $\displaystyle{N=\frac{\sqrt{2K+1}+\sqrt{2K+3}}{2}}$, and $f_N=(\hat f\,\chi_{(-N,N)})^\vee(t)=\frac{1}{\pi}\int_{-\infty}^{\infty}\frac{\sin(N(t-s))}{t-s}f(s)\,ds$. An explicit upper bound is obtained for $S_a(K,T)$.
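As a minimal numerical sketch (not part of the paper), the cutoff frequency $N=\frac{\sqrt{2K+1}+\sqrt{2K+3}}{2}$ that separates the retained and truncated parts of the spectrum can be computed directly; the helper name `cutoff_frequency` is illustrative, not from the source.

```python
import math

def cutoff_frequency(K: int) -> float:
    """Cutoff N = (sqrt(2K+1) + sqrt(2K+3)) / 2 associated with the
    K-th Hermite partial sum, as defined in the abstract."""
    return (math.sqrt(2 * K + 1) + math.sqrt(2 * K + 3)) / 2

# For even K = 2n, the error bound compares f with its low-pass part f_N,
# whose spectrum is restricted to |omega| < N; N grows like sqrt(2K).
for K in (2, 10, 100):
    print(f"K = {K:3d}  ->  N = {cutoff_frequency(K):.6f}")
```

Note that $N \approx \sqrt{2K+2}$ for large $K$, so enlarging the Hermite partial sum widens the captured frequency band only at a square-root rate.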