We consider Hermitian random band matrices $H=(h_{xy})$ on the $d$-dimensional lattice $(\mathbb Z/L\mathbb Z)^d$. The entries $h_{xy}$ are independent (up to the Hermitian condition) centered complex Gaussian random variables with variances $s_{xy}=\mathbb E|h_{xy}|^2$. The variance matrix $S=(s_{xy})$ has a banded structure so that $s_{xy}$ is negligible if $|x-y|$ exceeds the band width $W$. In dimensions $d\ge 8$, we prove that, as long as $W\ge L^\epsilon$ for a small constant $\epsilon>0$, with high probability most bulk eigenvectors of $H$ are delocalized in the sense that their localization lengths are comparable to $L$. Denote by $G(z)=(H-z)^{-1}$ the Green's function of the band matrix. For $\mathrm{Im}\, z\gg W^2/L^2$, we also prove a widely used criterion in physics for quantum diffusion of this model, namely, that the leading term in the Fourier transform of $\mathbb E|G_{xy}(z)|^2$ with respect to $x-y$ is of the form $(\mathrm{Im}\, z + a(p))^{-1}$ for some $a(p)$ quadratic in $p$, where $p$ is the Fourier variable. Our method is based on an expansion of $T_{xy}=|m|^2 \sum_{\alpha}s_{x\alpha}|G_{\alpha y}|^2$, and it requires a self-energy renormalization up to error $W^{-K}$ for any large constant $K$ independent of $W$ and $L$. We expect that this method can be extended to non-Gaussian band matrices.
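As a schematic restatement of the quantum diffusion criterion above (the prefactor $c$ and the precise quadratic form of $a(p)$ are placeholders, not specified in this abstract):
\[
\sum_{x\in(\mathbb Z/L\mathbb Z)^d} \mathbb E\,|G_{xy}(z)|^2\, e^{\mathrm i\, p\cdot(x-y)} \;\approx\; \frac{c}{\mathrm{Im}\, z + a(p)}, \qquad a(p)\sim |p|^2 \ \text{as } p\to 0,
\]
in the regime $\mathrm{Im}\, z\gg W^2/L^2$.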