We present an optical picture of linear-optics superradiance, based on a single scattering event embedded in a dispersive effective medium composed of the other atoms. This linear-dispersion theory is valid at low density and in the single-scattering regime, i.e., when the exciting field is far detuned. A comparison with the coupled-dipole model shows perfect agreement for the superradiant decay rate. We then exploit two advantages of this approach. First, we make a direct comparison with experimental data, without any free parameter, and find good quantitative agreement. Second, we address the problem of moving atoms, which can be efficiently simulated by adding Doppler broadening to the theory. In particular, we discuss how to recover superradiance at high temperature.