[abridged] Using stacked Sloan Digital Sky Survey spectra, we present the detection of [OII]3727,3730 nebular emission from galaxies hosting CaII and MgII absorption-line systems. Both samples of absorbers, 345 CaII systems and 3461 MgII systems, span the redshift interval 0.4 < z < 1.3; all of the former and half of the latter sample are expected to be bona fide damped Lyman-alpha (DLA) absorbers. The measured star formation rate (SFR) per absorber, from light falling within the SDSS fibre apertures (corresponding to physical radii of 6-9 h^-1 kpc), is 0.11-0.14 Msol/yr for the MgII-selected DLAs and 0.11-0.48 Msol/yr for the CaII absorbers. These results represent the first estimates of the average SFR in an absorption-selected galaxy population from the direct detection of nebular emission. Adopting the currently favoured model in which DLAs are large, with radii >9 h^-1 kpc, and assuming no attenuation by dust, leads to the conclusion that the SFR per unit area of MgII-selected DLAs falls an order of magnitude below the prediction of the Schmidt law, which relates the SFR per unit area to the HI column density, as calibrated at z~0. The contributions of both DLA and CaII absorbers to the total observed star formation rate density in the redshift range 0.4 < z < 1.3 are small, <10% and <3% respectively. This result contrasts with the conclusions of Hopkins et al. that DLA absorbers can account for the majority of the total observed SFR density in the same redshift range. Our results effectively rule out a picture in which a large fraction of the total SFR density at redshifts z < 1 occurs in DLA absorbers.
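As a minimal sketch of the comparison referred to above, the fibre SFR can be converted to a surface density and set against the standard z~0 Kennicutt (1998) calibration of the Schmidt law; the mid-range values 0.12 Msol/yr and 7.5 h^-1 kpc are taken from the ranges quoted in the abstract, while the normalisation and exponent of the calibration are assumed here rather than quoted from the paper:

\[
  \Sigma_{\rm SFR} \;\simeq\; \frac{{\rm SFR}_{\rm fibre}}{\pi R_{\rm fibre}^{2}}
  \;\approx\; \frac{0.12\,M_\odot\,{\rm yr^{-1}}}{\pi\,(7.5\,h^{-1}\,{\rm kpc})^{2}}
  \;\sim\; 7\times10^{-4}\,h^{2}\,M_\odot\,{\rm yr^{-1}\,kpc^{-2}},
\]

to be compared with the Schmidt-law prediction at the gas surface density implied by the HI column,

\[
  \Sigma_{\rm SFR}^{\rm Schmidt} \;=\; 2.5\times10^{-4}
  \left(\frac{\Sigma_{\rm gas}}{1\,M_\odot\,{\rm pc^{-2}}}\right)^{1.4}
  M_\odot\,{\rm yr^{-1}\,kpc^{-2}}.
\]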