We use a sample of 4178 Lyman-break galaxies (LBGs) at z = 3, 4 and 5 in the UKIRT Infrared Deep Sky Survey (UKIDSS) Ultra Deep Survey (UDS) field to investigate the relationship between the observed slope of the stellar continuum emission in the ultraviolet, β, and the thermal dust emission, as quantified via the so-called infrared excess (IRX = LIR/LUV). Through a stacking analysis we directly measure the 850-μm flux density of the LBGs in our deep (0.9 mJy) James Clerk Maxwell Telescope (JCMT) SCUBA-2 850-μm map, as well as in deep public Herschel/SPIRE 250-, 350- and 500-μm imaging. We establish functional forms for the IRX-β relation out to z ~ 5, confirming that the relation shows no significant redshift evolution and that the resulting average IRX-β curve is consistent with a Calzetti-like attenuation law. Comparing our results with recent work in the literature, we find that discrepancies in the slope of the IRX-β relation are driven by biases in the methodology used to determine the ultraviolet slopes. Consistent results are obtained when IRX is evaluated by stacking in bins of stellar mass, M, and we argue that the near-linear IRX-M relation is a better proxy for correcting observed UV luminosities to total star formation rates, provided M can be measured accurately; it also offers clues to the physical driver of dust-obscured star formation in high-redshift galaxies.
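To make the IRX-β connection concrete, the sketch below converts a UV continuum slope β into an infrared excess using the widely cited Meurer et al. (1999) starburst calibration, A1600 = 4.43 + 1.99β, together with a simple energy-balance step. The calibration coefficients, the zero-dust floor, and the neglect of bolometric corrections are all illustrative assumptions; they are not the functional forms fitted in this work.

```python
import math


def irx_from_beta(beta: float) -> float:
    """Infrared excess (IRX = L_IR / L_UV) implied by a UV slope beta.

    Uses the Meurer et al. (1999) relation A_1600 = 4.43 + 1.99*beta as an
    illustrative example; this is NOT the relation derived in the paper.
    """
    # UV attenuation in magnitudes, floored at zero for very blue slopes.
    a_1600 = max(4.43 + 1.99 * beta, 0.0)
    # Energy balance: UV light absorbed by dust is re-emitted in the infrared,
    # so IRX ~ 10^(0.4 * A_1600) - 1 (bolometric corrections of order unity
    # are ignored here for simplicity).
    return 10 ** (0.4 * a_1600) - 1.0


# Bluer slopes imply less dust: beta = -2.23 gives IRX ~ 0,
# while a redder slope of beta = -1 gives IRX ~ 8.5.
print(irx_from_beta(-2.23), irx_from_beta(-1.0))
```

The near-linear IRX-M alternative discussed in the abstract would replace β with stellar mass as the independent variable, sidestepping the UV-slope measurement biases noted above.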