The relation between infrared excess (IRX) and the UV spectral slope ($\beta_{\rm UV}$) is an empirical probe of the dust properties of galaxies. However, the shape, scatter, and redshift evolution of this relation are not well understood, leading to uncertainties in estimating the dust content and star formation rates (SFRs) of galaxies at high redshift. In this study, we explore the nature and properties of the IRX-$\beta_{\rm UV}$ relation with a sample of $z=2-6$ galaxies ($M_*\approx 10^9-10^{12}\,M_\odot$) extracted from high-resolution cosmological simulations (MassiveFIRE) of the Feedback in Realistic Environments (FIRE) project. The galaxies in our sample show an IRX-$\beta_{\rm UV}$ relation in good agreement with the observed relation in nearby galaxies. IRX is tightly coupled to the UV optical depth and is mainly determined by the dust-to-star geometry rather than by the total dust mass, while $\beta_{\rm UV}$ is set by stellar properties, the UV optical depth, and the dust extinction law. Overall, much of the scatter in the IRX-$\beta_{\rm UV}$ relation of our sample is driven by variations in the intrinsic UV spectral slope. We further assess how the IRX-$\beta_{\rm UV}$ relation depends on viewing direction, dust-to-metal ratio, birth-cloud structure, and the dust extinction law, and we present a simple model that encapsulates most of these dependencies. Consequently, we argue that the reported `deficit' of infrared/sub-millimetre-bright objects at $z>5$ does not necessarily imply a non-standard dust extinction law at those epochs.