We use a sample of star-forming field and protocluster galaxies at z=2.0-2.5, with Keck/MOSFIRE K-band spectra, extensive rest-frame UV photometry, and Spitzer/MIPS and Herschel/PACS observations, to dissect the relation between the IR-to-UV luminosity ratio (IRX) and the UV continuum slope ($\beta$) as a function of gas-phase metallicity (12+log(O/H)~8.2-8.7). We find no significant dependence of the IRX-$\beta$ relation on environment. However, at a given $\beta$, IRX correlates strongly with metallicity and more weakly with stellar mass, age, and sSFR. We conclude that, of the physical properties tested here, metallicity is the primary driver of the IRX-$\beta$ scatter, and that the correlation of IRX with mass presumably reflects the dependence of metallicity on mass. Our results indicate that the UV attenuation curve steepens with decreasing metallicity, spanning the full range of slopes from a shallow Calzetti-type curve for the highest-metallicity galaxies in our sample (12+log(O/H)~8.6) to a steep SMC-like curve for those with 12+log(O/H)~8.3. Adopting a Calzetti (SMC) curve for the low- (high-) metallicity galaxies can lead to an overestimation (underestimation) of the UV attenuation and obscured SFR by up to a factor of 3. We speculate that this trend reflects differences in the properties of dust grains in the ISM of low- and high-metallicity galaxies.
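For reference, the two quantities above follow the standard definitions; the wavelength windows noted in the comments are a common convention in the literature, not necessarily the exact one adopted in this work:

```latex
% IRX: ratio of total infrared to UV luminosity,
% where L_UV is typically the monochromatic luminosity near 1600 A
\mathrm{IRX} \equiv \frac{L_{\mathrm{IR}}}{L_{\mathrm{UV}}},
\qquad
% beta: power-law slope of the rest-frame UV continuum,
% commonly fit over roughly 1250-2600 A
f_{\lambda} \propto \lambda^{\beta}
```

A redder (larger) $\beta$ generally indicates more dust reddening, so the location of a galaxy in the IRX-$\beta$ plane constrains the shape of its attenuation curve.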