Recent analyses of the gamma-ray spectrum of the ultra-luminous infrared galaxy Arp 220 have revealed a discrepancy between the cosmic-ray energy injection rates derived from the gamma-ray and radio emission. While the observed radio emission is consistent with the star formation rate inferred from infrared observations, a significantly larger cosmic-ray population is required to reproduce the measured gamma-ray flux. To reconcile the radio and gamma-ray observations, we find that we must both increase the cosmic-ray energy injection rate and account for an infrared optical depth greater than unity. Raising the injection rate naturally increases the total gamma-ray flux, but it also increases the radio flux unless the energy loss rate for cosmic-ray leptons rises as well. An optically thick medium increases the inverse Compton losses of cosmic-ray leptons while preserving agreement with submillimeter, millimeter, and infrared observations.
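A minimal sketch of the loss-rate argument, assuming Thomson-regime scattering and generic magnetic and radiation energy densities $U_B$ and $U_{\rm rad}$ (illustrative symbols, not values quoted in the analysis):

% Synchrotron vs. inverse Compton cooling for a relativistic lepton (Thomson limit):
\[
  \frac{\dot{E}_{\rm syn}}{\dot{E}_{\rm IC}} \simeq \frac{U_B}{U_{\rm rad}},
  \qquad U_B = \frac{B^2}{8\pi} .
\]

An infrared optical depth above unity traps the radiation field and raises $U_{\rm rad}$, so a larger fraction of the lepton energy is lost to inverse Compton scattering rather than synchrotron emission, and the radio flux need not grow in step with the increased injection rate.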