Star-forming galaxies emit GeV and TeV gamma rays that are thought to originate from hadronic interactions of cosmic-ray (CR) nuclei with the interstellar medium. To understand this emission, we have used the moving-mesh code Arepo to perform magneto-hydrodynamical galaxy formation simulations with self-consistent CR physics. Our galaxy models exhibit an initial burst of star formation that injects CRs at supernovae. Once CRs have sufficiently accumulated in our Milky Way-like galaxy, their buoyancy force overcomes the magnetic tension of the toroidal disk field. As the field lines open up, they enable anisotropically diffusing CRs to escape into the halo and to accelerate a bubble-like, CR-dominated outflow. However, these bubbles are invisible in our simulated gamma-ray maps of hadronic pion-decay and secondary inverse-Compton emission because of the low gas density in the outflows. By adopting a phenomenological relation between star formation rate (SFR) and far-infrared emission, and assuming that gamma rays mainly originate from decaying pions, our simulated galaxies reproduce the observed tight relation between far-infrared and gamma-ray emission, independent of whether we account for anisotropic CR diffusion. This demonstrates that uncertainties in modeling active CR transport processes play only a minor role in predicting the gamma-ray emission from galaxies. We find that in starbursts, most of the CR energy is lost to hadronic interactions, so that these galaxies behave as CR calorimeters. In contrast, at low SFRs the gamma-ray emission falls short of this calorimetric limit due to adiabatic losses, which cannot be captured in traditional one-zone models.
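A widely used Kennicutt-type calibration of the SFR–far-infrared relation, assumed here purely for illustration (the exact normalization adopted in the simulations may differ), is

\[
  \frac{\mathrm{SFR}}{\mathrm{M_\odot\,yr^{-1}}}
  \simeq 4.5 \times 10^{-44}\,
  \frac{L_{\mathrm{FIR}}}{\mathrm{erg\,s^{-1}}},
\]

where $L_{\mathrm{FIR}}$ denotes the 8--1000~$\mu$m luminosity (Kennicutt 1998). For context on the calorimetric limit: inelastic proton–proton collisions produce $\pi^0$, $\pi^+$, and $\pi^-$ in roughly equal numbers, and only the neutral pions decay into photons, so approximately one third of the hadronically dissipated CR energy emerges as pion-decay gamma rays.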