The interstellar dust content of galaxies can be traced in extinction at optical wavelengths or in emission in the far-infrared (FIR). Several studies have found that radiative transfer models which successfully reproduce the optical extinction in edge-on spiral galaxies generally underestimate the observed FIR/submm fluxes by a factor of about three. To investigate this so-called dust energy balance problem, we use two Milky Way-like galaxies produced by high-resolution hydrodynamical simulations. We create mock optical edge-on views of these simulated galaxies with the radiative transfer code SKIRT, and then fit the parameters of a basic spiral galaxy model to these images with the fitting code FitSKIRT. The basic model comprises smooth axisymmetric components: a Sérsic bulge and an exponential disc for the stars, and a second exponential disc for the dust. We find that the dust mass recovered by the fitted models is about three times smaller than the known dust mass of the hydrodynamical input models, a factor in agreement with previous energy balance studies of real edge-on spiral galaxies. By contrast, fitting the same basic model to less complex input models (e.g. a smooth exponential disc with a spiral perturbation or with random clumps) recovers the dust mass of the input model almost perfectly. It thus appears that the complex asymmetries and inhomogeneous structure of real and hydrodynamically simulated galaxies are far more efficient at hiding dust than the rather contrived geometries of typical quasi-analytical models. This effect may help explain the discrepancy between the dust emission predicted by radiative transfer models and the observed emission in energy balance studies of edge-on spiral galaxies.
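As a rough illustration of the profiles entering the basic model, the following sketch evaluates a Sérsic surface-brightness profile and a face-on exponential disc profile. The function names, default parameter values, and the common approximation b_n ≈ 2n − 1/3 for the Sérsic shape constant are illustrative assumptions, not taken from the fitted models themselves.

```python
import math

def sersic(R, I_e=1.0, R_e=1.0, n=4.0):
    """Sersic surface-brightness profile I(R).

    I_e is the brightness at the effective radius R_e; n is the
    Sersic index. Uses the common approximation b_n ~ 2n - 1/3,
    adequate for n >~ 1 (an assumed simplification).
    """
    b_n = 2.0 * n - 1.0 / 3.0
    return I_e * math.exp(-b_n * ((R / R_e) ** (1.0 / n) - 1.0))

def exponential_disc(R, I_0=1.0, h=1.0):
    """Face-on exponential disc profile with central brightness
    I_0 and radial scale length h."""
    return I_0 * math.exp(-R / h)

# By construction, the Sersic profile equals I_e at R = R_e,
# and the disc profile falls to I_0/e at one scale length.
print(sersic(1.0))            # ~1.0
print(exponential_disc(1.0))  # ~0.368
```

In the actual fits, such components are combined (with vertical structure and dust attenuation) and their parameters optimised against the mock images; this snippet only shows the radial forms.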