The radioactive decay of the freshly synthesized $r$-process nuclei ejected in compact binary mergers powers the optical/infrared macronovae (kilonovae) that follow these events. The light curves depend critically on the energy partition among the different products of the radioactive decay, and this partition plays an important role in estimating the amount of ejected $r$-process elements from a given observed signal. We study the energy partition and $\gamma$-ray emission of the radioactive decay. We show that $20\%$-$50\%$ of the total radioactive energy is released in $\gamma$-rays on timescales from hours to a month. The number of emitted $\gamma$-rays per unit energy interval has a roughly flat spectrum between a few dozen keV and $1$ MeV, so that most of this energy is carried by $\sim 1$ MeV $\gamma$-rays. However, at the peak of the macronova emission the optical depth of the $\gamma$-rays is $\sim 0.02$, and most of the $\gamma$-rays escape. The loss of these $\gamma$-rays reduces the heat deposition into the ejecta and hence reduces the expected macronova signals if those are lanthanide dominated. This implies that the ejected mass is larger by a factor of $2$-$3$ than previously estimated. Spontaneous fission heats up the ejecta, and the heating rate can increase if a sufficient amount of transuranic nuclei is synthesized. Direct measurements of these escaping $\gamma$-rays may provide the ultimate proof of the macronova mechanisms and an identification of the $r$-process nucleosynthesis sites. However, the chances of detecting these signals with current X-ray and $\gamma$-ray missions are slim. New detectors, more sensitive by at least a factor of ten, are needed for a realistic detection rate.
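
As an illustrative order-of-magnitude check of the quoted $\gamma$-ray optical depth, one may consider homologously expanding ejecta of mass $M_{\rm ej}$ and velocity $v$ at time $t$; the fiducial values used below (opacity $\kappa_\gamma$, $M_{\rm ej}$, $v$, $t$) are assumptions for this sketch, not parameters stated above:
% Hedged sketch: optical depth of ~MeV gamma-rays in homologously expanding ejecta.
% The fiducial numbers (kappa_gamma, M_ej, v, t) are illustrative assumptions only.
\begin{equation}
  \tau_\gamma \simeq \frac{\kappa_\gamma M_{\rm ej}}{4\pi (v t)^2}
  \approx 0.01\,
  \left(\frac{\kappa_\gamma}{0.05~{\rm cm^2\,g^{-1}}}\right)
  \left(\frac{M_{\rm ej}}{0.01\,M_\odot}\right)
  \left(\frac{v}{0.2c}\right)^{-2}
  \left(\frac{t}{5~{\rm days}}\right)^{-2},
\end{equation}
which is of the same order as the $\sim 0.02$ quoted above; the precise value depends on the assumed $\gamma$-ray opacity and on the ejecta mass, velocity, and density profile.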