The energy spectrum of alpha particles emitted from a single isotope uniformly contaminating a bulk solid is flat at high energies, with a high-end cutoff equal to the maximal alpha kinetic energy ($T_\alpha$) of the decay. In this flat region of the spectrum, we show that the surface rate $r_b\,\text{(Bq/keV-cm}^{2})$ arising from a bulk alpha contamination $\rho_b$ (Bq/cm$^3$) of a single isotope is given by $r_b = \rho_b\,\Delta R / (4\,\Delta E)$, where $\Delta E = E_1-E_2>0$ is the energy interval considered (keV) in the flat region of the spectrum and $\Delta R = R_2-R_1$, where $R_2$ ($R_1$) is the thickness of bulk material (cm) necessary to degrade the alpha energy from $T_\alpha$ to $E_2$ ($E_1$). We compare our calculation to a rate measurement of alphas from $^{147}$Sm ($15.32\%\pm0.03\%$ of $^{\rm nat}$Sm, half-life $(1.06\pm0.01)\times10^{11}\,\text{yr}$) and find good agreement, with a prediction-to-measurement ratio of $100.2\%\pm 1.6\%\,\text{(stat)}\pm 2.1\%\,\text{(sys)}$. We derive the condition for a flat spectrum, and also calculate the relationship between the decay rate measured at the surface and a near-surface contamination with an exponential dependence on depth, as well as a second case of an alpha source with a thin overcoat. While there is excellent agreement between our implementation of the sophisticated Monte Carlo program SRIM and our intuitive model in all cases, both fail to describe the measured energy distribution of a $^{148}$Gd alpha source with a thin ($\sim200\,\mu$g/cm$^2$) Au overcoat. We discuss possible origins of the disagreement and suggest avenues for future study.
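As an illustrative aside (not part of the abstract proper), the relation $r_b = \rho_b\,\Delta R/(4\,\Delta E)$ can be evaluated numerically. The following is a minimal sketch; the activity density, range, and energy values used in the example call are hypothetical placeholders, not measurements from this work.

```python
def surface_rate(rho_b, R1, R2, E1, E2):
    """Surface alpha rate r_b (Bq/keV-cm^2) from bulk contamination.

    rho_b  : bulk activity density (Bq/cm^3)
    R1, R2 : material thickness (cm) needed to degrade the alpha from
             T_alpha down to E1 and E2, respectively (R2 > R1, since
             reaching the lower energy E2 requires more material)
    E1, E2 : bounds of the flat spectral region (keV), with E1 > E2
    """
    dE = E1 - E2          # energy interval Delta E (keV), positive
    dR = R2 - R1          # range interval Delta R (cm), positive
    return rho_b * dR / (4.0 * dE)

# Hypothetical illustrative numbers: 1 Bq/cm^3 bulk activity, ranges of
# 5 and 10 microns, and a flat window spanning 2000 down to 1000 keV.
r = surface_rate(1.0, 5e-4, 10e-4, 2000.0, 1000.0)
```

The factor of 4 reflects that only alphas emitted into the outward hemisphere, averaged over emission angle, escape the surface within the chosen energy window.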