The response of the antenna is a source of uncertainty in measurements with the Experiment to Detect the Global EoR Signature (EDGES). We aim to validate the beam model of the low-band (50-100 MHz) dipole antenna with comparisons between models and against data. We find that simulations of a simplified model of the antenna over an infinite perfectly conducting ground plane are, with one exception, robust to changes of numerical electromagnetic solver code or algorithm. For simulations of the antenna with the actual finite ground plane and realistic soil properties, we find that two out of three numerical solvers agree well. Applying our analysis pipeline to a simulated driftscan observation from an early EDGES low-band instrument that had a 10 m $\times$ 10 m ground plane, we find that residuals after fitting and removing a five-term foreground model from data binned in Local Sidereal Time (LST) average about 250 mK, with $\pm$40 mK variation between numerical solvers. A similar analysis of the primary 30 m $\times$ 30 m sawtooth ground plane reduces the LST-averaged residuals to about 90 mK, with $\pm$10 mK variation between the two viable solvers. More broadly, we show that larger ground planes generally perform better than smaller ones. Simulated data have a power within 4\% of real observations, a limit set by the net accuracy of the sky and beam models. We observe that residual spectral structures after foreground model fits match qualitatively between simulated data and observations, suggesting that the frequency dependence of the beam is reasonably represented by the models. We find that a soil conductivity of 0.02 S m$^{-1}$ and a relative permittivity of 3.5 yield good agreement between simulated spectra and observations. This is consistent with the soil properties reported by Sutinjo et al. (2015) for the Murchison Radio-astronomy Observatory, where EDGES is located.
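
As an illustration of the fitting step described above, one five-term foreground model commonly used in EDGES analyses is a linear-in-logarithm ("linlog") polynomial; the specific basis adopted in this work is not stated here, so the form below should be read as an assumed sketch rather than the paper's definitive model:
\begin{equation}
T_{\mathrm{fg}}(\nu) = \left(\frac{\nu}{\nu_0}\right)^{-2.5} \sum_{n=0}^{4} a_n \left[\ln\left(\frac{\nu}{\nu_0}\right)\right]^{n},
\end{equation}
where $\nu_0$ is a reference frequency (for example the 75 MHz band center) and the coefficients $a_n$ are obtained by linear least squares for each LST-binned spectrum; the residuals quoted above are what remains after subtracting the best-fit $T_{\mathrm{fg}}(\nu)$.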