We use a suite of high-resolution cosmological dwarf galaxy simulations to test the accuracy of commonly used mass estimators from Walker et al. (2009) and Wolf et al. (2010), both of which depend on the observed line-of-sight velocity dispersion and the 2D half-light radius of the galaxy, $R_e$. The simulations are part of the Feedback in Realistic Environments (FIRE) project and include twelve systems with stellar masses spanning $10^{5} - 10^{7}\, M_{\odot}$ that have structural and kinematic properties similar to those of observed dispersion-supported dwarfs. Both estimators are found to be quite accurate: $M_{\rm Wolf}/M_{\rm true} = 0.98^{+0.19}_{-0.12}$ and $M_{\rm Walker}/M_{\rm true} = 1.07^{+0.21}_{-0.15}$, with errors reflecting the 68% range over all simulations. The excellent performance of these estimators is remarkable given that they each assume spherical symmetry, a supposition that is broken in our simulated galaxies. Though our dwarfs have negligible rotation support, their 3D stellar distributions are flattened, with short-to-long axis ratios $c/a \simeq 0.4 - 0.7$. The accuracy of the estimators shows no trend with asphericity. Our simulated galaxies have sphericalized stellar profiles in 3D that follow a nearly universal form, one that transitions from a core at small radius to a steep fall-off $\propto r^{-4.2}$ at large $r$; they are well fit by S\'ersic profiles in projection. We find that the most important empirical quantity affecting mass estimator accuracy is $R_e$. Determining $R_e$ by an analytic fit to the surface density profile produces a better estimated mass than if the half-light radius is determined via direct summation.
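As a point of reference, the two estimators discussed above are typically quoted in the forms below. The abstract does not state the formulas explicitly, so the prefactors here (2.5 for Walker et al. 2009 and 4 for Wolf et al. 2010, the latter using $r_{1/2} \approx \tfrac{4}{3} R_e$) are taken from the commonly cited versions of those papers and should be treated as assumptions; the example numbers are purely illustrative.

```python
# Hedged sketch of the Walker et al. (2009) and Wolf et al. (2010)
# dynamical mass estimators in their commonly quoted forms.
# Units: sigma_los in km/s, R_e (projected half-light radius) in pc,
# masses in solar masses.

G = 4.301e-3  # gravitational constant in pc (km/s)^2 / M_sun


def m_walker(sigma_los, r_e):
    """Walker et al. (2009): M(R_e) ~ 2.5 * sigma_los^2 * R_e / G."""
    return 2.5 * sigma_los**2 * r_e / G


def m_wolf(sigma_los, r_e):
    """Wolf et al. (2010): M_1/2 ~ 4 * sigma_los^2 * R_e / G,
    equivalent to 3 * sigma_los^2 * r_half / G with r_half ~ (4/3) R_e."""
    return 4.0 * sigma_los**2 * r_e / G


# Illustrative dwarf: sigma_los = 8 km/s, R_e = 300 pc
# (hypothetical values, not drawn from the simulations described above).
print(f"M_Walker = {m_walker(8.0, 300.0):.2e} Msun")
print(f"M_Wolf   = {m_wolf(8.0, 300.0):.2e} Msun")
```

The ratio $M_{\rm Wolf}/M_{\rm Walker} = 4/2.5 = 1.6$ is fixed by construction here; the abstract's result is that both track $M_{\rm true}$ (evaluated at their respective radii) to within about 20%.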