We present measurements of the star formation efficiency (SFE) in 3D numerical simulations of driven turbulence in supercritical, ideal-MHD, and non-magnetic regimes, characterized by their mean normalized mass-to-flux ratio $\mu$, all with 64 Jeans masses and similar rms Mach numbers ($\sim 10$). In most cases, the moderately supercritical runs with $\mu = 2.8$ have significantly lower SFEs than the non-magnetic cases, comparable to observational estimates for whole molecular clouds ($\lesssim 5\%$ over 4 Myr). Also, as the mean field is increased, the number of collapsed objects decreases, and the median mass of the collapsed objects increases. However, the largest collapsed-object masses systematically occur in the weak-field case $\mu = 8.8$. The high-density tails of the density histograms in the simulations are depressed as the mean magnetic field strength is increased. This suggests that the smaller numbers and larger masses of the collapsed objects in the magnetic cases may be due to a greater scarcity and lower mean densities (implying larger Jeans masses) of the collapse candidates. In this scenario, the effect of a weak field is to reduce the probability of a core reaching its thermal Jeans mass, even if it is supercritical. We thus suggest that the SFE may be monotonically reduced as the field strength increases from zero to subcritical values, rather than there being a discontinuous transition between the sub- and supercritical regimes, and that a crucial question to address is whether the turbulence in molecular clouds is driven or decaying, with current observational and theoretical evidence favoring (albeit inconclusively) the driven regime.
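
For reference, the normalized mass-to-flux ratio $\mu$ quoted above is conventionally defined relative to the critical value for magnetic support against gravitational collapse; the sketch below uses the standard normalization, which may differ slightly from the exact convention adopted in the simulations:

```latex
\mu \equiv \frac{M/\Phi}{\left(M/\Phi\right)_{\rm crit}},
\qquad
\left(\frac{M}{\Phi}\right)_{\rm crit} \simeq \frac{1}{2\pi\sqrt{G}},
```

so that $\mu > 1$ (supercritical) permits gravitational collapse despite the magnetic field, while $\mu < 1$ (subcritical) implies magnetic support; the runs discussed here ($\mu = 2.8$, $\mu = 8.8$, and non-magnetic) all lie in the supercritical regime.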