We show that supersonic MHD turbulence yields a star formation rate (SFR) as low as observed in molecular clouds (MCs), for characteristic values of the free-fall time divided by the dynamical time, $t_{\rm ff}/t_{\rm dyn}$, the Alfv\'enic Mach number, ${\cal M}_{\rm a}$, and the sonic Mach number, ${\cal M}_{\rm s}$. Using a very large set of deep adaptive-mesh-refinement simulations, we quantify the dependence of the SFR per free-fall time, $\epsilon_{\rm ff}$, on the above parameters. Our main results are: i) $\epsilon_{\rm ff}$ decreases exponentially with increasing $t_{\rm ff}/t_{\rm dyn}$, but is insensitive to changes in ${\cal M}_{\rm s}$, for constant values of $t_{\rm ff}/t_{\rm dyn}$ and ${\cal M}_{\rm a}$. ii) Decreasing values of ${\cal M}_{\rm a}$ (stronger magnetic fields) reduce $\epsilon_{\rm ff}$, but only to a point, beyond which $\epsilon_{\rm ff}$ increases with a further decrease of ${\cal M}_{\rm a}$. iii) For values of ${\cal M}_{\rm a}$ characteristic of star-forming regions, $\epsilon_{\rm ff}$ varies with ${\cal M}_{\rm a}$ by less than a factor of two. We propose a simple star-formation law, based on the empirical fit to the minimum $\epsilon_{\rm ff}$, and depending only on $t_{\rm ff}/t_{\rm dyn}$: $\epsilon_{\rm ff} \approx \epsilon_{\rm wind} \exp(-1.6\, t_{\rm ff}/t_{\rm dyn})$. Because it only depends on the mean gas density and rms velocity, this law is straightforward to implement in simulations and analytical models of galaxy formation and evolution.
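As an illustration of how the proposed law can be evaluated from the mean gas density and rms velocity alone, the Python sketch below computes $\epsilon_{\rm ff}$ under commonly used definitions: $t_{\rm ff}=\sqrt{3\pi/(32 G \bar{\rho})}$ for the free-fall time at the mean density, and a dynamical time of the form $t_{\rm dyn}=L/(2\sigma_v)$, with $L$ a characteristic cloud size and $\sigma_v$ the 3D rms velocity. The $t_{\rm dyn}$ definition, the input values, and the $\epsilon_{\rm wind}$ prefactor are assumptions for illustration only; they are not specified in the abstract.

\begin{verbatim}
import numpy as np

G = 6.674e-8  # gravitational constant [cm^3 g^-1 s^-2]

def epsilon_ff(mean_density, rms_velocity, cloud_size, eps_wind):
    """Evaluate eps_ff ~ eps_wind * exp(-1.6 * t_ff / t_dyn).

    Assumed definitions (not given in the abstract):
      t_ff  = sqrt(3*pi / (32*G*rho))  free-fall time at the mean density
      t_dyn = L / (2*sigma_v)          turbulence crossing (dynamical) time
    Inputs in CGS units: g cm^-3, cm s^-1, cm.
    eps_wind is the wind/outflow efficiency prefactor, left as a free parameter.
    """
    t_ff = np.sqrt(3.0 * np.pi / (32.0 * G * mean_density))
    t_dyn = cloud_size / (2.0 * rms_velocity)
    return eps_wind * np.exp(-1.6 * t_ff / t_dyn)

# Illustrative cloud: n ~ 100 cm^-3, sigma_v ~ 2 km/s, L ~ 10 pc.
rho = 100 * 2.8 * 1.66e-24   # g cm^-3 (mean molecular weight ~ 2.8 m_H)
sigma = 2.0e5                # cm s^-1
L = 10 * 3.086e18            # cm
print(epsilon_ff(rho, sigma, L, eps_wind=0.5))  # eps_wind value is hypothetical
\end{verbatim}

For these illustrative numbers, $t_{\rm ff}/t_{\rm dyn} \approx 1.3$, giving $\epsilon_{\rm ff}$ of order a few percent, in line with the low efficiencies quoted above.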