We present a study of a star formation prescription in which the star formation efficiency depends on the local gas density and turbulent velocity dispersion, as suggested by direct simulations of star formation in turbulent giant molecular clouds (GMCs). We test the model using a simulation of an isolated Milky Way-sized galaxy with a self-consistent treatment of turbulence on unresolved scales. We show that this prescription predicts a wide variation of the local star formation efficiency per free-fall time, $\epsilon_{\rm ff} \sim 0.1-10\%$, and gas depletion time, $t_{\rm dep} \sim 0.1-10$ Gyr. In addition, it predicts an effective density threshold for star formation due to the suppression of $\epsilon_{\rm ff}$ in warm diffuse gas stabilized by thermal pressure. We show that the model predicts star formation rates in agreement with observations on scales ranging from individual star-forming regions to kiloparsec scales. This agreement is non-trivial, as the model was not tuned in any way, and the predicted star formation rates on all scales are determined by the distribution of the GMC-scale densities and turbulent velocities $\sigma$ in the cold gas within the galaxy, which is shaped by galactic dynamics. The broad agreement with observations of a star formation prescription calibrated in GMC-scale simulations both lends credence to such simulations and promises to put star formation modeling in galaxy formation simulations on a much firmer theoretical footing.
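For reference, the quantities quoted above follow the standard definitions used in the star formation literature (a minimal sketch only; the specific functional form of $\epsilon_{\rm ff}(\rho,\sigma)$ adopted in the model is not reproduced here):
\begin{equation}
\dot{\rho}_\star = \epsilon_{\rm ff}\,\frac{\rho}{t_{\rm ff}},
\qquad
t_{\rm ff} = \sqrt{\frac{3\pi}{32\,G\rho}},
\qquad
t_{\rm dep} \equiv \frac{M_{\rm gas}}{\dot{M}_\star},
\end{equation}
so that for a single gas element with fixed $\epsilon_{\rm ff}$ and $t_{\rm ff}$, $t_{\rm dep} = t_{\rm ff}/\epsilon_{\rm ff}$; for example, $t_{\rm ff}$ of a few Myr and $\epsilon_{\rm ff} \sim 1\%$ correspond to a local depletion time of a few hundred Myr. Note that the galaxy-wide $t_{\rm dep}$ also reflects the mass fraction of gas that is dense and cold enough to form stars, not $\epsilon_{\rm ff}$ alone.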