A tight relation between the [CII]158$\mu$m line luminosity and the star formation rate is measured in local galaxies. At high redshift ($z>5$), though, a much larger scatter is observed, with a considerable (15--20\%) fraction of the outliers being [CII]-deficient. Moreover, the [CII] surface brightness ($\Sigma_{\rm CII}$) of these sources is systematically lower than expected from the local relation. To clarify the origin of this [CII] deficiency we have developed an analytical model that fits the local [CII] data and has been validated against radiative transfer simulations performed with CLOUDY. The model predicts an overall increase of $\Sigma_{\rm CII}$ with the star formation rate surface density ($\Sigma_*$). However, for $\Sigma_* > 1\,M_\odot~{\rm yr}^{-1}~{\rm kpc}^{-2}$, $\Sigma_{\rm CII}$ saturates. We conclude that underluminous [CII] systems can result from a combination of three factors: (a) large upward deviations from the Kennicutt--Schmidt relation, quantified by the burstiness parameter $\kappa_s \gg 1$; (b) low metallicity; (c) low gas density, at least for the most extreme sources (e.g. CR7). Observations of [CII] emission alone cannot break the degeneracy among the above three parameters; this requires additional information from other emission lines (e.g. [OIII]88$\mu$m, CIII]1909\AA, CO lines). Simple formulae are given to interpret the available data for low- and high-$z$ galaxies.
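As a purely illustrative sketch (not the fitting formulae derived in the paper), the qualitative behaviour summarized above, i.e. a rise of $\Sigma_{\rm CII}$ with $\Sigma_*$ followed by saturation above $\Sigma_* \simeq 1\,M_\odot~{\rm yr}^{-1}~{\rm kpc}^{-2}$, can be written as a toy parameterization,
\begin{equation}
\Sigma_{\rm CII} \simeq \Sigma_{\rm CII}^{\rm sat}\left[1 - e^{-\Sigma_*/\Sigma_*^{\rm crit}}\right],
\qquad \Sigma_*^{\rm crit} \simeq 1\,M_\odot~{\rm yr}^{-1}~{\rm kpc}^{-2},
\end{equation}
where $\Sigma_{\rm CII}^{\rm sat}$ and $\Sigma_*^{\rm crit}$ are assumed placeholder parameters; in the full model the saturation level depends on the burstiness $\kappa_s$, the metallicity, and the gas density, as discussed in the following sections.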