A theory of the fluctuation-induced Nernst effect is developed for arbitrary magnetic fields and temperatures beyond the upper critical field line in a two-dimensional superconductor. First, we derive a simple phenomenological formula for the Nernst coefficient, which naturally explains the giant Nernst signal due to fluctuating Cooper pairs. The latter is shown to be large even far from the transition and may exceed the Fermi-liquid terms by orders of magnitude. We also present a complete microscopic calculation of the Nernst coefficient (which includes quantum fluctuations) and give its asymptotic dependences in various regions of the phase diagram. It is argued that the magnitude and the behavior of the Nernst signal observed experimentally in disordered superconducting films can be well understood on the basis of superconducting fluctuation theory.
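The phenomenological formula itself is not quoted in the abstract. For orientation only, one may recall the standard weak-field Gaussian-fluctuation result for the transverse Peltier coefficient of a two-dimensional superconductor just above $T_c$ (the Ussishkin-Sondhi-Huse expression, supplied here as an assumption and not necessarily the formula derived in this work):
\[
% assumption: standard Gaussian-fluctuation (Ussishkin-Sondhi-Huse) weak-field result, not taken from this abstract
\alpha_{xy}^{\mathrm{fl}} \simeq \frac{k_B e^2}{6\pi\hbar^2}\,\xi^2(T)\,B,
\qquad
\nu = \frac{\alpha_{xy}^{\mathrm{fl}}}{\sigma B} \propto \xi^2(T) \propto \frac{1}{T-T_c},
\]
where $\xi(T)$ is the Ginzburg-Landau coherence length and $\sigma$ the conductivity (small Hall angle assumed). The $1/(T-T_c)$ growth of $\nu$ on approaching the transition illustrates the sense in which the fluctuation contribution can exceed the Fermi-liquid terms by orders of magnitude.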