Quantum uncertainty relations are formulated in terms of the relative entropy between the distribution of measurement outcomes and a suitable maximum-entropy reference distribution. Entropic uncertainty relations of this type apply directly to observables with either discrete or continuous spectra. We find that the sum of such relative entropies is bounded from above in a nontrivial way, a result we illustrate with several examples.
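As a concrete sketch of how such an upper bound can arise (an illustration under our own assumptions, not necessarily the bound derived in the paper), consider two observables with nondegenerate spectra on a $d$-dimensional space, so that the uniform distribution $u_j = 1/d$ is the maximum-entropy reference. The relative entropy then reduces to a deficit from maximal Shannon entropy,
\[
D(p \,\|\, u) \;=\; \sum_{j=1}^{d} p_j \ln \frac{p_j}{1/d} \;=\; \ln d - H(p),
\]
and a Maassen--Uffink-type lower bound $H(p) + H(q) \ge -2 \ln c$, with $c$ the maximal overlap between the two eigenbases, translates into an upper bound on the sum of relative entropies:
\[
D(p \,\|\, u) + D(q \,\|\, u) \;=\; 2 \ln d - H(p) - H(q) \;\le\; 2 \ln (d c),
\]
which improves on the trivial bound $2 \ln d$ whenever $c < 1$.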