It has been known for nearly three decades that the energy spectra of thermonuclear X-ray bursts are often well-fit by Planck functions with temperatures so high that they imply a super-Eddington radiative flux at the emitting surface, even during portions of bursts when there is no evidence of photospheric radius expansion. This apparent inconsistency is usually set aside by assuming that the flux is actually sub-Eddington and that the fitted temperature is high because the spectrum has been distorted by the energy-dependent opacity of the atmosphere. Here we show that the spectra predicted by currently available conventional atmosphere models appear incompatible with the highest-precision measurements of burst spectra made using the Rossi X-ray Timing Explorer, such as those made during the 4U 1820-30 superburst and a long burst from GX 17+2. In contrast, these measurements are well-fit by Bose-Einstein spectra with high temperatures and modest chemical potentials. Such spectra are very similar to Planck spectra, yet they imply surface radiative fluxes more than three times the Eddington flux. We find that segments of many other bursts from a variety of sources are well-fit by similar Bose-Einstein spectra, suggesting that the radiative flux at the emitting surface also exceeds the Eddington flux during these segments. We suggest that burst spectra can closely approximate Bose-Einstein spectra and have fluxes that exceed the Eddington flux because they are formed by Comptonization in an extended, low-density radiating gas supported by the outward radiation force and confined by a tangled magnetic field.
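
For concreteness, a minimal sketch of the spectral forms involved, assuming the common convention in which a Bose-Einstein spectrum is parameterized by a temperature $T$ and a dimensionless chemical potential $\mu \ge 0$ (the paper's exact parameterization may differ), and quoting the Newtonian Eddington flux without general-relativistic corrections:
\[
I_E \;\propto\; \frac{E^{3}}{\exp(E/kT + \mu) - 1}\,, \qquad
F_{\rm Edd} \;=\; \frac{G M c}{\kappa R^{2}}\,,
\]
where $E$ is the photon energy and $M$, $R$, and $\kappa$ are the stellar mass, stellar radius, and atmospheric opacity. Setting $\mu = 0$ recovers the Planck spectrum, which is why a Bose-Einstein spectrum with modest $\mu$ remains close in shape to a blackbody; a surface flux above $F_{\rm Edd}$ means the outward radiation force on the scattering gas exceeds gravity, consistent with the radiation-supported, magnetically confined atmosphere invoked above.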