Since the discovery of superluminous supernovae (SLSNe) in the last decade, it has been known that these events exhibit bluer spectral energy distributions than other supernova subtypes, with significant output in the ultraviolet. However, the event Gaia16apd seems to outshine even the other SLSNe at rest-frame wavelengths below $\sim 3000$ \AA. Yan et al. (2016) have recently presented HST UV spectra and attributed the UV flux to low metallicity and hence reduced line blanketing. Here we present UV and optical light curves over a longer baseline in time, revealing a rapid decline at UV wavelengths despite a typical optical evolution. Combining the published UV spectra with our own optical data, we demonstrate that Gaia16apd has a much hotter continuum than virtually any SLSN at maximum light, but it cools rapidly thereafter and is indistinguishable from the others by $\sim 10$--15 days after peak. Comparing the equivalent widths of UV absorption lines with those of other events, we show that the excess UV continuum results from a more powerful central power source, rather than from a lack of UV absorption relative to other SLSNe or an additional component from interaction with the surrounding medium. These findings strongly support the central-engine hypothesis for hydrogen-poor SLSNe. An explosion ejecting $M_{\rm ej} = 4\,(0.2/\kappa)$ M$_\odot$, where $\kappa$ is the opacity in cm$^2$\,g$^{-1}$, and forming a magnetar with spin period $P = 2$ ms and magnetic field $B = 2\times10^{14}$ G (lower than in other SLSNe with comparable rise times) can consistently explain the light-curve evolution and the high temperature at peak. The host metallicity, $Z = 0.18$ Z$_\odot$, is comparable to that of other SLSN hosts.
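As a rough consistency check (a sketch, not taken from the paper itself: standard magnetar spin-down and photon-diffusion estimates are assumed, with a neutron-star moment of inertia $I \approx 10^{45}$ g cm$^2$, radius $R \approx 10^6$ cm, and ejecta velocity $v_{\rm ej} \approx 10^9$ cm s$^{-1}$, none of which are quoted in the abstract), the stated parameters imply a rotational energy reservoir of
$$
E_{\rm rot} = \frac{1}{2} I \Omega^2 = \frac{2\pi^2 I}{P^2} \approx 2\times10^{52} \left(\frac{P}{1~{\rm ms}}\right)^{-2}~{\rm erg} \approx 5\times10^{51}~{\rm erg} \qquad (P = 2~{\rm ms}),
$$
released on the vacuum-dipole spin-down timescale
$$
t_{\rm sd} = \frac{E_{\rm rot}}{|\dot{E}|_{0}} = \frac{3 I c^3}{B^2 R^6 \Omega^2} \propto \frac{P^2}{B^2} \approx 2~{\rm days} \qquad (B = 2\times10^{14}~{\rm G}),
$$
while the photon-diffusion time through the ejecta is
$$
t_{\rm diff} \simeq \left(\frac{3\,\kappa M_{\rm ej}}{4\pi v_{\rm ej} c}\right)^{1/2} \approx 40~{\rm days} \qquad (\kappa = 0.2~{\rm cm^2\,g^{-1}},\ M_{\rm ej} = 4~{\rm M}_\odot).
$$
Because the light-curve width constrains only the product $\kappa M_{\rm ej}$ through $t_{\rm diff}$, the ejecta mass is naturally quoted in the degenerate form $M_{\rm ej} = 4\,(0.2/\kappa)$ M$_\odot$. The spin-down prefactor depends on the assumed braking model and magnetic inclination, so only the orders of magnitude should be read from these numbers.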