Adaptive optics (AO) is critical in astronomy, optical communications and remote sensing to deal with the rapid blurring caused by the Earth's turbulent atmosphere. But current AO systems are limited by their wavefront sensors, which must lie in an optical plane non-common to the science image and are insensitive to certain wavefront-error modes. Here we present a wavefront sensor based on a photonic lantern fibre-mode-converter and deep learning, which can be placed at the same focal plane as the science image, and is optimal for single-mode fibre injection. By measuring the intensities of an array of single-mode outputs, both the phase and the amplitude of the incident wavefront can be reconstructed. We demonstrate the concept with simulations and an experimental realisation in which Zernike wavefront errors are recovered from focal-plane measurements to a precision of $5.1\times10^{-3}\;\pi$ radians root-mean-squared error.