The Gamma-Ray Burst (GRB) 180720B is one of the brightest events detected by the Fermi satellite and the first GRB detected by the H.E.S.S. telescope above 100 GeV. We analyse the Fermi (GBM and LAT) and Swift (XRT and BAT) data and describe the evolution of the burst spectral energy distribution in the 0.5 keV--10 GeV energy range over the first 500 seconds of emission. We reveal a smooth transition from the prompt phase, dominated by synchrotron emission in a moderately fast cooling regime, to the afterglow phase, whose emission has been observed from the radio band up to the GeV energy range. The LAT (0.1--100 GeV) light curve initially rises ($F_{\rm LAT}\propto t^{2.4}$), peaks at $\sim 78$ s, and falls steeply ($F_{\rm LAT}\propto t^{-2.2}$) afterwards. The peak, which we interpret as the onset of the fireball deceleration, allows us to estimate the bulk Lorentz factor $\Gamma_{0}\sim 150\,(300)$ under the assumption of a wind-like (homogeneous) circum-burst medium density. We derive a flux upper limit in the LAT energy range at the time of the H.E.S.S. detection, but this does not allow us to unveil the nature of the high-energy component observed by H.E.S.S. We fit the prompt spectrum with a physical model of synchrotron emission from a non-thermal population of electrons. The 0--35 s spectrum, above its $E F(E)$ peak (at 1--2 MeV), is a steep power law extending up to hundreds of MeV. We derive a steep slope of the injected electron energy distribution, $N(\gamma)\propto \gamma^{-5}$. Our fit parameters point towards a very low magnetic field ($B\sim 1$ G) in the emission region.
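
A minimal sketch of the kind of relation underlying such an estimate (standard thin-shell afterglow-onset treatment, with symbols and coefficients assumed here rather than taken from the text above): for a homogeneous circum-burst medium of constant number density $n$, the bulk Lorentz factor at the light-curve peak time $t_{\rm peak}$ is commonly written, up to convention-dependent factors of order unity, as
\begin{equation}
\Gamma(t_{\rm peak}) \simeq \left[\frac{3\,E_{\rm k,iso}\,(1+z)^{3}}{32\pi\, n\, m_{\rm p}\, c^{5}\, t_{\rm peak}^{3}}\right]^{1/8},
\qquad \Gamma_{0}\approx 2\,\Gamma(t_{\rm peak}),
\end{equation}
where $E_{\rm k,iso}$ is the isotropic-equivalent kinetic energy, $m_{\rm p}$ the proton mass, and $z$ the redshift; an analogous relation, with a $1/4$ power, holds for a wind-like density profile $n(r)\propto r^{-2}$.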