We show that the sparse polynomial interpolation problem reduces to a discrete super-resolution problem on the $n$-dimensional torus. Therefore, the semidefinite programming approach initiated by Cand\`es and Fernandez-Granda \cite{candes_towards_2014} in the univariate case can be applied. We extend their result to the multivariate case, i.e., we show that exact recovery is guaranteed provided that a geometric spacing condition on the supports holds and the number of evaluations is sufficiently large (but not too large). It also turns out that the sparse recovery LP formulation of $\ell_1$-norm minimization is guaranteed to provide exact recovery {\it provided that} the evaluations are made in a certain manner, even though the Restricted Isometry Property for exact recovery is not satisfied. (A naive sparse recovery LP approach does not offer such a guarantee.) Finally, we also describe the algebraic Prony method for sparse interpolation, which also recovers the exact decomposition, but from fewer point evaluations and with no geometric spacing condition. We provide two sets of numerical experiments: one in which the super-resolution technique and Prony's method seem to cope equally well with noise, and another in which the super-resolution technique seems to cope with noise better than Prony's method, at the cost of an extra computational burden (i.e., solving a semidefinite program).