Constraining the delay-time distribution (DTD) of different supernova (SN) types can shed light on the timescales of galaxy chemical enrichment, on feedback processes affecting galaxy dynamics, and on SN progenitor properties. Here, we present an approach to recover SN DTDs based on integral field spectroscopy (IFS) of their host galaxies. Using a statistical analysis of a sample of 116 supernovae in 102 galaxies, we evaluate different DTD models for SN types Ia (73), II (28), and Ib/c (15). We find the best SN Ia DTD fit to be a power law with an exponent $\alpha = -1.1 \pm 0.3$ (50% confidence interval) and a time delay (between star formation and the first SNe) $\Delta = 50^{+100}_{-35}~\mathrm{Myr}$ (50% C.I.). For core-collapse (CC) SNe, both the single- and binary-stellar-evolution DTD models of Zapartas et al. (2017) are consistent with our results. For SNe II and Ib/c, we find agreement with a Gaussian DTD model with $\sigma = 82^{+129}_{-23}~\mathrm{Myr}$ and $\sigma = 56^{+141}_{-9}~\mathrm{Myr}$ (50% C.I.), respectively. This analysis demonstrates that integral field spectroscopy opens a new way of studying SN DTD models in the local universe.
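
As a minimal illustration (not the paper's fitting code), the quoted SN Ia best fit can be sketched as a power-law DTD with an initial delay, $\Psi(t) \propto t^{\alpha}$ for $t > \Delta$ and zero before. The functional form and the lack of normalization here are illustrative assumptions; only the parameter values come from the abstract.

```python
import numpy as np

# Best-fit medians quoted in the abstract (50% credible intervals omitted).
ALPHA = -1.1       # power-law exponent of the SN Ia DTD
DELTA_MYR = 50.0   # delay between star formation and the first SNe Ia

def dtd_ia(t_myr, alpha=ALPHA, delta=DELTA_MYR):
    """Un-normalized SN Ia DTD: t^alpha after the initial delay, else 0.

    t_myr: array of times since star formation, in Myr.
    """
    t = np.asarray(t_myr, dtype=float)
    out = np.zeros_like(t)
    mask = t > delta            # no SNe Ia before the delay time
    out[mask] = t[mask] ** alpha
    return out
```

A normalization constant (e.g. SNe per unit stellar mass formed) would be fixed by the observed SN rates, which this sketch does not attempt.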