The ringdown is the late-time part of the post-merger signal emitted during the coalescence of two black holes and consists of a superposition of quasi-normal modes. Within general relativity, the no-hair theorems imply that the frequencies and damping times of these modes are entirely determined by the mass and angular momentum of the final Kerr black hole. A detection of multiple ringdown modes would therefore potentially allow us to test the no-hair theorem with observational data. Whether subdominant ringdown modes can be detected depends primarily on the overall signal-to-noise ratio of the ringdown signal and on the amplitude of the subdominant mode relative to the dominant mode. In this paper, we use Bayesian inference to determine the detectability of a subdominant mode in a set of simulated analytical ringdown signals. Focusing on the design sensitivity of the Advanced LIGO detectors, we systematically vary the signal-to-noise ratio of the ringdown signal and the mode amplitude ratio in order to determine which signals are promising for performing black hole spectroscopy.
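For concreteness, the superposition of quasi-normal modes referred to above can be written in the standard form below; the symbols $A_{\ell m n}$, $f_{\ell m n}$, $\tau_{\ell m n}$, and $\phi_{\ell m n}$ are conventional notation assumed here rather than taken from this abstract.
\begin{equation}
h(t) \;=\; \sum_{\ell m n} A_{\ell m n}\, e^{-t/\tau_{\ell m n}} \cos\!\left(2\pi f_{\ell m n}\, t + \phi_{\ell m n}\right), \qquad t \ge 0,
\end{equation}
where, within general relativity, each mode frequency $f_{\ell m n}$ and damping time $\tau_{\ell m n}$ is a function of the final black hole's mass and spin only, while the amplitudes and phases depend on the details of the progenitor binary.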