We present a novel algorithm aimed at identifying peaks within a uniformly sampled time series affected by uncorrelated Gaussian noise. The algorithm, called MEPSA (multiple excess peak search algorithm), essentially scans the time series at different timescales by comparing a given peak candidate with a variable number of adjacent bins. Although it was originally conceived for the analysis of gamma-ray burst (GRB) light curves, its usage can readily be extended to other astrophysical transient phenomena whose activity is recorded through different surveys. We tested and validated it on simulated featureless profiles as well as on simulated GRB time profiles. We showcase the algorithm's potential by comparing it with the popular algorithm of Li and Fenimore, which is frequently adopted in the literature. Thanks to its high flexibility, the mask of excess patterns used by MEPSA can be tailored and optimised for the kind of data to be analysed without modifying the code. The C code is made publicly available.
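The core comparison can be illustrated with a minimal C sketch. This is not the released MEPSA implementation: the actual code encodes many such tests as rows of a user-configurable excess-pattern mask and repeats them after rebinning the series at progressively longer timescales. The parameters `n_adj` (number of adjacent bins per side) and `thresh` (threshold in units of the noise rms) are hypothetical stand-ins for a single excess pattern.

```c
#include <stdio.h>

/*
 * Sketch only (not the MEPSA interface): flag bin i as a peak candidate
 * if it exceeds each of n_adj adjacent bins on both sides by more than
 * thresh standard deviations of the assumed uncorrelated Gaussian noise.
 */
static int is_peak_candidate(const double *y, int n, int i,
                             int n_adj, double sigma, double thresh)
{
    if (i - n_adj < 0 || i + n_adj >= n)
        return 0;                       /* too close to the series edges */
    for (int k = 1; k <= n_adj; ++k) {
        if (y[i] - y[i - k] <= thresh * sigma) return 0;
        if (y[i] - y[i + k] <= thresh * sigma) return 0;
    }
    return 1;
}

int main(void)
{
    /* Toy light curve: flat noisy background with one spike at bin 5. */
    double y[] = {0.1, -0.2, 0.0, 0.3, 0.1, 5.0, 0.2, -0.1, 0.0, 0.1};
    int n = (int)(sizeof y / sizeof y[0]);
    double sigma = 1.0;                 /* noise rms, assumed known */

    for (int i = 0; i < n; ++i)
        if (is_peak_candidate(y, n, i, 2, sigma, 3.0))
            printf("peak candidate at bin %d\n", i);
    return 0;
}
```

In this toy example only bin 5 passes the test. Scanning at different timescales would amount to rebinning `y` and reapplying tests of this form with pattern-specific thresholds, which is what the configurable mask makes possible without modifying the code.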