Regret Analysis of Global Optimization in Univariate Functions with Lipschitz Derivatives


Abstract

In this work, we study the problem of global optimization of univariate loss functions, where we analyze the regret of popular lower bounding algorithms (e.g., the Piyavskii-Shubert algorithm). For any given time $T$, instead of the widely studied simple regret (the difference between the loss of the best estimate up to time $T$ and the loss of the global optimizer), we study the cumulative regret up to that time. With a suitable lower bounding algorithm, we show that it is possible to achieve satisfactory cumulative regret bounds for different classes of functions. For Lipschitz continuous functions with parameter $L$, we show that the cumulative regret is $O(L\log T)$. For Lipschitz smooth functions with parameter $H$, we show that the cumulative regret is $O(H)$. We also analytically extend our results to a broader class of functions that covers both the Lipschitz continuous and smooth functions individually.
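For concreteness, with queried points $x_1,\dots,x_T$ and global minimizer $x^*$ of the loss $f$, the simple regret is $\min_{t \le T} f(x_t) - f(x^*)$, while the cumulative regret studied here is $\sum_{t=1}^{T} \big(f(x_t) - f(x^*)\big)$. The following is a minimal sketch of the classical Piyavskii-Shubert lower bounding scheme for minimizing a function with known Lipschitz constant $L$ (not necessarily the exact variant analyzed in the paper); the function name `piyavskii_shubert` and the test function are illustrative only.

```python
import math

def piyavskii_shubert(f, a, b, L, T):
    """Sketch of the Piyavskii-Shubert algorithm: minimize an
    L-Lipschitz function f on [a, b] using T queries. Returns the
    list of (query point, loss) pairs."""
    # Initialize with the two endpoints of the interval.
    pts = [(a, f(a)), (b, f(b))]
    while len(pts) < T:
        pts.sort(key=lambda p: p[0])
        # The lower envelope max_i { f(x_i) - L|x - x_i| } attains its
        # minimum inside each interval where the two downward cones of
        # neighboring points intersect.
        best_val, best_x = math.inf, None
        for (x0, f0), (x1, f1) in zip(pts, pts[1:]):
            x_cand = 0.5 * (x0 + x1) + (f0 - f1) / (2.0 * L)
            v_cand = 0.5 * (f0 + f1) - 0.5 * L * (x1 - x0)
            if v_cand < best_val:
                best_val, best_x = v_cand, x_cand
        # Query the global minimizer of the current lower bound.
        pts.append((best_x, f(best_x)))
    return pts

# Example: f(x) = |x - 0.3| is 1-Lipschitz with minimum 0 at x* = 0.3.
f = lambda x: abs(x - 0.3)
queries = piyavskii_shubert(f, 0.0, 1.0, L=1.0, T=50)
cumulative_regret = sum(fx - 0.0 for _, fx in queries)  # sum_t f(x_t) - f(x*)
simple_regret = min(fx for _, fx in queries) - 0.0      # min_t f(x_t) - f(x*)
```

Each new query is placed at the minimizer of the current piecewise-linear lower bound, so the lower bound tightens precisely where the function could still be small; the regret bounds above quantify how quickly the queried losses accumulate relative to $f(x^*)$.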
