June 14, 2018, 11:00–12:15
Room MC 204
We propose a new analysis of Piyavskii's algorithm for deterministic or stochastic Lipschitz global optimization. More precisely, we address the problem of finding a global maximizer of a (locally) Lipschitz function f on [0,1]^d with known Lipschitz constant, using as few (possibly noisy) evaluations of f as possible. We first derive new regret bounds for the classical Piyavskii's algorithm in the deterministic setting; they match the best bounds known so far for other algorithms in terms of the near-optimality dimension of f. Second, we extend Piyavskii's algorithm to the stochastic (noisy) setting using mini-batch sampling. For this new algorithm we prove high-probability regret bounds showing that it is minimax optimal (up to logarithmic factors) for a given near-optimality dimension.
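For intuition, here is a minimal sketch of the classical deterministic Piyavskii's algorithm in dimension d = 1: at each step it evaluates f at the maximizer of the piecewise-linear upper envelope implied by past evaluations and the Lipschitz constant. The function names and parameters below are illustrative, not from the talk.

```python
import math

def piyavskii(f, lip, n_evals, a=0.0, b=1.0):
    """Maximize a Lipschitz function f on [a, b] (Piyavskii's algorithm, 1D sketch).

    lip: known Lipschitz constant of f; n_evals: total evaluation budget.
    Returns the best evaluated point (x, f(x)).
    """
    pts = [(a, f(a)), (b, f(b))]  # evaluated points, kept sorted by x
    for _ in range(n_evals - 2):
        # Between adjacent points (xi, yi), (xj, yj), the upper envelope
        # min of the two cones y <= yi + lip|x - xi|, y <= yj + lip|x - xj|
        # peaks where the cones intersect; pick the interval with highest peak.
        best_val, best_x, best_idx = -math.inf, None, None
        for i in range(len(pts) - 1):
            (xi, yi), (xj, yj) = pts[i], pts[i + 1]
            x_peak = 0.5 * (xi + xj) + (yj - yi) / (2.0 * lip)
            u_peak = 0.5 * (yi + yj) + 0.5 * lip * (xj - xi)
            if u_peak > best_val:
                best_val, best_x, best_idx = u_peak, x_peak, i
        pts.insert(best_idx + 1, (best_x, f(best_x)))  # evaluate at the peak
    return max(pts, key=lambda p: p[1])
```

The stochastic variant discussed in the talk would replace each evaluation f(x) with a mini-batch average of noisy samples; that extension is not shown here.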