Date: Thu, 28 May 2026
Time: 14:00 - 15:00
Location: Lecture Room 3
Speaker: Prof Luis Nunes Vicente
Organisation: Lehigh University

We introduce and analyze new probabilistic strategies for enforcing sufficient decrease conditions in stochastic derivative-free optimization, with the goal of reducing sample complexity and simplifying convergence analysis. First, we develop a new tail bound condition imposed on the estimated reduction in function value, which permits flexible selection of the power q in (1,2] used in the sufficient decrease test. This approach allows us to reduce the number of samples per iteration from the standard O(delta^{-4}) to O(delta^{-2q}), assuming that the noise moment of order q/(q-1) is bounded. Second, we formulate the sufficient decrease condition as a sequential hypothesis testing problem, in which the algorithm adaptively collects samples until the evidence suffices to accept or reject a candidate step. This test provides statistical guarantees on decision errors and can further reduce the required sample size, particularly in the Gaussian noise setting, where it can approach O(delta^{-2-r}) when the decrease is of order delta^r. We incorporate both techniques into stochastic direct-search and trust-region methods for potentially non-smooth, noisy objective functions, and establish their global convergence rates and properties.
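Since the abstract describes the two mechanisms only at a high level, the following is a minimal, hypothetical Python sketch of how they might fit together; it is not the algorithm presented in the talk. It shows a toy stochastic direct search whose acceptance rule compares the estimated reduction against a threshold c * delta^q with q in (1,2], and whose sampling is sequential: noisy evaluations are drawn until a rough normal-approximation confidence band either clears or falls below the threshold. All names and constants (noisy_f, c, q, alpha, budget) are illustrative assumptions, and the confidence band is only a stand-in for the talk's tail bound condition and sequential test.

```python
# A minimal, purely illustrative sketch (NOT the algorithm from the talk):
# a toy stochastic direct search whose acceptance rule is a power-q
# sufficient decrease test, combined with a sequential sampling loop that
# keeps drawing noisy evaluations until a normal-approximation confidence
# band clears (accept) or falls below (reject) the threshold c * delta**q.
# All names and constants here (noisy_f, c, q, alpha, budget) are assumed
# for illustration only.
from statistics import NormalDist

import numpy as np


def noisy_f(x, rng):
    """Hypothetical noisy oracle: a smooth test function plus Gaussian noise."""
    return float(np.sum(x**2)) + rng.normal(scale=0.1)


def sequential_decrease_test(x, x_trial, delta, rng, c=1.0, q=1.5,
                             alpha=0.05, budget=10_000):
    """Sequentially sample f(x) - f(x_trial) until the (1 - alpha) confidence
    band for the mean reduction lies entirely above (accept) or entirely
    below (reject) the sufficient decrease threshold c * delta**q.
    The normal-approximation band is only a stand-in for the talk's test."""
    threshold = c * delta**q
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)  # two-sided normal quantile
    diffs = []
    for n in range(1, budget + 1):
        diffs.append(noisy_f(x, rng) - noisy_f(x_trial, rng))
        if n < 2:
            continue  # need at least two samples for a standard deviation
        mean = float(np.mean(diffs))
        half = z * float(np.std(diffs, ddof=1)) / np.sqrt(n)
        if mean - half > threshold:    # evidence of sufficient decrease
            return True, n
        if mean + half < threshold:    # evidence the step is not good enough
            return False, n
    return float(np.mean(diffs)) > threshold, budget  # budget exhausted


def direct_search(x0, delta0=1.0, iters=50, seed=0):
    """Toy direct search: poll +/- coordinate directions at step size delta,
    halve delta after an unsuccessful iteration (all polls rejected)."""
    rng = np.random.default_rng(seed)
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(iters):
        accepted = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * delta
                ok, _ = sequential_decrease_test(x, trial, delta, rng)
                if ok:
                    x, accepted = trial, True
                    break
            if accepted:
                break
        if not accepted:
            delta *= 0.5  # unsuccessful iteration: shrink the step size
    return x, delta


if __name__ == "__main__":
    x_final, delta_final = direct_search([2.0, -1.5])
    print("final point:", x_final, "final delta:", delta_final)
```

Note that the abstract's first result is consistent with this picture: taking q = 2 recovers the standard O(delta^{-4}) per-iteration sample count, while q closer to 1 lowers it toward O(delta^{-2}), at the price of requiring a bounded noise moment of order q/(q-1), which becomes more demanding as q decreases.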

This is joint work with Anjie Ding, Francesco Rinaldi, and Damiano Zeffiro.
