Date: Thu, 23 Jan 2014
Time: 14:00 - 15:00
Location: L5
Speaker: Professor Luis Nunes Vicente
Organisation: University of Coimbra

Direct-search methods are a popular class of derivative-free algorithms, characterized by evaluating the objective function using a step size and a number of (polling) directions. When applied to the minimization of smooth functions, the polling directions are typically taken from positive spanning sets, which must contain at least n+1 vectors in an n-dimensional variable space. In addition, to ensure the global convergence of these algorithms, the positive spanning sets used throughout the iterations must be uniformly non-degenerate, in the sense of having a cosine measure bounded away from zero.
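As a rough illustration only (not code from the talk), the basic polling mechanism can be sketched in a few lines of Python; the positive spanning set D = [I, -I], the simple-decrease acceptance test, and the step-halving factor below are all illustrative assumptions:

```python
import numpy as np

def poll_step(f, x, alpha, D):
    """One poll of directional direct search: evaluate f at x + alpha*d
    for each column d of D; accept the first point that decreases f,
    otherwise shrink the step size."""
    fx = f(x)
    for d in D.T:
        y = x + alpha * d
        if f(y) < fx:          # simple decrease (a sufficient-decrease test is also common)
            return y, alpha    # successful poll: keep (or expand) the step size
    return x, 0.5 * alpha      # unsuccessful poll: halve the step size

n = 5
D = np.hstack([np.eye(n), -np.eye(n)])  # classical positive spanning set [I, -I]:
                                        # 2n vectors, cosine measure 1/sqrt(n) > 0
f = lambda x: 0.5 * np.dot(x, x)        # toy smooth objective
x, alpha = np.ones(n), 1.0
for _ in range(200):
    x, alpha = poll_step(f, x, alpha, D)
```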


However, recent numerical results have indicated that randomly generating the polling directions, without imposing the positive spanning property, can improve the performance of these methods, especially when the number of directions is considerably smaller than n+1.


In this talk, we analyze direct-search algorithms whose polling directions are probabilistic descent, meaning that at least one of them is of descent type with a certain probability. Such a framework enjoys almost-sure global convergence. More interestingly, we will show a global decay rate of $1/\sqrt{k}$ for the gradient norm, with overwhelmingly high probability, matching the corresponding rate for the deterministic versions of the gradient method or of direct search. Our analysis helps to understand the numerical behavior of these methods and the choice of the number of polling directions.
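To give a flavour of the randomized variant (again only an illustrative sketch with hypothetical names, not the exact algorithm analyzed in the talk), one can replace the fixed positive spanning set by a small number m < n+1 of directions drawn uniformly on the unit sphere at each iteration, reusing the poll_step sketch above:

```python
rng = np.random.default_rng(0)

def random_directions(n, m):
    """Draw m polling directions independently and uniformly on the unit
    sphere in R^n (normalized Gaussian vectors). For m < n+1 they cannot
    positively span R^n, but with a certain probability at least one of
    them makes an acute angle with the negative gradient."""
    D = rng.standard_normal((n, m))
    return D / np.linalg.norm(D, axis=0)

# m = 2 random directions per iteration, far fewer than n+1.
x, alpha = np.ones(n), 1.0
for _ in range(200):
    x, alpha = poll_step(f, x, alpha, random_directions(n, 2))
```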


This is joint work with Clément Royer, Serge Gratton, and Zaikun Zhang.
