Date: Tue, 05 Nov 2019
Time: 12:45 - 14:00
Location: C5
Speaker: Adilet Otemissov (Oxford University)

We consider the problem of global minimization with bound constraints. The problem is known to be intractable for large dimensions due to the exponential increase in the computational time for a linear increase in the dimension (also known as the “curse of dimensionality”). In this talk, we demonstrate that such challenges can be overcome for functions with low effective dimensionality — functions which are constant along certain linear subspaces. Such functions can often be found in applications, for example, in hyper-parameter optimization for neural networks, heuristic algorithms for combinatorial optimization problems and complex engineering simulations.
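A minimal numerical illustration of low effective dimensionality (the function and dimensions below are assumptions for demonstration, not from the talk): a 100-dimensional function that in fact depends only on a 2-dimensional subspace, and is therefore constant along every direction orthogonal to that subspace.

```python
import numpy as np

D = 100
rng = np.random.default_rng(0)
# The function varies only along the rows of this 2 x D matrix,
# so its intrinsic (effective) dimension is 2.
B = rng.standard_normal((2, D))

def f(x):
    z = B @ x  # project onto the 2-dimensional "effective" subspace
    return (z[0] - 1.0) ** 2 + z[1] ** 2

x = rng.standard_normal(D)
v = rng.standard_normal(D)
# Remove from v its component in the effective subspace, so B @ v ~ 0.
v -= B.T @ np.linalg.lstsq(B.T, v, rcond=None)[0]

# f(x + v) agrees with f(x) to machine precision: f is constant
# along directions orthogonal to the effective subspace.
diff = abs(f(x + v) - f(x))
```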

Extending the idea of random subspace embeddings in Wang et al. (2013), we introduce a new framework (called REGO) compatible with any global minimization algorithm. Within REGO, a new low-dimensional problem is formulated with bound constraints in the reduced space. We provide probabilistic bounds for the success of REGO; these results indicate that the success is dependent upon the dimension of the embedded subspace and the intrinsic dimension of the function, but independent of the ambient dimension. Numerical results show that high success rates can be achieved with only one embedding and that rates are independent of the ambient dimension of the problem.
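The reduction described above can be sketched as follows. This is a hypothetical toy version, not the authors' implementation: the test function, bounds, and the crude random-search solver standing in for "any global minimization algorithm" are all assumptions. A random Gaussian matrix maps a low-dimensional box into the ambient space, and the solver only ever works in the reduced space.

```python
import numpy as np

D, d = 1000, 5  # ambient dimension and embedded subspace dimension
rng = np.random.default_rng(1)

# Toy objective with low effective dimensionality: it depends only on
# the first two coordinates (intrinsic dimension 2) of a 1000-dim input.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

# Random Gaussian embedding: reduced point y in R^d maps to A @ y in R^D.
A = rng.standard_normal((D, d))

# Reduced problem: minimize g(y) = f(A y) subject to bound constraints on y.
def g(y):
    return f(A @ y)

# Any global solver could be applied to g; plain random search over the
# reduced box [-5, 5]^d is used here purely as a stand-in.
best_y, best_val = np.zeros(d), g(np.zeros(d))
for _ in range(20000):
    y = rng.uniform(-5.0, 5.0, size=d)
    val = g(y)
    if val < best_val:
        best_y, best_val = y, val
```

Note that the solver's search space has dimension d = 5 regardless of the ambient dimension D, which is the source of the dimension-independence claimed in the results.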

Last updated on 04 Apr 2022 15:24.