Date: Tue, 28 Jan 2020, 14:30
Location: L5
Speaker: Adilet Otemissov
Organisation: Oxford

We show that the scalability challenges of global optimisation algorithms can be overcome for functions with low effective dimensionality, i.e. functions that are constant along certain linear subspaces. Such functions often arise in applications, for example in hyper-parameter optimisation for neural networks, heuristic algorithms for combinatorial optimisation problems, and complex engineering simulations. We propose the use of random subspace embeddings within any global minimisation algorithm, extending the approach of Wang et al. (2016). Using tools from random matrix theory and conic integral geometry, we investigate the efficacy and convergence of our random subspace embedding approach, in both static and adaptive formulations. We illustrate our algorithmic proposals and theoretical findings numerically, using state-of-the-art global solvers. This is joint work with Coralia Cartis.
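To make the embedding idea concrete, here is a minimal Python sketch of the generic scheme described in the abstract, not the authors' implementation: the test function f, the dimensions, the Gaussian embedding matrix A, the box clipping, and the choice of SciPy's differential evolution as the underlying global solver are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical test problem: a D-dimensional function that is
# constant along all but its first two coordinates, i.e. it has
# effective dimensionality 2.
D, d = 100, 2  # ambient dimension and embedding dimension

def f(x):
    return (x[0] - 0.3) ** 2 + (x[1] + 0.6) ** 2

# Random Gaussian subspace embedding: search over y in R^d,
# evaluating f at x = A @ y, clipped back into the box [-1, 1]^D
# (a common convention, following Wang et al. (2016)).
rng = np.random.default_rng(0)
A = rng.standard_normal((D, d))

def g(y):
    x = np.clip(A @ y, -1.0, 1.0)
    return f(x)

# Any global solver can now be run on the d-dimensional reduced
# problem; SciPy's differential evolution serves as an example.
res = differential_evolution(g, bounds=[(-5.0, 5.0)] * d, seed=0)
print("reduced minimiser y*:", res.x, " objective value:", res.fun)
```

Because the reduced problem lives in d rather than D dimensions, the global solver's cost scales with the effective dimension; the static formulation above fixes one embedding, while an adaptive variant would draw fresh embeddings across iterations.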
 
