Numerical Analysis Group Internal Seminar


Past events in this series
29 May 2018
14:00
Chris Farmer
Abstract

This talk will review the main Tikhonov and Bayesian smoothing formulations of inverse problems for dynamical systems with partially observed variables and parameters. The main contenders, namely the strong-constraint, weak-constraint and penalty-function formulations, will be described, and the relationship between these formulations and their associated optimisation problems will be examined. To close, we will indicate techniques for maintaining sparsity and for quantifying uncertainty.
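For orientation, the strong- and weak-constraint formulations mentioned above can be written in a standard data-assimilation form (the notation below is ours, not necessarily the speaker's): x_b is a background state with covariance B, H_k are observation operators with noise covariances R_k, and M_k are model operators with error covariances Q_k.

```latex
% Strong-constraint form: the model x_k = M_k(x_{k-1}) is enforced exactly,
% so the only unknown is the initial state x_0:
J_{\mathrm{strong}}(x_0) = (x_0 - x_b)^{\top} B^{-1} (x_0 - x_b)
  + \sum_{k=1}^{N} \big(y_k - H_k(x_k)\big)^{\top} R_k^{-1} \big(y_k - H_k(x_k)\big),
\qquad x_k = M_k(x_{k-1}).

% Weak-constraint form: the model appears as a soft penalty with error
% covariance Q_k, and all states x_0, \dots, x_N are unknowns:
J_{\mathrm{weak}}(x_0,\dots,x_N) = (x_0 - x_b)^{\top} B^{-1} (x_0 - x_b)
  + \sum_{k=1}^{N} \big(y_k - H_k(x_k)\big)^{\top} R_k^{-1} \big(y_k - H_k(x_k)\big)
  + \sum_{k=1}^{N} \big(x_k - M_k(x_{k-1})\big)^{\top} Q_k^{-1} \big(x_k - M_k(x_{k-1})\big).
```

A penalty-function formulation can be viewed as replacing Q_k^{-1} with a tunable penalty weight rather than a modelled covariance.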

5 June 2018
14:00
Abstract

In this talk, we first address the convergence issues of the standard finite volume element method (FVEM) applied to simple elliptic problems. We then discuss discontinuous finite volume element methods (DFVEM) for elliptic problems, with emphasis on their computational and theoretical advantages over the standard FVEM. Further, we present a natural extension of the DFVEM for elliptic problems to the Stokes problem. We also discuss the suitability of these methods for the approximation of incompressible miscible displacement problems.
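The finite volume idea the abstract builds on can be illustrated in one dimension. The sketch below (our own illustration, not from the talk) solves -u'' = f on (0,1) with homogeneous Dirichlet conditions by balancing the flux -u' across the faces of each cell, with the flux approximated by divided differences.

```python
import numpy as np

def fv_poisson_1d(f, n=50):
    """Cell-centred finite volume scheme for -u'' = f on (0,1), u(0)=u(1)=0.
    Integrating over each cell gives a flux balance across its two faces;
    faces between cells use a centred difference, and the boundary faces use
    the half-cell gap between the first/last cell centre and the boundary."""
    h = 1.0 / n
    x = (np.arange(n) + 0.5) * h              # cell centres
    main = np.full(n, 2.0 / h)
    main[0] = main[-1] = 3.0 / h              # half-cell Dirichlet closure
    A = (np.diag(main)
         + np.diag(np.full(n - 1, -1.0 / h), 1)
         + np.diag(np.full(n - 1, -1.0 / h), -1))
    b = h * f(x)                              # midpoint rule for cell averages
    return x, np.linalg.solve(A, b)

# Manufactured solution u = sin(pi x), so f = pi^2 sin(pi x).
x, u = fv_poisson_1d(lambda x: np.pi**2 * np.sin(np.pi * x))
err = np.max(np.abs(u - np.sin(np.pi * x)))   # small for this smooth problem
```

The FVEM discussed in the talk works on unstructured meshes with a trial space of finite element type; the flux-balance principle per control volume is the same.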
 

12 June 2018
14:30
Adilet Otemissov
Abstract


(Joint work with Coralia Cartis) The problem of finding the most extreme value of a function, known as global optimization, is a challenging task. The difficulty stems from the exponential growth in computational time as the dimension increases linearly, the so-called "curse of dimensionality". In this talk, we demonstrate that these challenges can be overcome for functions with low effective dimensionality: functions which are constant along certain linear subspaces. Such functions arise in applications, for example in hyper-parameter optimization for neural networks, heuristic algorithms for combinatorial optimization problems, and complex engineering simulations.
We propose the use of random subspace embeddings within a(ny) global minimisation algorithm, extending the approach of Wang et al. (2013). We introduce a new framework, called REGO (Random Embeddings for GO), which transforms the high-dimensional optimization problem into a low-dimensional one. In REGO, a low-dimensional problem with bound constraints is formulated in the reduced space and solved with any GO solver. Using random matrix theory, we provide probabilistic bounds for the success of REGO, which show that success depends on the dimension of the embedded subspace and the intrinsic dimension of the function, but not on the ambient dimension. Numerical results demonstrate that high success rates can be achieved with a single embedding and that the rates are largely invariant with respect to the ambient dimension of the problem.
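The random-embedding idea can be sketched in a few lines. The following toy example (our own sketch, not the authors' code) draws one Gaussian embedding matrix A and minimises the reduced objective y -> f(A y) over a low-dimensional box; plain random search stands in for "any GO solver", and the test function and its bounds are assumptions for illustration.

```python
import numpy as np

def rego_minimise(f, ambient_dim, embed_dim, bound=5.0, n_samples=20000, seed=0):
    """One random embedding step: draw a Gaussian matrix A and minimise
    the reduced objective y -> f(A @ y) over the box [-bound, bound]^d.
    Plain random search is used here as a stand-in global solver."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((ambient_dim, embed_dim))
    Y = rng.uniform(-bound, bound, size=(n_samples, embed_dim))
    vals = np.array([f(A @ y) for y in Y])
    best = int(np.argmin(vals))
    return A @ Y[best], vals[best]

# Hypothetical test function with ambient dimension 100 but effective
# dimension 2: it is constant along the remaining 98 coordinates.
def f(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

x_best, f_best = rego_minimise(f, ambient_dim=100, embed_dim=2)
# f_best is typically far below f(0) = 5, despite searching only a 2-D box.
```

Only the two-dimensional reduced problem is ever searched, which is why the cost depends on the effective dimension rather than the ambient one.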
 
