Multiple scales analysis of a conductive-radiative thermal transfer model
Abstract
Multiple scales analysis is a powerful asymptotic technique for problems where the solution depends on two scales of widely different sizes. Standard multiple scales involves the introduction of a macroscale and a microscale, which are assumed to be independent. A common (and usually acceptable) assumption is that, when considering behaviour on the microscale, the macroscale variable can be taken as constant; however, there are instances where this assumption is not valid. In this talk, I will explain one such situation, namely conductive-radiative thermal transfer within a solid matrix with spherical perforations, and discuss the appropriate measures when converting the radiative boundary condition into multiple-scales form.
11:00
Hilbert's Fifth Problem
Abstract
Hilbert's fifth problem asks, informally, what the difference is between Lie groups and topological groups. In the 1950s the problem was solved by Andrew Gleason, Deane Montgomery, Leo Zippin and Hidehiko Yamabe, who concluded that every locally compact topological group is "essentially" a Lie group. In this talk we will present the complete proof of this theorem.
11:30
The (non-uniform) Hrushovski-Lang-Weil estimates
Abstract
In 1996, using techniques from model theory and intersection theory, Hrushovski obtained a generalisation of the Lang-Weil estimates. The estimates have subsequently found applications in group theory, algebraic dynamics and algebraic geometry. We shall discuss a geometric proof of the non-uniform version of these estimates.
Optimisation of 1D Piecewise Smooth Functions
Abstract
Optimisation in 1D is far simpler than multidimensional optimisation, and this is largely due to the notion of a bracket. A bracket is a trio of points such that the middle point is the one with the smallest objective function value of the three. The existence of a bracket is sufficient to guarantee that a continuous function has a local minimum within the bracket. The most stable 1D optimisation methods, such as Golden Section or Brent's method, make use of this fact. The idea behind these methods is to maintain a bracket at all times, repeatedly finding smaller brackets until the local minimum can be guaranteed to lie within a sufficiently small range. For smooth functions, Brent's method in particular converges quickly with a minimum of function evaluations. However, when applied to a piecewise smooth function, it achieves its realistic worst-case convergence rate. In this presentation, I will present a new method which uses ideas from Brent's method and Golden Section, while being designed to converge quickly for piecewise smooth functions.
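To make the bracketing idea concrete, here is a minimal sketch of golden-section search, one of the bracket-maintaining methods mentioned in the abstract. It is not the speaker's new method; the function names and the piecewise smooth test function are illustrative. The interval [a, b] plays the role of the bracket, and each iteration shrinks it by the golden ratio while the minimum stays inside.

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimise a unimodal function f on [a, b] by golden-section search.

    The bracket [a, b] always contains the minimiser; each iteration
    shrinks it by a factor of 1/phi ~ 0.618 using one new evaluation.
    """
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi
    c = b - invphi * (b - a)         # two interior trial points
    d = a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:
            # minimum lies in [a, d]; reuse c as the new right trial point
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:
            # minimum lies in [c, b]; reuse d as the new left trial point
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return (a + b) / 2

# A piecewise smooth function with a kink at its minimiser x = 1:
# a derivative-based method can struggle here, but the bracket shrinks regardless.
x_min = golden_section_search(lambda x: abs(x - 1.0) + 0.1 * (x - 1.0) ** 2,
                              -3.0, 4.0)
```

Note that only function values, never derivatives, are used, which is why bracketing methods remain robust at the non-smooth points where Brent's method degrades to its worst-case rate.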