Past Numerical Analysis Group Internal Seminar

23 May 2017
14:30
Problem Solving Squad (Roberts, Wechsung, Roy et al.)
Abstract

Each year Prof. Trefethen gives the Problem Solving Squad a sequence of problems with no hints, one per week, where the solution of each problem is a single real number to be computed by any method available. We will present this year's three problems, involving (1) an S-shaped bifurcation curve, (2) the shortest path around a random web, and (3) switching a time-varying system to maximize a matrix norm.


The 14 students this year are Simon Vary plus InFoMM cohort 2: Matteo Croci, Davin Lunz, Michael McPhail, Tori Pereira, Lindon Roberts, Caoimhe Rooney, Ian Roper, Thomas Roy, Tino Sulzer, Bogdan Toader, Florian Wechsung, Jess Williams, and Fabian Ying.  The presentations will be by (1) Lindon Roberts, (2) Florian Wechsung, and (3) Thomas Roy.

  • Numerical Analysis Group Internal Seminar
23 May 2017
14:00
Andrew Thompson
Abstract

Delsarte-Goethals frames are a popular choice for deterministic measurement matrices in compressive sensing. I will show that it is possible to construct extremely sparse matrices which share precisely the same row space as Delsarte-Goethals frames. I will also describe the combinatorial block design underlying the construction and make a connection to Steiner equiangular tight frames.

  • Numerical Analysis Group Internal Seminar
16 May 2017
14:00
Nick Trefethen
Abstract

What's the continuous analog of randn?  In recent months I've been exploring such questions with Abdul-Lateef Haji-Ali and other members of the Chebfun team, and Chebfun now has commands randnfun, randnfun2, randnfunsphere, and randnfundisk.  These are based on finite Fourier series with random coefficients, and interesting questions arise in the "white noise" limit as the length of the series approaches infinity and as random ODEs become stochastic DEs.  This work is at an early stage and we are grateful for input from stochastic experts at Oxford and elsewhere.
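The basic idea can be sketched in a few lines of NumPy. This is an illustrative sketch only: the 1/sqrt(2m+1) normalisation and the function name randnfun_sketch are my own assumptions, not Chebfun's exact conventions.

```python
import numpy as np

def randnfun_sketch(m, x, seed=None):
    """Sample a smooth random function on [0, 1]: a finite Fourier
    series of degree m with independent N(0,1) coefficients.  The
    1/sqrt(2m+1) scaling (an assumption, not Chebfun's normalisation)
    keeps the pointwise variance roughly independent of m."""
    rng = np.random.default_rng(seed)
    k = np.arange(m + 1)
    a = rng.standard_normal(m + 1)        # cosine coefficients
    b = rng.standard_normal(m + 1)        # sine coefficients
    phases = 2 * np.pi * np.outer(k, x)   # shape (m+1, len(x))
    return (a @ np.cos(phases) + b @ np.sin(phases)) / np.sqrt(2 * m + 1)

x = np.linspace(0, 1, 400)
f = randnfun_sketch(40, x, seed=1)  # larger m gives rougher, "whiter" samples
```

Increasing m shrinks the shortest wavelength present, which is exactly the limit in which the white-noise questions mentioned above arise.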

  • Numerical Analysis Group Internal Seminar
9 May 2017
14:30
Abstract

We analyse the numerical approximation of functions using radial basis functions in the context of frames. Frames generalize the notion of a basis by allowing redundancy, while being restricted by a so-called frame condition. The theory of numerical frame approximations allows the study of the ill-conditioning inherent in this redundancy, and suggests discretization techniques that still offer numerical stability to machine precision. We apply the theory to radial basis functions.
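A quick numerical illustration of the theme (my sketch, not the talk's method): a redundant Gaussian RBF dictionary yields a severely ill-conditioned least-squares matrix, yet a truncated-SVD solve, which is one standard stable discretization for frame approximations, still fits accurately.

```python
import numpy as np

def rbf_fit(x, y, centers, eps, tol=1e-12):
    """Least-squares fit with a Gaussian RBF dictionary, solved via a
    truncated SVD: singular values below tol * s[0] are discarded,
    taming the ill-conditioning caused by the dictionary's redundancy.
    (Illustrative recipe; parameters here are assumptions.)"""
    A = np.exp(-(eps * (x[:, None] - centers[None, :])) ** 2)
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > tol * s[0]
    c = Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])
    return c, A

x = np.linspace(-1, 1, 200)
y = np.tanh(5 * x)
centers = np.linspace(-1.2, 1.2, 80)   # redundant: more centers than needed
c, A = rbf_fit(x, y, centers, eps=5.0)
err = np.max(np.abs(A @ c - y))        # small despite huge condition number
```

Solving the same system with an unregularized method would amplify rounding errors by the condition number of A; the truncation step is what restores stability.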


  • Numerical Analysis Group Internal Seminar
9 May 2017
14:00
Amirali Ahmadi
Abstract


The joint spectral radius (JSR) of a set of matrices characterizes the maximum growth rate that can be achieved by multiplying them in arbitrary order. This concept, which essentially generalizes the notion of the "largest eigenvalue" from one matrix to many, was introduced by Rota and Strang in the early 60s and has since emerged in many areas of application such as stability of switched linear systems, computation of the capacity of codes, convergence of consensus algorithms, trackability of graphs, and many others. The JSR is a very difficult quantity to compute even for a pair of matrices. In this talk, we present optimization-based algorithms (e.g., via semidefinite programming or dynamic programming) that can either compute the JSR exactly in special cases or approximate it with arbitrary prescribed accuracy in the general case.

Based on joint work (in different subsets) with Raphael Jungers, Pablo Parrilo, and Mardavij Roozbehani.
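To make the definition concrete (a brute-force sketch of my own, not one of the talk's algorithms): every product P of length k gives the two-sided bound rho(P)^(1/k) <= JSR <= max ||P||^(1/k), so enumerating all products of a fixed length brackets the JSR.

```python
import itertools
from functools import reduce
import numpy as np

def jsr_bounds(mats, k):
    """Brute-force two-sided bounds on the joint spectral radius from
    all len(mats)**k products of length k:
        max rho(P)**(1/k)  <=  JSR  <=  max ||P||_2**(1/k).
    Cost grows exponentially in k, so this is for tiny instances only."""
    lower, upper = 0.0, 0.0
    for idx in itertools.product(range(len(mats)), repeat=k):
        P = reduce(np.matmul, (mats[i] for i in idx))
        lower = max(lower, max(abs(np.linalg.eigvals(P))) ** (1.0 / k))
        upper = max(upper, np.linalg.norm(P, 2) ** (1.0 / k))
    return lower, upper

# a well-studied pair: here the two bounds already coincide at k = 2
# (both equal the golden ratio, certifying the JSR exactly)
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])
lo, hi = jsr_bounds([A, B], k=2)
```

When the bounds do not close, one must increase k, and the exponential blow-up is precisely why the semidefinite-programming relaxations discussed in the talk are needed.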

  • Numerical Analysis Group Internal Seminar
2 May 2017
14:00
Abstract

The past few years have seen a surge of interest in nonconvex reformulations of convex optimizations using nonlinear reparameterizations of the optimization variables. Compared with the convex formulations, the nonconvex ones typically involve many fewer variables, allowing them to scale to scenarios with millions of variables. However, one pays the price of solving nonconvex optimizations to global optimality, which is generally believed to be impossible. In this talk, I will characterize the nonconvex geometries of several low-rank matrix optimizations. In particular, I will argue that under reasonable assumptions, each critical point of the nonconvex problems either corresponds to the global optimum of the original convex optimizations, or is a strict saddle point where the Hessian matrix has a negative eigenvalue. Such a geometric structure ensures that many local search algorithms can converge to the global optimum with random initializations. Our analysis is based on studying how the convex geometries are transformed under nonlinear parameterizations.
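The prototypical instance of this idea (an illustrative toy of mine, not necessarily the talk's exact setting) replaces a PSD matrix variable X by the factorization X = U U^T with U of size n-by-r, shrinking the variable count from O(n^2) to n*r; plain gradient descent from a random start then finds the global optimum despite the nonconvexity.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 20, 2

# ground-truth rank-r PSD matrix for a toy matrix-recovery instance
U_true = rng.standard_normal((n, r))
M = U_true @ U_true.T

# nonconvex objective f(U) = (1/4) * ||U U^T - M||_F^2,
# whose gradient is (U U^T - M) U
U = rng.standard_normal((n, r))
step = 5e-3
for _ in range(2000):
    U -= step * (U @ U.T - M) @ U

# random initialisation avoids the strict saddles, so U U^T
# converges to M even though f is nonconvex
rel_err = np.linalg.norm(U @ U.T - M) / np.linalg.norm(M)
```

The critical points of this f illustrate the dichotomy in the abstract: they are either global minimizers (U U^T = M) or strict saddles with a descent direction, which is why simple local search succeeds here.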
