14:00
Analysis of Magnus expansion methods in the semiclassical regime
Abstract
Methods based on the Magnus expansion are an efficient class of integrators for solving Schrödinger equations that feature time-dependent potentials, such as lasers. These methods have been found to be highly effective in computational quantum chemistry since the pioneering work of Tal-Ezer and Kosloff in the early 1990s. The convergence of the Magnus expansion, however, is usually understood only for ODEs, and traditional analysis suggests much poorer performance of these methods than is observed experimentally. It was not until the work of Hochbruck and Lubich in 2003 that a rigorous analysis justifying the application to PDEs with unbounded operators, such as the Schrödinger equation, was presented. In this talk we will extend this analysis to the semiclassical regime, where the highly oscillatory solution conventionally suggests large errors and a requirement for very small time steps.
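As a minimal illustration of the method class (not of the semiclassical analysis in the talk), here is the second-order Magnus integrator, the exponential midpoint rule, applied to an invented time-dependent Hamiltonian; all matrices and parameters below are illustrative:

```python
import numpy as np

def expm_herm(H, tau):
    """Return exp(-1j*tau*H) for Hermitian H via its eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return (V * np.exp(-1j * tau * w)) @ V.conj().T

def magnus_midpoint(H, u0, T, n):
    """Propagate i u'(t) = H(t) u(t) by the exponential midpoint rule,
    the second-order Magnus method: u_{k+1} = exp(-i h H(t_k + h/2)) u_k."""
    h = T / n
    u = np.asarray(u0, dtype=complex)
    for k in range(n):
        u = expm_herm(H(k * h + h / 2), h) @ u
    return u

# Invented toy Hamiltonian: a static part plus an oscillating "laser" term.
A = np.diag([0.0, 1.0, 2.0])
B = np.array([[0.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
H = lambda t: A + np.cos(2.0 * t) * B
u = magnus_midpoint(H, np.array([1.0, 0.0, 0.0]), T=1.0, n=200)
```

Because each step applies an exact exponential of a Hermitian matrix, the propagation is unitary and the norm of the state is preserved to roundoff.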
14:30
New approaches for global optimization methods
Abstract
We present some dimensionality reduction techniques for global optimization algorithms, in order to increase their scalability. Inspired by ideas in machine learning, and extending the random-projection approach of Zhang et al. (2016), we present new algorithmic approaches for global optimization with theoretical guarantees of good behaviour and encouraging numerical results.
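The random-projection idea can be sketched as follows: draw a single random embedding and optimize only over the reduced variables. This is a toy sketch with a crude random search, not the algorithms of the talk, and the test function is invented:

```python
import numpy as np

def random_embedding_search(f, D, d, n_iters=3000, seed=0):
    """Minimize a D-dimensional objective f by drawing one random embedding
    A in R^{D x d} and running a simple accept-if-better random search over
    the d reduced variables y, always evaluating f at x = A @ y."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((D, d)) / np.sqrt(d)
    y_best = rng.standard_normal(d)
    f_best = f(A @ y_best)
    step = 1.0
    for _ in range(n_iters):
        y = y_best + step * rng.standard_normal(d)
        fy = f(A @ y)
        if fy < f_best:
            y_best, f_best = y, fy
        step *= 0.998  # slowly shrink the search radius
    return A @ y_best, f_best

# A 100-dimensional test function with low effective dimension (only two
# coordinates matter), the setting in which random embeddings shine.
f = lambda x: float(x[0] ** 2 + x[1] ** 2)
x_best, f_best = random_embedding_search(f, D=100, d=5)
```

The point of the construction is that the search runs in 5 variables rather than 100, while the objective is always evaluated in the original space.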
14:00
Derivative-free optimisation methods for nonlinear least-squares problems
Abstract
Derivative-free optimisation (DFO) algorithms are a class of optimisation methods for situations in which one is unable to compute or estimate derivatives of the objective, for instance when the objective is noisy or very expensive to evaluate. In this talk I will present a flexible DFO framework for unconstrained nonlinear least-squares problems, and in particular discuss its performance on noisy problems.
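The model-based framework of the talk is not reproduced here, but the flavour of derivative-free nonlinear least squares can be sketched with a finite-difference Gauss-Newton iteration, which uses only residual evaluations; all problem data below are invented for illustration:

```python
import numpy as np

def fd_gauss_newton(r, x0, h=1e-6, tol=1e-10, max_iter=50):
    """Gauss-Newton for min ||r(x)||^2 using only evaluations of the
    residual vector r: the Jacobian is approximated by forward
    differences, then the usual Gauss-Newton step is taken.
    A minimal sketch, not the model-based framework of the talk."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        rx = r(x)
        J = np.empty((rx.size, x.size))
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = h
            J[:, j] = (r(x + e) - rx) / h   # forward-difference column
        step, *_ = np.linalg.lstsq(J, -rx, rcond=None)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Invented calibration problem: fit y = exp(a t) + b at three data points.
t = np.array([0.0, 1.0, 2.0])
y = np.exp(0.5 * t) + 1.0
r = lambda p: np.exp(p[0] * t) + p[1] - y
p = fd_gauss_newton(r, [0.0, 0.0])
```

For this zero-residual problem the iteration recovers the true parameters (0.5, 1.0) to high accuracy.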
14:30
The 2017 Problem Solving Squad
Abstract
Each year Prof. Trefethen gives the Problem Solving Squad a sequence of problems with no hints, one per week, where the solution of each problem is a single real number to be computed by any method available. We will present this year's three problems, involving (1) an S-shaped bifurcation curve, (2) the shortest path around a random web, and (3) the switching of a time-varying system to maximize a matrix norm.
The 14 students this year are Simon Vary plus InFoMM cohort 2: Matteo Croci, Davin Lunz, Michael McPhail, Tori Pereira, Lindon Roberts, Caoimhe Rooney, Ian Roper, Thomas Roy, Tino Sulzer, Bogdan Toader, Florian Wechsung, Jess Williams, and Fabian Ying. The presentations will be by (1) Lindon Roberts, (2) Florian Wechsung, and (3) Thomas Roy.
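As a hypothetical illustration of a switching problem in the spirit of (3), whose actual data are not given here, one naive first attempt is to choose the switching sequence greedily; the matrices below are invented, and the true extremal sequence generally requires global optimization:

```python
import numpy as np

def greedy_switching(A, B, n_steps):
    """At each step, apply whichever of the two matrices gives the larger
    2-norm of the accumulated product. A greedy heuristic only: it need
    not find the truly norm-maximizing switching sequence."""
    P = np.eye(A.shape[0])
    seq = []
    for _ in range(n_steps):
        nA = np.linalg.norm(A @ P, 2)
        nB = np.linalg.norm(B @ P, 2)
        if nA >= nB:
            P, choice = A @ P, 'A'
        else:
            P, choice = B @ P, 'B'
        seq.append(choice)
    return P, seq

# Two invented shear matrices whose alternation produces rapid norm growth.
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])
P, seq = greedy_switching(A, B, 10)
```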
14:00
Sparse Kerdock matrices for compressive sensing
Abstract
Delsarte-Goethals frames are a popular choice for deterministic measurement matrices in compressive sensing. I will show that it is possible to construct extremely sparse matrices which share precisely the same row space as Delsarte-Goethals frames. I will also describe the combinatorial block design underlying the construction and make a connection to Steiner equiangular tight frames.
14:00
Random functions in Chebfun
Abstract
What's the continuous analog of randn? In recent months I've been exploring such questions with Abdul-Lateef Haji-Ali and other members of the Chebfun team, and Chebfun now has commands randnfun, randnfun2, randnfunsphere, and randnfundisk. These are based on finite Fourier series with random coefficients, and interesting questions arise in the "white noise" limit as the length of the series approaches infinity and random ODEs become stochastic differential equations. This work is at an early stage and we are grateful for input from stochastic experts at Oxford and elsewhere.
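The construction can be sketched in a few lines of NumPy. This is a plain sketch of the idea, not Chebfun's implementation, and the normalization below is one natural choice rather than Chebfun's:

```python
import numpy as np

def randnfun_sketch(m, n_grid=1000, seed=0):
    """Smooth random function on [0,1]: a finite Fourier series with
    independent N(0,1) coefficients, scaled so the pointwise variance is
    independent of the truncation m. As m grows, samples approach white
    noise; for fixed m they are smooth."""
    rng = np.random.default_rng(seed)
    a = rng.standard_normal(m + 1)          # cosine coefficients
    b = rng.standard_normal(m + 1)          # sine coefficients
    x = np.linspace(0, 1, n_grid)
    k = np.arange(m + 1)[:, None]
    f = (a[:, None] * np.cos(2 * np.pi * k * x)
         + b[:, None] * np.sin(2 * np.pi * k * x)).sum(axis=0)
    return x, f / np.sqrt(m + 1)

x, f = randnfun_sketch(m=50)
```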
14:30
Ill-conditioning and numerical stability in radial basis functions (RBFs) using frame theory
Abstract
We analyse the numerical approximation of functions using radial basis functions in the context of frames. Frames generalize the notion of a basis by allowing redundancy, while being restricted by a so-called frame condition. The theory of numerical frame approximations allows the study of the ill-conditioning that arises inherently from this redundancy, and suggests discretization techniques that nevertheless achieve numerical stability to machine precision. We apply the theory to radial basis functions.
14:00
Computation of the joint spectral radius by optimization techniques
Abstract
The joint spectral radius (JSR) of a set of matrices characterizes the maximum growth rate that can be achieved by multiplying them in arbitrary order. This concept, which essentially generalizes the notion of the "largest eigenvalue" from one matrix to many, was introduced by Rota and Strang in the early 60s and has since emerged in many areas of application, such as stability of switched linear systems, computation of the capacity of codes, convergence of consensus algorithms, trackability of graphs, and many others. The JSR is a very difficult quantity to compute, even for a pair of matrices. In this talk, we present optimization-based algorithms (e.g., via semidefinite programming or dynamic programming) that can either compute the JSR exactly in special cases or approximate it with arbitrary prescribed accuracy in the general case.
Based on joint work (in different subsets) with Raphael Jungers, Pablo Parrilo, and Mardavij Roozbehani.
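For intuition, the definition of the JSR immediately yields a brute-force lower bound, since the spectral radius of any length-L product, raised to the power 1/L, never exceeds the JSR. The optimization-based algorithms of the talk are far more sophisticated and also provide upper bounds; this sketch only illustrates the definition:

```python
import numpy as np
from itertools import product

def jsr_lower_bound(mats, max_len=6):
    """Lower bound on the joint spectral radius: over all products of
    length up to max_len, take the largest value of rho(P)^(1/length)."""
    best = 0.0
    for L in range(1, max_len + 1):
        for word in product(mats, repeat=L):
            P = np.eye(mats[0].shape[0])
            for M in word:
                P = M @ P
            rho = max(abs(np.linalg.eigvals(P)))
            best = max(best, rho ** (1.0 / L))
    return best

# Classic example: for this pair the JSR is the golden ratio, attained by
# the alternating product ABAB...
A = np.array([[1.0, 1.0], [0.0, 1.0]])
B = np.array([[1.0, 0.0], [1.0, 1.0]])
jsr_lb = jsr_lower_bound([A, B])
```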
14:00
Nonconvex geometry of low-rank optimizations
Abstract
The past few years have seen a surge of interest in nonconvex reformulations of convex optimization problems obtained through nonlinear reparameterizations of the optimization variables. Compared with the convex formulations, the nonconvex ones typically involve many fewer variables, allowing them to scale to scenarios with millions of variables. However, one pays the price of solving nonconvex optimization problems to global optimality, which is generally believed to be impossible. In this talk, I will characterize the nonconvex geometries of several low-rank matrix optimization problems. In particular, I will argue that, under reasonable assumptions, each critical point of the nonconvex problem either corresponds to the global optimum of the original convex problem or is a strict saddle point at which the Hessian matrix has a negative eigenvalue. Such a geometric structure ensures that many local search algorithms can converge to the global optimum with random initializations. Our analysis is based on studying how the convex geometries are transformed under nonlinear parameterizations.
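The simplest instance of this phenomenon can be illustrated with the factored formulation of low-rank PSD approximation, where plain gradient descent from a random start reaches the global optimum; this is a toy sketch under a favourable, well-conditioned target, not the general setting of the talk:

```python
import numpy as np

def lowrank_factored_gd(M, r, lr=0.01, n_iter=2000, seed=0):
    """Minimize ||U U^T - M||_F^2 over U in R^{n x r} by gradient descent
    from a small random start: the nonlinear reparameterization X = U U^T
    replaces an n x n variable by an n x r one. For this objective the
    critical points are global minima or strict saddles, so a random
    initialization suffices to find the global optimum."""
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    U = 0.1 * rng.standard_normal((n, r))
    for _ in range(n_iter):
        G = 4 * (U @ U.T - M) @ U   # gradient of ||U U^T - M||_F^2
        U -= lr * G
    return U

# Invented rank-2 PSD target with well-separated nonzero eigenvalues.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0], [0.5, 0.5]])
M = X @ X.T
U = lowrank_factored_gd(M, r=2)
```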
14:30
Improving the rational Remez algorithm via adaptive barycentric representations
14:00
Efficient DC algorithm for sparse optimization
Abstract
In various research fields, such as machine learning, compressed sensing, and operations research, optimization problems that seek sparse solutions via a cardinality constraint or rank constraint are studied. We formulate such problems as DC (Difference of two Convex functions) optimization problems and apply the DC Algorithm (DCA) to them. While a subproblem must be solved at each DCA iteration, its closed-form solution is easily obtained by a soft-thresholding operation. Numerical experiments demonstrate the efficiency of the proposed DCA in comparison with existing methods.
This is joint work with J. Gotoh (Chuo Univ.) and K. Tono (U. Tokyo).
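The soft-thresholding operation mentioned in the abstract has a one-line closed form: it solves min_x 0.5*||x - v||^2 + lam*||x||_1 componentwise. A minimal sketch (the surrounding DCA loop is not reproduced here):

```python
import numpy as np

def soft_threshold(v, lam):
    """Closed-form proximal operator of the l1 norm: shrink each entry of
    v toward zero by lam, setting entries with |v_i| <= lam to zero."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

# Entries larger than lam in magnitude shrink; smaller ones vanish.
res = soft_threshold(np.array([3.0, -0.5, 1.2]), 1.0)
```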
14:00
Radial basis function approximation of solutions to elliptic PDEs
14:00
Perturbation of higher-order singular values
Abstract
Joint work with Wolfgang Hackbusch and Daniel Kressner
14:30
A robust parallel algorithm for combinatorial compressed sensing
14:00
Learning sparse additive models with interactions in high dimensions
14:30
Quadrature rules and rational approximations to the exponential function
14:00
Maximizing the AUC function for classification using black-box optimization methods
14:30
Sync-Rank: Robust ranking, constrained ranking and rank aggregation via eigenvector and SDP synchronization
Abstract
We consider the classic problem of establishing a statistical ranking of a set of n items given a set of inconsistent and incomplete pairwise comparisons between such items. Instantiations of this problem occur in numerous applications in data analysis (e.g., ranking teams in sports data), computer vision, and machine learning. We formulate the above problem of ranking with incomplete noisy information as an instance of the group synchronization problem over the group SO(2) of planar rotations, whose usefulness has been demonstrated in numerous applications in recent years. Its least squares solution can be approximated by either a spectral or a semidefinite programming (SDP) relaxation, followed by a rounding procedure. We perform extensive numerical simulations on both synthetic and real-world data sets (Premier League soccer games, a Halo 2 game tournament and NCAA College Basketball games) showing that our proposed method compares favorably to other algorithms from the recent literature.
We propose a similar synchronization-based algorithm for the rank-aggregation problem, which integrates pairwise comparisons given by different rating systems on the same set of items into a single globally consistent ranking. We also discuss the problem of semi-supervised ranking, where information on the ground-truth rank of a subset of players is available, and propose an SDP-based algorithm that recovers the ranks of the remaining players. Finally, synchronization-based ranking, combined with a spectral technique for the densest subgraph problem, allows one to extract locally consistent partial rankings, in other words to identify the ranks of a small subset of players whose pairwise comparisons are less noisy than the rest of the data, something other methods are not able to do.
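A bare-bones sketch of the spectral relaxation step, assuming noiseless complete comparisons; the mapping of rank offsets to angles and the normalization below are illustrative choices, not the paper's exact pipeline:

```python
import numpy as np

def sync_rank_spectral(C):
    """Spectral SO(2) synchronization for ranking: given a skew-symmetric
    matrix C of pairwise rank offsets (C[i, j] ~ r_i - r_j, zero where no
    comparison is available), embed each comparison as a planar rotation,
    take the top eigenvector of the resulting Hermitian matrix, and rank
    the items by the recovered angles."""
    n = C.shape[0]
    delta = np.pi / n                       # keep all angles inside (-pi, pi)
    H = np.exp(1j * delta * C) * (C != 0)   # zero out missing comparisons
    w, V = np.linalg.eigh(H)
    v = V[:, -1]                            # eigenvector of largest eigenvalue
    angles = np.angle(v / v[np.argmax(np.abs(v))])  # fix the global rotation
    return np.argsort(angles)

# Noiseless complete comparisons for 5 items with true ranks 0..4.
r = np.array([2, 0, 4, 1, 3])
C = r[:, None] - r[None, :]
order = sync_rank_spectral(C)
```

In the noiseless complete case the top eigenvector recovers the true angles exactly, so the recovered order is the true ranking.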
14:00
Interpolation and quadrature in perturbed points
Abstract
The trigonometric interpolants to a periodic function f in equispaced points converge if f is Dini-continuous, and the associated quadrature formula, the trapezoidal rule, converges if f is continuous. What if the points are perturbed? Amazingly little has been done on this problem, or on its algebraic (i.e. nonperiodic) analogue. I will present new results, obtained jointly with Anthony Austin, which contain some surprises.
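For intuition, a natural generalization of the trapezoidal rule to perturbed periodic points assigns each node half the distance between its two neighbours (wrapping around) as its weight. The sketch below, with an invented smooth periodic integrand, shows the equispaced rule at essentially full accuracy while the perturbed rule is far less accurate:

```python
import numpy as np

def trapezoid_periodic(f, x):
    """Trapezoidal rule for a 1-periodic function at nodes x in [0, 1):
    each weight is half the distance between the node's two neighbours,
    with wraparound; this reduces to the classical rule for equispaced x."""
    x = np.sort(x)
    gaps = np.diff(np.concatenate([x, [x[0] + 1.0]]))
    w = 0.5 * (gaps + np.roll(gaps, 1))
    return np.sum(w * f(x))

f = lambda t: np.exp(np.sin(2 * np.pi * t))     # smooth and 1-periodic
n = 100
rng = np.random.default_rng(0)
x_eq = np.arange(n) / n
x_pert = x_eq + rng.uniform(-0.3 / n, 0.3 / n, n)  # perturbed grid
I_eq = trapezoid_periodic(f, x_eq)
I_pert = trapezoid_periodic(f, x_pert)
```

The exact integral is the modified Bessel value I_0(1) ≈ 1.2660658778; the equispaced trapezoidal rule converges geometrically for analytic periodic integrands, while the perturbed rule is only algebraically accurate.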