14:00
Igusa stacks and the cohomology of Shimura varieties
Abstract
Sharp error bounds for approximate eigenvalues and singular values from subspace methods
Abstract
Irina-Beatrice Haas will talk about 'Sharp error bounds for approximate eigenvalues and singular values from subspace methods'.
Subspace methods are commonly used for finding approximate eigenvalues and singular values of large-scale matrices. Once a subspace is found, the Rayleigh-Ritz method (for symmetric eigenvalue problems) and Petrov-Galerkin projection (for singular values) are the de facto methods for extracting eigenvalues and singular values. In this work we derive error bounds for approximate eigenvalues obtained via the Rayleigh-Ritz process. Our bounds are quadratic in the residual corresponding to each Ritz value while also being robust to clustered Ritz values, which is a key improvement over existing results. We apply these bounds to several methods for computing eigenvalues and singular values, including Krylov methods and randomized algorithms.
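The Rayleigh-Ritz extraction described above can be sketched in a few lines of NumPy. This is an illustrative example, not the speaker's code: given a symmetric matrix A and an orthonormal basis Q of a subspace, the Ritz values are the eigenvalues of the small projected matrix Qᵀ A Q, and the residual norms that enter the quadratic bounds are ‖A x − θ x‖ for each Ritz pair (θ, x). The Krylov basis construction here is deliberately naive.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 10
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                       # symmetric test matrix

# Naive Krylov basis: orthonormalize [v, Av, ..., A^{k-1} v] via QR
V = np.empty((n, k))
V[:, 0] = rng.standard_normal(n)
for j in range(1, k):
    V[:, j] = A @ V[:, j - 1]
Q, _ = np.linalg.qr(V)

# Rayleigh-Ritz: eigenpairs of the k-by-k projected matrix Q^T A Q
H = Q.T @ A @ Q
theta, S = np.linalg.eigh(H)            # Ritz values and coefficient vectors
X = Q @ S                               # Ritz vectors in the original space

# Residual norms ||A x - theta x||; the bounds discussed in the talk
# are quadratic in these quantities
res = np.linalg.norm(A @ X - X * theta, axis=0)
```

By the Cauchy interlacing theorem, every Ritz value lies between the smallest and largest eigenvalues of A, and the extreme Ritz values typically converge first as the subspace grows.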
The latent variable proximal point algorithm for variational problems with inequality constraints
Abstract
General Matrix Optimization
Abstract
Casey Garner will talk about 'General Matrix Optimization'.
Since our early days in mathematics, we have been aware of two important characteristics of a matrix, namely, its coordinates and its spectrum. We have also witnessed the growth of matrix optimization models from matrix completion to semidefinite programming; however, only recently has the question of solving matrix optimization problems with general spectral and coordinate constraints been studied. In this talk, we shall discuss recent work done to study these general matrix optimization models and how they relate to topics such as Riemannian optimization, approximation theory, and more.
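A small concrete instance of a problem with both spectral and coordinate constraints, illustrating the setting rather than the speaker's methods: find a matrix that is positive semidefinite (a spectral constraint) and has unit diagonal (a coordinate constraint), i.e. a correlation matrix near a given symmetric matrix. A naive alternating-projections sketch in NumPy:

```python
import numpy as np

def proj_psd(X):
    # Spectral constraint: project onto the PSD cone by
    # clipping negative eigenvalues to zero
    w, V = np.linalg.eigh((X + X.T) / 2)
    return (V * np.maximum(w, 0)) @ V.T

def proj_unit_diag(X):
    # Coordinate constraint: overwrite the diagonal with ones
    Y = X.copy()
    np.fill_diagonal(Y, 1.0)
    return Y

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
X = (B + B.T) / 2                       # symmetric, generally indefinite

# Naive alternating projections between the two constraint sets;
# more sophisticated methods (e.g. Dykstra's correction) converge
# to the actual nearest correlation matrix
for _ in range(200):
    X = proj_unit_diag(proj_psd(X))
```

The interplay between the eigenvalue-based projection and the entrywise one is exactly the kind of coupling between spectrum and coordinates that makes these general models nontrivial.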