16:00
Extragradient Methods for Modern Machine Learning: New Convergence Guarantees, Step-Size Rules, and Stochastic Variants
Abstract
Extragradient methods are a fundamental class of algorithms for solving min-max optimization problems and variational inequalities. While the classical theory is largely developed under smoothness and other relatively restrictive assumptions, many problems arising in modern machine learning call for analysis in weaker regularity regimes and in stochastic large-scale settings. In this talk, we present new convergence results for deterministic and stochastic extragradient methods beyond the classical framework. In particular, we establish convergence guarantees under the (L0, L1)-Lipschitz condition and derive new step-size rules that expand the range of provably convergent regimes. We also introduce Polyak-type step sizes for deterministic and stochastic extragradient methods, leading to adaptive variants with favourable theoretical properties and practical performance. Our results focus primarily on monotone problems, with extensions to selected structured non-monotone settings. We conclude with numerical experiments that illustrate the theory and the empirical behaviour of the proposed methods.
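For readers unfamiliar with the method discussed in the abstract, the following is a minimal sketch of the classical (deterministic) extragradient update on a bilinear saddle-point problem, min_x max_y xy, whose operator F(x, y) = (y, -x) is monotone. The constant step size and iteration count are illustrative choices for this toy problem, not the step-size rules from the talk.

```python
def F(x, y):
    # Monotone operator of the bilinear game f(x, y) = x * y:
    # descend in x, ascend in y.
    return y, -x

def extragradient(x, y, gamma=0.1, iters=2000):
    for _ in range(iters):
        # Extrapolation (half) step
        gx, gy = F(x, y)
        xh, yh = x - gamma * gx, y - gamma * gy
        # Update step uses the operator at the extrapolated point
        gx, gy = F(xh, yh)
        x, y = x - gamma * gx, y - gamma * gy
    return x, y

x, y = extragradient(1.0, 1.0)
print((x * x + y * y) ** 0.5)  # distance to the saddle point (0, 0)
```

On this problem plain gradient descent-ascent spirals outward, while the extrapolated operator evaluation lets extragradient contract toward the saddle point for small step sizes.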
Bio:
Nicolas Loizou is an Assistant Professor in the Department of Applied Mathematics and Statistics and the Mathematical Institute for Data Science (MINDS) at Johns Hopkins University, where he leads the Optimization and Machine Learning Lab. He holds secondary appointments in the Departments of Computer Science and Electrical and Computer Engineering and is a member of the Johns Hopkins Data Science Institute and the Ralph O’Connor Sustainable Energy Institute (ROSEI).
Prior to this, he was a Postdoctoral Research Fellow at Mila - Quebec Artificial Intelligence Institute and the University of Montreal. He holds a Ph.D. in Optimization and Operational Research from the School of Mathematics, University of Edinburgh, an M.Sc. in Computing from Imperial College London, and a B.Sc. in Mathematics from the National and Kapodistrian University of Athens.
His research interests include large-scale optimization, machine learning, randomized numerical linear algebra, distributed and decentralized algorithms, algorithmic game theory, and federated learning. He currently serves as action editor for Information and Inference: A Journal of the IMA, Optimization Methods and Software, and Transactions on Machine Learning Research. He has received several awards and fellowships, including the OR Society's 2019 Doctoral Award (runner-up) for the "Most Distinguished Body of Research leading to the Award of a Doctorate in the field of Operational Research", the IVADO Fellowship, the COAP 2020 Best Paper Award, the CISCO 2023 Research Award, and the Catalyst 2025 Award.
14:15
L^2 and twistor metrics for hyperbolic monopoles
Abstract
This talk will present a new approach to the geometry of moduli spaces of hyperbolic monopoles. It is well-known that the L^2 metric on the moduli space of hyperbolic monopoles, defined using a Coulomb gauge-fixing condition, diverges. Recently we have shown that a supersymmetry-inspired gauge-fixing condition cures this divergence, resulting in a pluricomplex geometry that generalises the hyperkähler geometry of Euclidean monopole moduli spaces. We will compare this with metrics introduced by Nash and Bielawski-Schwachhofer, and present explicit calculations of both metrics for charge 2 monopoles.
Renormalisation group on Lorentzian manifolds using (p)AQFT
Abstract
I will start the talk by discussing the renormalisation group in perturbative algebraic quantum field theory (pAQFT) and its non-perturbative incarnation acting on the Buchholz-Fredenhagen dynamical C*-algebra. I will also explain how pAQFT can be used to derive functional renormalisation group (FRG) equations that generalise the Wetterich equation to globally hyperbolic Lorentzian manifolds and arbitrary states (beyond the usual FRG in the vacuum).