Subspace Correction Methods for Convex Optimization: Algorithms, Theory, and Applications
Abstract
Speaker: Yongho Park
This talk considers a framework of subspace correction methods for convex optimization, which provides a unified perspective for the design and analysis of a wide range of iterative methods, including advanced domain decomposition and multigrid methods. We first develop a convergence theory for parallel subspace correction methods based on the observation that these methods can be interpreted as nonlinearly preconditioned gradient descent methods. This viewpoint leads to a simpler and sharper analysis compared with existing approaches. We further show how the theory can be extended to semicoercive and nearly semicoercive problems. In addition, we explore connections between subspace correction methods and other classes of iterative algorithms, such as alternating projection methods, through the lens of convex duality, thereby enabling a unified treatment. Several applications are presented, including nonlinear partial differential equations, variational inequalities, and mathematical imaging problems. The talk concludes with a discussion of relevant and emerging research directions.
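To illustrate the interpretation of parallel subspace correction as a nonlinearly preconditioned gradient descent method, here is a minimal sketch for a convex quadratic with two overlapping coordinate blocks as subspaces. This is an illustrative assumption, not the speaker's implementation; the block structure, damping parameter `tau`, and problem data are all chosen for demonstration.

```python
import numpy as np

# Illustrative sketch (not the speaker's code): parallel subspace
# correction for the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# with two overlapping coordinate blocks as subspaces.

rng = np.random.default_rng(0)
n = 8
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite
b = rng.standard_normal(n)
x_star = np.linalg.solve(A, b)       # exact minimizer, for reference

blocks = [np.arange(0, 5), np.arange(3, 8)]  # overlapping subspaces
tau = 0.5                                    # damping for the parallel step

x = np.zeros(n)
for _ in range(200):
    grad = A @ x - b
    correction = np.zeros(n)
    # The local solves are independent and could run in parallel:
    # each block minimizes f restricted to its subspace at the current x.
    for idx in blocks:
        correction[idx] += np.linalg.solve(A[np.ix_(idx, idx)], -grad[idx])
    # The summed local corrections act as a preconditioner applied to the
    # negative gradient, so the damped update is a preconditioned
    # gradient descent step.
    x = x + tau * correction

print(np.linalg.norm(x - x_star))    # distance to the exact minimizer
```

For this smooth quadratic the preconditioner is linear; in the convex setting discussed in the talk, the local solves are nonlinear minimizations, which is what makes the resulting preconditioner nonlinear.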