Date: Thu, 03 Nov 2022
Time: 14:00 - 15:00
Location: L3
Speaker: Hussam Al Daas
Organisation: STFC Rutherford Appleton Laboratory

Solving sparse linear systems is omnipresent in scientific computing. Direct approaches based on matrix factorization are very robust, and since they can be used as a black box, they are easy to integrate into other software. However, the memory requirement of direct approaches scales poorly with the problem size, and the algorithms underpinning sparse direct solvers are poorly suited to parallel computation. Multilevel domain decomposition (MDD) methods are among the most efficient iterative methods for solving sparse linear systems. One of the main technical difficulties in using efficient MDD methods (and most other efficient preconditioners) is that they require information from the underlying problem, which prevents them from being used as a black box. This was the motivation for developing the widely used algebraic multigrid methods, for example. I will present a series of recently developed robust and fully algebraic MDD methods, i.e., methods that can be constructed given only the coefficient matrix and that guarantee an a priori prescribed convergence rate. The series consists of preconditioners for sparse least-squares problems, sparse SPD matrices, general sparse matrices, and saddle-point systems. Numerical experiments illustrate the effectiveness, wide applicability, and scalability of the proposed preconditioners. A comparison of each preconditioner against state-of-the-art alternatives is also presented.
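To make the "fully algebraic" idea concrete, the sketch below builds a preconditioner from nothing but the coefficient matrix and uses it inside conjugate gradients. It is only a one-level additive Schwarz (non-overlapping block Jacobi) illustration of the black-box, matrix-only principle, not the multilevel methods described in the talk; the model problem, the contiguous block partitioning, and the SciPy routines are choices made here for the example.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def block_jacobi_preconditioner(A, block_size):
    """One-level additive Schwarz (non-overlapping block Jacobi) preconditioner.

    Built from the coefficient matrix alone: the unknowns are split into
    contiguous blocks, each diagonal block is factorized once, and applying
    the preconditioner solves the block systems independently.
    """
    n = A.shape[0]
    A = A.tocsr()
    blocks = []
    for start in range(0, n, block_size):
        idx = np.arange(start, min(start + block_size, n))
        A_bb = A[idx, :][:, idx].tocsc()       # extract the diagonal block
        blocks.append((idx, spla.splu(A_bb)))  # sparse LU of the block

    def apply(r):
        z = np.zeros(n)
        for idx, lu in blocks:
            z[idx] = lu.solve(r[idx])          # independent block solves
        return z

    return spla.LinearOperator(A.shape, matvec=apply)

# Model problem (an assumption for this example): 2-D Laplacian, which is SPD,
# solved with preconditioned conjugate gradients.
m = 64
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m))
A = sp.kronsum(T, T, format="csr")
b = np.ones(A.shape[0])
M = block_jacobi_preconditioner(A, block_size=256)
x, info = spla.cg(A, b, M=M)
print("converged" if info == 0 else f"cg returned info = {info}")
```

Unlike the methods in the talk, a one-level preconditioner of this kind does not come with an a priori prescribed convergence rate; the MDD preconditioners add an algebraically constructed coarse space on top of such local solves to obtain that guarantee.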
