Structured matrix approximations via tensor decompositions

11 March 2021

We provide a computational framework for approximating a class of structured matrices (e.g., block Toeplitz, block banded). Our approach has three steps: map the structured matrix to tensors, use tensor compression algorithms, and map the compressed tensors back to obtain two different matrix representations --- sum of Kronecker products and block low-rank format. The use of tensor decompositions enables us to uncover latent structure in the matrices and leads to computationally efficient algorithms. The resulting matrix approximations are memory efficient, easy to compute with, and preserve the error due to the tensor compression in the Frobenius norm. While our framework is quite general, we illustrate the potential of our method on structured matrices from three applications: system identification, space-time covariance matrices, and image deblurring.
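To give a flavor of the sum-of-Kronecker-products representation mentioned above, here is a minimal sketch (not the authors' algorithm) of the classical Van Loan--Pitsianis idea: rearranging a block matrix so that its best sum-of-Kronecker-products approximation becomes a low-rank approximation of the rearranged matrix, computable via the SVD. The function name and block-size parameters are illustrative assumptions.

```python
import numpy as np

def kron_sum_approx(A, m1, n1, m2, n2, r):
    """Approximate A (of size m1*m2 x n1*n2, viewed as an m1 x n1 grid of
    m2 x n2 blocks) by a sum of r Kronecker products B_k (x) C_k.
    Illustrative sketch of the Van Loan--Pitsianis rearrangement trick,
    not the method from the talk."""
    # Rearrange A so that each row of R is the vectorized (i, j) block:
    # R[i*n1 + j, p*n2 + q] = A[i*m2 + p, j*n2 + q].
    R = A.reshape(m1, m2, n1, n2).transpose(0, 2, 1, 3).reshape(m1 * n1, m2 * n2)
    # A rank-r truncated SVD of R yields the best (Frobenius-norm)
    # sum of r Kronecker products.
    U, s, Vt = np.linalg.svd(R, full_matrices=False)
    terms = []
    for k in range(r):
        B = (np.sqrt(s[k]) * U[:, k]).reshape(m1, n1)
        C = (np.sqrt(s[k]) * Vt[k, :]).reshape(m2, n2)
        terms.append((B, C))
    return terms
```

For a matrix that is exactly a sum of two Kronecker products, the rank-2 approximation recovers it to machine precision; for general structured matrices, the SVD truncation error in the rearranged matrix equals the Frobenius-norm approximation error of the Kronecker sum.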

Joint work with Misha Kilmer (Tufts University)



A link for this talk will be sent to our mailing list a day or two in advance.  If you are not on the list and wish to be sent a link, please contact

  • Computational Mathematics and Applications Seminar