Date: Thu, 19 Jan 2023
Time: 14:00 - 15:00
Location: L3
Speaker: Misha Kilmer
Organisation: Tufts University

Tensors, also known as multiway arrays, have become ubiquitous as representations for operators or as convenient schemes for storing data. Yet, when it comes to compressing these objects or analyzing the data stored in them, the tendency is to "flatten" or "matricize" the data and employ traditional linear algebraic tools, ignoring higher-dimensional correlations and structure that could have been exploited. Impediments to the development of equivalent tensor-based approaches stem from the fact that familiar concepts, such as rank and orthogonal decomposition, have no straightforward analogues and/or lead to intractable computational problems for tensors of order three and higher.
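To make the "matricize" step concrete, here is a minimal NumPy sketch (illustrative only, not material from the talk; the array names and the unfolding convention are our own choices). The tensor is reshaped into a matrix so that classical tools such as the SVD apply, at the cost of discarding the cross-mode structure the abstract refers to:

```python
import numpy as np

# A 3 x 4 x 5 third-order tensor.
X = np.random.rand(3, 4, 5)

# Mode-1 unfolding ("matricization"): the second and third modes are merged,
# so X is treated as an ordinary 3 x 20 matrix. One common convention; other
# unfoldings permute the modes first.
X1 = X.reshape(3, -1)

# Classical matrix tools now apply, but cross-mode correlations are lost.
U, s, Vt = np.linalg.svd(X1, full_matrices=False)
```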

In this talk, we will review some of the common tensor decompositions and discuss their theoretical and practical limitations. We then discuss a family of tensor algebras based on a new definition of tensor-tensor products. Unlike other tensor approaches, the framework we build around this tensor-tensor product allows us to elegantly generalize all classical algorithms from linear algebra. Furthermore, under our framework, tensors can be decomposed in a natural (i.e. "matrix-mimetic") way, with provable approximation properties and provable benefits over traditional matrix approximation. In addition to several examples from recent literature illustrating the advantages of our tensor-tensor product framework in practice, we highlight interesting open questions and directions for future research.
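For readers unfamiliar with tensor-tensor products, the best-known member of this family is the t-product of Kilmer and Martin, in which two third-order tensors are multiplied by taking an FFT along the third mode, multiplying the frontal slices as ordinary matrices, and inverting the transform. A minimal NumPy sketch (illustrative; the function name and example shapes are our own):

```python
import numpy as np

def t_product(A, B):
    """t-product of third-order tensors: A is n1 x n2 x n3, B is n2 x m x n3.

    Transform along the third mode with the FFT, multiply the frontal
    slices as matrices, then invert the transform.
    """
    n1, n2, n3 = A.shape
    assert B.shape[0] == n2 and B.shape[2] == n3
    A_hat = np.fft.fft(A, axis=2)
    B_hat = np.fft.fft(B, axis=2)
    C_hat = np.empty((n1, B.shape[1], n3), dtype=complex)
    for k in range(n3):
        C_hat[:, :, k] = A_hat[:, :, k] @ B_hat[:, :, k]
    # For real inputs the result is real up to floating-point error.
    return np.real(np.fft.ifft(C_hat, axis=2))

# Example: a 4 x 3 x 5 tensor times a 3 x 2 x 5 tensor gives 4 x 2 x 5.
A = np.random.rand(4, 3, 5)
B = np.random.rand(3, 2, 5)
C = t_product(A, B)
print(C.shape)  # (4, 2, 5)
```

The FFT diagonalizes the block-circulant structure underlying this product, which is what lets matrix algorithms (SVD, QR, eigendecomposition) carry over slice by slice; replacing the FFT with other invertible transforms yields the broader family of tensor algebras mentioned in the abstract.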
