Date: Thu, 21 Jun 2001
Time: 14:00 - 15:00
Location: Comlab
Speaker: Prof Gilbert Strang
Organisation: MIT

Tridiagonal matrices, three-term recurrences, and second-order equations appear amazingly often, throughout all of mathematics. We won't try to review this subject. Instead we look in two less familiar directions.

Here is a tridiagonal matrix problem that waited surprisingly long for a solution. Forward elimination factors T into LDU, with the pivots in D as usual. Backward elimination, from row n to row 1, factors T into U_ D_ L_, with the backward pivots in D_. Parlett asked for a proof that diag(D) + diag(D_) = diag(T) + diag(T^-1).^-1, where .^-1 takes the reciprocal of each diagonal entry of T^-1; entry by entry, the i-th forward pivot plus the i-th backward pivot equals T_ii + 1/(T^-1)_ii. In an excellent paper (Lin Alg Appl 1997) Dhillon and Parlett extended this four-diagonal identity to block tridiagonal matrices, and also applied it to their "Holy Grail" algorithm for the eigenproblem. I would like to make a different connection, to the Kalman filter.
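
The identity is easy to check numerically. Below is a minimal NumPy sketch (the helper names forward_pivots and backward_pivots and the random test matrix are illustrative, not from the paper): it computes the forward and backward pivots of a tridiagonal T and compares diag(D) + diag(D_) with diag(T) plus the entrywise reciprocal of diag(T^-1).

```python
import numpy as np

def forward_pivots(T):
    """Pivots d_i from forward elimination, T = L D U, for tridiagonal T."""
    n = T.shape[0]
    d = np.empty(n)
    d[0] = T[0, 0]
    for i in range(1, n):
        d[i] = T[i, i] - T[i, i - 1] * T[i - 1, i] / d[i - 1]
    return d

def backward_pivots(T):
    """Pivots from backward elimination, row n down to row 1, T = U_ D_ L_."""
    n = T.shape[0]
    e = np.empty(n)
    e[n - 1] = T[n - 1, n - 1]
    for i in range(n - 2, -1, -1):
        e[i] = T[i, i] - T[i, i + 1] * T[i + 1, i] / e[i + 1]
    return e

# A diagonally dominant tridiagonal test matrix, so no pivot vanishes.
rng = np.random.default_rng(1)
n = 8
T = (np.diag(rng.uniform(4.0, 5.0, n))
     + np.diag(rng.uniform(-1.0, 1.0, n - 1), -1)
     + np.diag(rng.uniform(-1.0, 1.0, n - 1), 1))

lhs = forward_pivots(T) + backward_pivots(T)          # diag(D) + diag(D_)
rhs = np.diag(T) + 1.0 / np.diag(np.linalg.inv(T))    # diag(T) + diag(T^-1).^-1
print(np.allclose(lhs, rhs))                          # expect True
```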

The second topic is a generalization of tridiagonal to "tree-diagonal". Unlike the interval, the tree can branch. In the matrix T, each vertex is connected only to its neighbors (but a branch point has more than two neighbors). The continuous analogue is a second order differential equation on a tree. The "non-jump" conditions at a meeting of N edges are continuity of the potential (N-1 equations) and Kirchhoff's Current Law (1 equation). Several important properties of tridiagonal matrices, including O(N) algorithms, survive on trees.
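
As one way to see how the O(N) tridiagonal algorithms carry over, here is a sketch of Gaussian elimination on a tree, ordered from the leaves toward a chosen root so that no fill-in appears outside the tree's edges. The storage scheme (parent pointers, with one off-diagonal entry per edge) and the name solve_tree are assumptions made for this example, and the matrix is taken symmetric for simplicity.

```python
import numpy as np

def solve_tree(parent, diag, off, f):
    """
    Solve T x = f for a symmetric 'tree-diagonal' matrix T:
        T[i, i] = diag[i]
        T[i, parent[i]] = T[parent[i], i] = off[i]    (parent[root] = -1)
    and every other entry is zero.  Eliminating from the leaves toward the
    root creates no fill-in outside the tree's edges, so the work is O(N).
    """
    n = len(diag)
    d = np.asarray(diag, dtype=float).copy()
    b = np.asarray(f, dtype=float).copy()

    # Depth of every vertex (a simple pass, fine for a sketch), so that
    # children are always eliminated before their parents.
    depth = [0] * n
    for v in range(n):
        u = v
        while parent[u] >= 0:
            u = parent[u]
            depth[v] += 1
    order = sorted(range(n), key=lambda v: -depth[v])   # leaves first, root last

    # Elimination: fold each vertex into its parent's diagonal and right side.
    for v in order:
        p = parent[v]
        if p >= 0:
            w = off[v] / d[v]
            d[p] -= w * off[v]
            b[p] -= w * b[v]

    # Back-substitution from the root down to the leaves.
    x = np.empty(n)
    for v in reversed(order):
        p = parent[v]
        x[v] = (b[v] - (off[v] * x[p] if p >= 0 else 0.0)) / d[v]
    return x

# A small branching tree: 0 is the root, 1 and 2 hang off 0, and 3, 4 hang off 1.
parent = [-1, 0, 0, 1, 1]
diag   = [4.0, 4.0, 4.0, 4.0, 4.0]
off    = [0.0, -1.0, -1.0, -1.0, -1.0]    # off[root] is unused
f      = [1.0, 2.0, 3.0, 4.0, 5.0]

T = np.diag(diag)
for v in range(1, 5):
    T[v, parent[v]] = T[parent[v], v] = off[v]
print(np.allclose(solve_tree(parent, diag, off, f), np.linalg.solve(T, f)))  # expect True
```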
