Tue, 21 May 2019

14:00 - 14:30
L5

Time-Varying Matrix Problems and Zhang Neural Networks

Frank Uhlig
(Auburn)
Abstract

We adapt convergent look-ahead and backward finite difference formulas to compute future eigenvectors and eigenvalues of piecewise smooth time-varying matrix flows $A(t)$. This is based on the Zhang Neural Network model for time-varying problems and uses the associated error function

$E(t) = A(t)V(t) - V(t)D(t)$

with the Zhang design stipulation

$\dot{E}(t) = -\eta E(t)$.
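To see where the linear differential system quoted below comes from, here is a sketch for a single eigenpair, not taken from the abstract: write the error for one eigenvector $v(t)$ with eigenvalue $\lambda(t)$ as $e(t) = A(t)v(t) - \lambda(t)v(t)$ and assume, for illustration, a unit-norm normalization $v(t)^{T}v(t) = 1$ (the talk's augmentation may differ). Differentiating $e(t)$ and imposing $\dot{e}(t) = -\eta\, e(t)$ gives

$(A(t) - \lambda(t) I)\,\dot{v}(t) - \dot{\lambda}(t)\,v(t) = -\dot{A}(t)\,v(t) - \eta\,\bigl(A(t)v(t) - \lambda(t)v(t)\bigr),$

$v(t)^{T}\dot{v}(t) = 0,$

which together form a square linear system for the derivative of the stacked eigendata vector $z(t) = [\,v(t);\ \lambda(t)\,]$.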

Here $E(t)$ decreases exponentially over time for $\eta > 0$. The stipulation leads to a discrete-time differential equation of the form $P(t_k)\dot{z}(t_k) = q(t_k)$ for the eigendata vector $z(t_k)$ of $A(t_k)$. Convergent high order look-ahead difference formulas then allow us to express $z(t_{k+1})$ in terms of earlier discrete $A$ and $z$ data. Numerical tests, comparisons and open questions follow.
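As a purely illustrative companion, the following Python sketch tracks one eigenpair of a small symmetric flow using the simplest possible time-stepping, forward Euler ($z_{k+1} = z_k + h\,\dot{z}_k$), rather than the convergent high-order look-ahead formulas of the talk. The matrix flow `A(t)`, the step size `h`, the decay constant `eta`, and the central-difference approximation of $\dot{A}(t)$ are all assumptions made for this example only.

```python
# Minimal ZNN-style eigenpair tracker (illustrative sketch, not the talk's
# high-order look-ahead method). Assumptions: the 2x2 symmetric flow A(t)
# below, eta = 10, step size h = 1e-3, Adot approximated by a central
# finite difference.

import numpy as np

def A(t):
    # Hypothetical smooth symmetric matrix flow used only for illustration.
    return np.array([[2.0 + np.sin(t), 0.5 * np.cos(t)],
                     [0.5 * np.cos(t), 1.0 - np.sin(t)]])

def znn_step(t, v, lam, h, eta):
    """One forward-Euler step of the ZNN system P(t) zdot = q(t), z = [v; lam]."""
    n = v.size
    At = A(t)
    Adot = (A(t + h) - A(t - h)) / (2.0 * h)          # central-difference Adot
    # Augmented system: eigen-equation rows plus a unit-norm normalization row.
    P = np.zeros((n + 1, n + 1))
    P[:n, :n] = At - lam * np.eye(n)
    P[:n, n] = -v
    P[n, :n] = v
    q = np.zeros(n + 1)
    q[:n] = -Adot @ v - eta * (At @ v - lam * v)      # Zhang design: edot = -eta * e
    zdot = np.linalg.solve(P, q)
    v_new = v + h * zdot[:n]
    lam_new = lam + h * zdot[n]
    # Renormalize the eigenvector for numerical stability.
    return v_new / np.linalg.norm(v_new), lam_new

# Usage: start from the exact eigenpair at t = 0 and march forward in time.
h, eta = 1e-3, 10.0
w, V = np.linalg.eigh(A(0.0))
v, lam = V[:, 0], w[0]
for k in range(2000):
    v, lam = znn_step(k * h, v, lam, h, eta)
t_end = 2000 * h
print("tracked lambda:", lam, " true lambda:", np.linalg.eigh(A(t_end))[0][0])
```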
