23 April 2009

14:00

How sharp is the restricted isometry property? An investigation into sparse approximation techniques

Dr. Coralia Cartis

Abstract

A simple, yet efficient, model for data acquisition is to measure signals
$x$ linearly through the action of a measurement matrix $A$ of size
$n\times N$, namely $b = Ax$. Surprisingly, as few as $n<N$ measurements can
suffice to capture $x$. Although an underdetermined system of equations
ensues, practitioners prefer a parsimonious representation of the signal,
namely the sparsest vector, with say $k$ nonzero entries, that (exactly
or approximately) matches the observed data; however, finding such an $x$
generally requires an expensive combinatorial search.
Fortunately, when a truly sparse vector matches the data, it has been shown that
the sparsest solution can indeed be found by solving various convex or nonconvex optimization
relaxations or by employing greedy algorithms, provided $A$ satisfies certain conditions.
One such type of requirement on $A$ is the Restricted Isometry Property (RIP),
proposed by Candes, Romberg \& Tao (2004), which assumes that all the singular values of
all $n\times k$ submatrices of $A$ are ``close'' to one. In this talk, we consider an important
case study in the area, namely when $A$ has iid Gaussian entries and the $(k,n,N)$ dimensions
are growing, and develop the sharpest-known bounds on the extreme singular values of
the submatrices of $A$. These provide us with conditions on $(k,n,N)$ that ensure
$A$ satisfies the RIP, and hence several optimization relaxation techniques and greedy algorithms
are guaranteed to recover the sparsest solution of $Ax=b$, for any $b$ generated by a sparse vector.
In particular, the requirements on $(k,n,N)$ are all of the form $k\sim n$, so that the number of
measurements required grows proportionally with the sparsity level of the solution; the constant
of proportionality is specific to the particular method of recovery and hence gives an indication
of the relative efficiency of each technique.
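As a rough illustration of the greedy recovery mentioned above (a generic orthogonal matching pursuit sketch, not the specific algorithms analysed in the talk), the following assumes illustrative dimensions $(k,n,N)=(5,100,256)$ and an iid Gaussian measurement matrix with columns scaled by $1/\sqrt{n}$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, N, k = 100, 256, 5                       # illustrative dimensions, n < N
A = rng.standard_normal((n, N)) / np.sqrt(n)  # iid Gaussian measurement matrix
x = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x[support] = rng.standard_normal(k)          # k-sparse ground truth
b = A @ x                                    # n underdetermined measurements

def omp(A, b, k):
    """Orthogonal matching pursuit: greedily add the column most
    correlated with the residual, then re-fit by least squares."""
    N = A.shape[1]
    S, r = [], b.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))
        if j not in S:
            S.append(j)
        coef, *_ = np.linalg.lstsq(A[:, S], b, rcond=None)
        r = b - A[:, S] @ coef
    xhat = np.zeros(N)
    xhat[S] = coef
    return xhat

xhat = omp(A, b, k)
err = np.linalg.norm(xhat - x)
```

With these dimensions the greedy method typically recovers the sparse vector exactly (up to numerical precision), consistent with the kind of guarantee the RIP provides.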
This is joint work with J. Blanchard (U of Utah), J. Tanner and A. Thompson (U of Edinburgh).
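To get an empirical feel for the RIP condition discussed above, one can sample random $n\times k$ column-submatrices of a Gaussian $A$ and observe that their extreme singular values concentrate near one. This is a simple Monte Carlo sketch with illustrative dimensions, not the asymptotic bounds derived in the talk:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N, k, trials = 100, 400, 10, 200
A = rng.standard_normal((n, N)) / np.sqrt(n)  # columns have unit norm in expectation

smin, smax = np.inf, 0.0
for _ in range(trials):
    S = rng.choice(N, size=k, replace=False)   # pick k columns at random
    s = np.linalg.svd(A[:, S], compute_uv=False)
    smin = min(smin, s[-1])                    # smallest singular value seen
    smax = max(smax, s[0])                     # largest singular value seen

# For k/n = 0.1 the singular values cluster roughly in
# [1 - sqrt(k/n), 1 + sqrt(k/n)], i.e. "close" to one.
```

Of course, the RIP requires control over *all* $\binom{N}{k}$ submatrices simultaneously, which is what makes sharp bounds of the kind presented in the talk nontrivial; random sampling only probes typical submatrices.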