In Bayesian inverse problems, the prior and the noise model typically depend on several hyperparameters that must be estimated from the data. In particular, we are interested in linear inverse problems with additive Gaussian noise and Gaussian priors defined using Matérn covariance models. In this setting, we estimate the hyperparameters using the maximum a posteriori (MAP) estimate of the marginalized posterior distribution.
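To make the objective concrete, here is one standard way such a marginalized MAP problem can be written; the symbols $\mathbf{A}$, $\mathbf{R}(\boldsymbol{\theta})$, and $\mathbf{Q}(\boldsymbol{\theta})$ are illustrative notation of ours, not fixed by this work. For a linear model $\mathbf{d} = \mathbf{A}\mathbf{s} + \boldsymbol{\varepsilon}$ with noise $\boldsymbol{\varepsilon} \sim \mathcal{N}(\mathbf{0}, \mathbf{R}(\boldsymbol{\theta}))$ and prior $\mathbf{s} \sim \mathcal{N}(\boldsymbol{\mu}, \mathbf{Q}(\boldsymbol{\theta}))$, marginalizing over $\mathbf{s}$ gives $\mathbf{d} \mid \boldsymbol{\theta} \sim \mathcal{N}(\mathbf{A}\boldsymbol{\mu}, \boldsymbol{\Psi}(\boldsymbol{\theta}))$ with $\boldsymbol{\Psi} = \mathbf{A}\mathbf{Q}\mathbf{A}^\top + \mathbf{R}$, so the negative log marginal posterior to be minimized is
\[
\mathcal{F}(\boldsymbol{\theta}) = -\log p(\boldsymbol{\theta}) + \tfrac{1}{2}\log\det \boldsymbol{\Psi}(\boldsymbol{\theta}) + \tfrac{1}{2}\,(\mathbf{d}-\mathbf{A}\boldsymbol{\mu})^\top \boldsymbol{\Psi}(\boldsymbol{\theta})^{-1}(\mathbf{d}-\mathbf{A}\boldsymbol{\mu}),
\]
which makes explicit the log-determinant term that drives the computational cost discussed next.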
However, this is a computationally intensive task, since each objective function evaluation involves computing the log-determinant of a large covariance matrix. To address this challenge, we consider a sample average approximation (SAA) of the objective function and use the preconditioned Lanczos method to compute efficient approximations of the function evaluations.
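As a rough illustration of the SAA idea (a minimal sketch under our own assumptions: the function names, probe counts, and the plain, unpreconditioned Lanczos loop below are ours, and the preconditioning central to the actual method is omitted), one can use $\log\det\boldsymbol{\Psi} = \operatorname{tr}(\log\boldsymbol{\Psi}) = \mathbb{E}[\mathbf{z}^\top \log(\boldsymbol{\Psi})\,\mathbf{z}]$ for Rademacher probes $\mathbf{z}$; SAA draws a finite set of probes once and keeps them fixed, so the approximate objective is deterministic in the hyperparameters, and each quadratic form is estimated with Lanczos quadrature:

```python
import numpy as np

def lanczos_logdet_quadrature(matvec, z, m=50):
    """Estimate z^T log(Psi) z for an SPD operator Psi (given as matvec)
    via m Lanczos steps plus Gauss quadrature on the tridiagonal T
    (stochastic Lanczos quadrature)."""
    beta0 = np.linalg.norm(z)
    q, q_prev = z / beta0, np.zeros_like(z)
    alphas, betas = [], []
    for _ in range(m):
        w = matvec(q)
        alpha = q @ w
        w = w - alpha * q - (betas[-1] * q_prev if betas else 0.0)
        alphas.append(alpha)
        beta = np.linalg.norm(w)
        if beta < 1e-10:  # invariant subspace reached; stop early
            break
        betas.append(beta)
        q_prev, q = q, w / beta
    k = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    theta, V = np.linalg.eigh(T)  # Ritz values and vectors of T
    # Gauss quadrature: z^T log(Psi) z ~= ||z||^2 sum_i (e_1^T v_i)^2 log(theta_i)
    return beta0 ** 2 * np.sum(V[0, :] ** 2 * np.log(theta))

def saa_logdet(matvec, n, n_probes=20, m=50, seed=0):
    """SAA estimate of log det(Psi) = E[z^T log(Psi) z]. The Rademacher
    probes are drawn once (fixed seed) and reused for every new value of
    the hyperparameters, making the surrogate objective deterministic."""
    rng = np.random.default_rng(seed)
    Z = rng.choice([-1.0, 1.0], size=(n_probes, n))
    return np.mean([lanczos_logdet_quadrature(matvec, z, m) for z in Z])
```

Fixing the probes is what turns the noisy trace estimator into a deterministic surrogate that a standard optimizer can minimize; in the preconditioned variant, the preconditioner would additionally act inside `matvec` to accelerate Lanczos convergence.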
We can therefore compute the MAP estimate of the hyperparameters efficiently: we build a preconditioner that can be updated cheaply for new values of the hyperparameters, and we leverage numerical linear algebra tools to reuse information efficiently when approximating the gradient evaluations. We demonstrate the performance of our approach on inverse problems arising in tomography.
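For context on the gradient reuse, the following is a standard identity together with one natural stochastic estimator (not necessarily the exact scheme used here): differentiating the log-determinant with respect to the $j$-th hyperparameter gives
\[
\frac{\partial}{\partial \theta_j} \log\det \boldsymbol{\Psi}(\boldsymbol{\theta})
= \operatorname{tr}\!\left(\boldsymbol{\Psi}^{-1}\,\partial_{\theta_j}\boldsymbol{\Psi}\right)
\approx \frac{1}{N}\sum_{i=1}^{N} \mathbf{z}_i^\top \boldsymbol{\Psi}^{-1}\left(\partial_{\theta_j}\boldsymbol{\Psi}\right)\mathbf{z}_i,
\]
so the same fixed probes $\mathbf{z}_i$, and Krylov information generated while evaluating the objective, can be reused to approximate the gradient.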