Mixed-precision explicit stabilized Runge-Kutta methods for single- and multi-scale differential equations

Authors: 

Croci, M
Souza, G

Publication Date: 

24 September 2021

Last Updated: 

2021-11-12T09:54:05.067+00:00

Abstract: 

Mixed-precision algorithms combine low- and high-precision computations to
benefit from the performance gains of reduced precision without sacrificing
accuracy. In this work, we design mixed-precision Runge-Kutta-Chebyshev (RKC)
methods, in which high precision is used for accuracy and low precision for
stability. Generally speaking, RKC methods are low-order explicit schemes whose
stability domain grows quadratically with the number of function evaluations;
consequently, most of the computational effort is spent on stability rather
than on accuracy. In this paper, we show that a naïve mixed-precision
implementation of any Runge-Kutta scheme can harm the convergence order of the
method and limit its accuracy, and we introduce a new class of mixed-precision
RKC schemes that are unaffected by this limiting behaviour. We present three
mixed-precision schemes: a first- and a
second-order RKC method, and a first-order multirate RKC scheme for multiscale
problems. These schemes perform only the few function evaluations needed for
accuracy (one or two for first- and second-order methods, respectively) in high
precision, while the rest are performed in low precision. We prove that while
these methods are essentially as cheap as their fully low-precision equivalents,
they retain the convergence order of their high-precision counterpart. Indeed,
numerical experiments confirm that these schemes are as accurate as the
corresponding high-precision method.
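As a concrete illustration of the idea in the abstract, the following is a minimal sketch (not the authors' exact algorithm) of a first-order Runge-Kutta-Chebyshev sweep in which only the first derivative evaluation of each step is performed in high precision, while the remaining stage evaluations are done in low precision (simulated here by casting to float32). The function name `rkc1_mixed`, the damping parameter `eps`, and the split into `f_hi`/`f_lo` are illustrative assumptions.

```python
import numpy as np

def rkc1_mixed(f_hi, f_lo, y0, t0, t_end, h, s, eps=0.05):
    """Sketch of a first-order RKC step with s stages: one high-precision
    derivative evaluation per step (for accuracy), s-1 low-precision
    evaluations (for stability). f_hi/f_lo are high-/low-precision RHS."""
    # Damped Chebyshev parameters (van der Houwen/Sommeijer style damping)
    w0 = 1.0 + eps / s**2
    # T_j(w0) and T_j'(w0) via the Chebyshev three-term recurrences
    T = np.zeros(s + 1)
    dT = np.zeros(s + 1)
    T[0], T[1] = 1.0, w0
    dT[0], dT[1] = 0.0, 1.0
    for j in range(2, s + 1):
        T[j] = 2.0 * w0 * T[j - 1] - T[j - 2]
        dT[j] = 2.0 * T[j - 1] + 2.0 * w0 * dT[j - 1] - dT[j - 2]
    w1 = T[s] / dT[s]
    b = 1.0 / T  # b_j = 1 / T_j(w0) for the first-order scheme

    y = np.array(y0, dtype=np.float64)
    t = t0
    while t < t_end - 1e-14:
        h_eff = min(h, t_end - t)
        Yjm2 = y
        # The one high-precision evaluation, needed for first-order accuracy
        Yjm1 = y + (b[1] * w1) * h_eff * f_hi(y)
        for j in range(2, s + 1):
            mu = 2.0 * b[j] * w0 / b[j - 1]
            nu = -b[j] / b[j - 2]
            mut = 2.0 * b[j] * w1 / b[j - 1]
            # Remaining evaluations in low precision: stability only
            Yj = (mu * Yjm1 + nu * Yjm2
                  + mut * h_eff * f_lo(Yjm1).astype(np.float64))
            Yjm2, Yjm1 = Yjm1, Yj
        y = Yjm1
        t += h_eff
    return y
```

For example, integrating y' = -y on [0, 1] with h = 0.01 and s = 4 stages, with `f_lo` casting its argument to float32, yields a solution close to e^{-1}; as the abstract notes, the paper's own schemes achieve this without the order reduction that a naïve mixed-precision implementation would incur.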

Symplectic id: 

1201716

Download URL: 

Submitted to ORA: 

Not Submitted

Publication Type: 

Journal Article