For a given positive measure on a fixed domain, $n$-point Gaussian
quadrature rules can be defined via their property of integrating an
arbitrary polynomial of degree $2n-1$ exactly using only $n$ quadrature
nodes. In the special case of Gauss–Jacobi quadrature, this means that
\[
  \int_{-1}^{1} (1-x)^{\alpha} (1+x)^{\beta} f(x)\,dx \;=\; \sum_{k=1}^{n} w_k\, f(x_k)
\]
whenever $f$ is a polynomial of degree at most $2n-1$. When $f$ is not a polynomial, but a
function analytic in a neighbourhood of $[-1,1]$, the above is not an equality
but an approximation that converges exponentially fast as $n$ is increased.
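Both behaviours can be checked numerically. The sketch below uses the Legendre special case ($\alpha = \beta = 0$) and NumPy's built-in `leggauss` rule purely for illustration; it is not the algorithm discussed later in this paper.

```python
import numpy as np

# Exactness check in the Legendre special case (Jacobi weight with
# alpha = beta = 0), using NumPy's built-in Gauss-Legendre rule.
n = 4
x, w = np.polynomial.legendre.leggauss(n)

# An n-point rule is exact for polynomials of degree up to 2n - 1 = 7:
approx = np.sum(w * x**6)   # quadrature approximation of int_{-1}^{1} x^6 dx
exact = 2.0 / 7.0           # the true integral

# For an analytic, non-polynomial integrand the rule is only an
# approximation, but the error decays exponentially as n grows:
errs = []
for m in (2, 4, 6):
    xm, wm = np.polynomial.legendre.leggauss(m)
    errs.append(abs(np.sum(wm * np.exp(xm)) - (np.e - 1.0 / np.e)))
```

Here `approx` matches `exact` to rounding error, while the errors for the analytic integrand $e^x$ drop by several orders of magnitude each time $n$ is increased.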
An undergraduate mathematician taking a numerical
analysis course could tell you that the nodes $x_k$ are roots of the Jacobi
polynomial $P_n^{(\alpha,\beta)}(x)$, the degree-$n$ polynomial orthogonal
with respect to the given weight, and that the quadrature weight $w_k$ at each node
is related to the derivative $\frac{d}{dx}P_n^{(\alpha,\beta)}(x_k)$. However, these
values are not generally known in closed form, and we must compute them
numerically... but how?
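The root/derivative characterisation is easy to verify in the Legendre special case, where the weight formula takes the classical form $w_k = 2 / \big((1-x_k^2)\,[P_n'(x_k)]^2\big)$. A brief NumPy sketch (illustrative only; the Legendre specialisation is our assumption here):

```python
import numpy as np

# Legendre case (alpha = beta = 0): the nodes are the roots of P_n,
# and the weights follow from the derivative of P_n at those roots
# via w_k = 2 / ((1 - x_k^2) * P_n'(x_k)^2).
n = 6
Pn = np.polynomial.legendre.Legendre.basis(n)
x = np.sort(Pn.roots())
w = 2.0 / ((1.0 - x**2) * Pn.deriv()(x)**2)

# Agrees with NumPy's reference nodes and weights:
xg, wg = np.polynomial.legendre.leggauss(n)
```

Computing the nodes as polynomial roots in this naive way is, of course, neither fast nor numerically robust for large $n$; that is precisely the problem addressed below.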
Traditional approaches involve applying the recurrence
relation satisfied by the orthogonal polynomials, or solving the Jacobi matrix
eigenvalue problem in the algorithm of Golub and Welsch, but these methods are
inherently limited by a minimal complexity of $O(n^2)$. The current
state of the art is the algorithm developed by Glaser, Liu, and
Rokhlin, which hops from root to root using a Newton iteration evaluated with a
Taylor series defined by the ODE satisfied by $P_n^{(\alpha,\beta)}(x)$.
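For reference, the Golub–Welsch construction can be sketched in a few lines for the Legendre case: the nodes are the eigenvalues of the symmetric tridiagonal Jacobi matrix built from the three-term recurrence coefficients, and the weights come from the first components of the normalised eigenvectors. (A dense eigensolver is used below for brevity; the classical algorithm exploits the tridiagonal structure to reach $O(n^2)$.)

```python
import numpy as np

def golub_welsch_legendre(n):
    """Golub-Welsch sketch for the Legendre case (alpha = beta = 0)."""
    # Off-diagonal entries of the symmetric tridiagonal Jacobi matrix,
    # from the Legendre three-term recurrence: beta_k = k / sqrt(4k^2 - 1).
    k = np.arange(1.0, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)
    J = np.diag(beta, 1) + np.diag(beta, -1)
    # Eigenvalues are the quadrature nodes; the weight at each node is
    # mu_0 * (first eigenvector component)^2, with mu_0 = int_{-1}^{1} dx = 2.
    nodes, vecs = np.linalg.eigh(J)
    weights = 2.0 * vecs[0, :]**2
    return nodes, weights

x, w = golub_welsch_legendre(10)
```

The resulting nodes and weights agree with `numpy.polynomial.legendre.leggauss(10)` to machine precision.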
We propose an alternative approach, whereby the function
and derivative evaluations required in the Newton iteration are computed
independently in $O(1)$ operations per point using certain well-known asymptotic
expansions. We shall review these expansions, outline the new algorithm, and
demonstrate improvements in both accuracy and efficiency.
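The overall structure of such a Newton-based scheme can be sketched as follows, again in the Legendre special case. Note one deliberate simplification: this sketch evaluates $P_n$ and $P_n'$ by the three-term recurrence, which costs $O(n)$ per point; the method proposed in this paper replaces exactly that step with $O(1)$ asymptotic evaluations. The initial guesses below use a standard asymptotic root estimate, not the expansions reviewed later.

```python
import numpy as np

def eval_legendre(n, x):
    # P_n(x) and P_n'(x) via the three-term recurrence: O(n) per point.
    # (The proposed method replaces this step with O(1) asymptotic
    # expansions; the recurrence is used here only for brevity.)
    p0, p1 = np.ones_like(x), x.copy()
    for m in range(2, n + 1):
        p0, p1 = p1, ((2.0 * m - 1.0) * x * p1 - (m - 1.0) * p0) / m
    dp = n * (x * p1 - p0) / (x**2 - 1.0)
    return p1, dp

def gauss_legendre_newton(n, iters=8):
    # Standard asymptotic initial guesses for the roots of P_n,
    # refined independently at each point by Newton's iteration.
    k = np.arange(1.0, n + 1.0)
    x = np.cos(np.pi * (k - 0.25) / (n + 0.5))
    for _ in range(iters):
        p, dp = eval_legendre(n, x)
        x = x - p / dp
    # Weights from the derivative: w_k = 2 / ((1 - x_k^2) * P_n'(x_k)^2).
    _, dp = eval_legendre(n, x)
    w = 2.0 / ((1.0 - x**2) * dp**2)
    return x[::-1], w[::-1]

x, w = gauss_legendre_newton(20)
```

Because each node is refined independently from its own initial guess, the iteration is embarrassingly parallel; once the recurrence-based evaluation is replaced by $O(1)$ asymptotic evaluations, the total cost becomes $O(n)$.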