Quadratures, parameter and poly

A list of developments that need to be incorporated:

  1. High-order Gauss quadrature rules, to deal with the errors arising from the Golub-Welsch algorithm (see Hale and Townsend 2013).
  2. Incorporation of polynomial root-finding methods: in Parameter.py for the orthogonal polynomials, and in Polynomial.py for the polynomial approximation (a rough sketch of the latter follows this list).
  3. Extensive testing and validation of the gradient methodologies in Polynomial.py and Solver.py.
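
On the second point, a rough sketch of what root finding for a Legendre-series approximation might look like is given below. This is only an illustration under assumptions: the coefficient array is made up, and numpy's Legendre-series root finder (numpy.polynomial.legendre.legroots) stands in for whatever method eventually lands in Polynomial.py.

import numpy as np

# Hypothetical coefficients of a degree-5 approximation in the Legendre basis;
# in practice these would come from a fitted polynomial in Polynomial.py.
coeffs = np.array([0.1, -0.4, 0.25, 0.6, -0.05, 0.3])

# numpy forms the companion (colleague) matrix of the Legendre series and takes its
# eigenvalues, which avoids the ill-conditioning of converting to the monomial basis.
all_roots = np.polynomial.legendre.legroots(coeffs)

# Keep only the numerically real roots inside the standard domain [-1, 1].
real_roots = all_roots[np.abs(all_roots.imag) < 1e-10].real
real_roots = real_roots[(real_roots >= -1.0) & (real_roots <= 1.0)]
print(real_roots)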

Perhaps a little justification for the first point above is in order (pun not intended). Consider the standard problem of integrating $f(x) = \sin(x)$ over $[-1, 1]$ using Gauss-Legendre quadrature. The standard $L_2$ norm error is plotted on the vertical axis, while the order is shown on the horizontal axis in the figure below.

Figure: Error in quadrature
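
For a rough idea of the experiment, a standalone sketch is below. It builds the Legendre Jacobi matrix directly rather than going through Parameter.py, uses numpy's general-purpose eigenvalue solver in the spirit of the original implementation, and exploits the fact that the exact integral of sin(x) over [-1, 1] is zero by symmetry, so the error is simply the magnitude of the quadrature sum. It is not meant to reproduce the figure exactly; the actual script lives on EQ-live.

import numpy as np

def golub_welsch_legendre(n):
    # Jacobi matrix for Legendre polynomials on [-1, 1]: zero diagonal,
    # off-diagonal entries k / sqrt(4 k^2 - 1) for k = 1, ..., n - 1.
    k = np.arange(1, n)
    beta = k / np.sqrt(4.0 * k**2 - 1.0)
    J = np.diag(beta, -1) + np.diag(beta, 1)
    # Golub-Welsch: the nodes are the eigenvalues of J; each weight is
    # mu_0 = 2 times the squared first component of the matching unit eigenvector.
    D, V = np.linalg.eig(J)
    nodes = np.real(D)
    weights = 2.0 * np.real(V[0, :])**2
    return nodes, weights

# Quadrature error for f(x) = sin(x) on [-1, 1] at a few orders.
for n in (5, 10, 20, 40, 80):
    x, w = golub_welsch_legendre(n)
    print(n, abs(np.dot(w, np.sin(x))))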

Part of the issue here is down to my original sloppy implementation of the Golub-Welsch algorithm, which used numpy's eigenvalue solver. If we replace this with scipy's symmetric eigensolver (scipy.linalg.eigh) we observe better results.

Figure: Error in quadrature again
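
To make that comparison concrete outside of Parameter.py, the small sketch below runs both eigensolvers on the same Legendre Jacobi matrix and compares the resulting nodes against numpy's leggauss nodes, which are used purely as a convenient reference; the exact gap between the two solvers will depend on the platform and library versions.

import numpy as np
from scipy import linalg

n = 64
k = np.arange(1, n)
beta = k / np.sqrt(4.0 * k**2 - 1.0)
J = np.diag(beta, -1) + np.diag(beta, 1)        # Legendre Jacobi matrix, as before

x_np = np.sort(np.real(np.linalg.eig(J)[0]))    # general-purpose solver
x_sc = linalg.eigh(J)[0]                        # symmetric solver (eigenvalues returned sorted)
x_ref, _ = np.polynomial.legendre.leggauss(n)   # reference nodes

print('numpy eig  :', np.max(np.abs(x_np - x_ref)))
print('scipy eigh :', np.max(np.abs(x_sc - x_ref)))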

The amended lines of code in Parameter.py are given below; they assume the module-level imports of numpy as np and scipy as sc (with scipy.linalg available):

def get_jacobi_eigenvectors(self, order=None):
    """
    Computes the eigenvectors of the Jacobi matrix.

    :param Parameter self:
        An instance of the Parameter object.
    :param int order:
        Order of the recurrence coefficients.
    """
    if order is None:
        order = self.order + 1
    JacobiMat = self.get_jacobi_matrix(order)
    if order == 1:
        # Trivial 1 x 1 case: the single (normalised) eigenvector is just 1.
        eigVecs = np.asarray([[1.0]])
    else:
        # Use scipy's symmetric eigensolver on the symmetric, tridiagonal Jacobi matrix.
        D, V = sc.linalg.eigh(JacobiMat)
        idx = D.argsort()[::-1]
        eigVecs = V[:, idx]
    return eigVecs

def get_local_quadrature(self, order=None, ab=None):
    if order is None:
        order = self.order + 1
    else:
        order = order + 1

    if ab is None:
        # Get the recurrence coefficients & the jacobi matrix
        JacobiMat = self.get_jacobi_matrix(order)
        ab = self.get_recurrence_coefficients(order+1)
    else:
        ab = ab[0:order+1,:]
        JacobiMat = self.get_jacobi_matrix(order, ab)
    # If statement to handle the case where order = 1
    if order == 1:
        # Check to see whether upper and lower bound are defined:
        if not self.lower or not self.upper:
            p = np.asarray(self.distribution.mean).reshape((1,1))
        else:
            p = np.asarray((self.upper - self.lower)/(2.0) + self.lower).reshape((1,1))
        w = [1.0]
    else:
        D, V = sc.linalg.eigh(JacobiMat)
        idx = D.argsort()[::-1]
        eigs = D[idx]
        eigVecs = V[:, idx]

        w = np.zeros(int(order)) # create space for weights
        p = np.ones((int(order), 1)) # create space for points
        for u in range(0, len(idx)):
            # Golub-Welsch: each weight is beta_0 (= ab[0,1]) times the squared first
            # component of the eigenvector paired with the u-th sorted eigenvalue.
            w[u] = float(ab[0,1]) * (eigVecs[0, u]**2)
            p[u, 0] = eigs[u]
    return p, w

But this too is far from perfect! However, as pointed out by Hale and Townsend (2013), the only way to improve accuracy at higher orders is to address the instability that arises when using the standard Golub-Welsch algorithm.
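
For completeness, one well-known way to sidestep the eigenvalue decomposition altogether is to compute the nodes by Newton iteration on the Legendre three-term recurrence and obtain the weights from the derivative identity. Hale and Townsend go much further, using asymptotic formulae for the initial guesses and evaluations, but even the plain Newton version sketched below (an illustration, not a proposal for Parameter.py as it stands) is a useful cross-check at high orders.

import numpy as np

def gauss_legendre_newton(n, tol=1e-14, max_iter=100):
    # Chebyshev-like initial guesses for the n roots of the Legendre polynomial P_n.
    k = np.arange(1, n + 1)
    x = np.cos(np.pi * (k - 0.25) / (n + 0.5))
    for _ in range(max_iter):
        # Evaluate P_n and P_{n-1} at x via the three-term recurrence.
        p_prev, p_curr = np.ones_like(x), x.copy()
        for j in range(2, n + 1):
            p_prev, p_curr = p_curr, ((2 * j - 1) * x * p_curr - (j - 1) * p_prev) / j
        # Derivative identity: P_n'(x) = n (x P_n(x) - P_{n-1}(x)) / (x^2 - 1).
        dp = n * (x * p_curr - p_prev) / (x**2 - 1.0)
        dx = p_curr / dp
        x = x - dx
        if np.max(np.abs(dx)) < tol:
            break
    # Weights: w_i = 2 / ((1 - x_i^2) * P_n'(x_i)^2).
    w = 2.0 / ((1.0 - x**2) * dp**2)
    return x[::-1], w[::-1]   # return nodes in ascending order

# Example: a 100-point rule applied to exp(x) on [-1, 1]; the exact value is e - 1/e.
x, w = gauss_legendre_newton(100)
print(abs(np.dot(w, np.exp(x)) - (np.e - 1.0 / np.e)))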

The code to generate the first figure above can be found on EQ-live. The new feature pull request can be found here:

As part of Google Summer of Code 2021, it would be good to revive this thread and also verify some of the other quadrature rules we have.
