Polynomial root finding - Companion/Colleague/Comrade Matrices

Introduction

Finding the roots of a polynomial is extremely useful from an optimisation standpoint: the minima and maxima of a polynomial can be found easily by computing the roots of its derivative. One of the more efficient root-finding approaches builds a matrix representation of the polynomial and computes the matrix's eigenvalues. The polynomial basis determines the type of matrix generated.
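As a quick sketch of the optimisation use case (the example polynomial is my own; NumPy's `roots()` itself works by taking companion-matrix eigenvalues):

```python
import numpy as np

# Illustrative example: the critical points of p(x) = x^3 - 3x + 1
# are the roots of its derivative p'(x) = 3x^2 - 3.
p = np.polynomial.Polynomial([1.0, -3.0, 0.0, 1.0])  # coefficients, lowest degree first
critical_points = np.sort(p.deriv().roots())
print(critical_points)  # ≈ [-1.  1.]
```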

Simply put:

  1. For a standard polynomial in the monomial basis, the eigenvalues of the companion matrix provide the roots.
  2. For the Chebyshev basis, the colleague matrix is used.
  3. For orthogonal polynomial bases in general, the eigenvalues of a comrade matrix yield the roots.

1. Companion Matrix

For a standard polynomial defined as a linear combination of monomials, the roots can be determined using the well-known approach of evaluating the eigenvalues of its companion matrix, which is built from the polynomial coefficients and an identity block [1, 3].

If we define a standard polynomial, $p(x)$ as described above,

$
p(x) = x^n + a_{n-1}x^{n-1} + a_{n-2}x^{n-2} +\dots + a_{1}x + a_0,
$

its companion matrix is defined by

$C =
\left(
\begin{array}{cc}
0 & -a_0 \\
I_{n-1} & -\mathbf{c} \\
\end{array} \right)_{n \times n} =
\left(
\begin{array}{ccccc}
0 & 0 & \dots & 0 & -a_0 \\
1 & 0 & \dots & 0 & -a_1 \\
0 & 1 & \dots & 0 & -a_2 \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
0 & 0 & \dots & 1 & -a_{n-1} \\
\end{array} \right)_{n \times n}
$

where $\mathbf{c} = (a_1, a_2, \dots, a_{n-1})$.
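As a minimal NumPy sketch of this construction (the function name and test coefficients are my own), checked against a polynomial with known roots:

```python
import numpy as np

def companion(a):
    """Companion matrix of the monic polynomial
    x^n + a[n-1] x^(n-1) + ... + a[1] x + a[0],
    where a holds the n lower-order coefficients a_0, ..., a_{n-1}."""
    n = len(a)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)   # subdiagonal identity block I_{n-1}
    C[:, -1] = -np.asarray(a)    # last column: -a_0, ..., -a_{n-1}
    return C

# p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3)
roots = np.sort(np.linalg.eigvals(companion([-6.0, 11.0, -6.0])))
print(roots)  # ≈ [1. 2. 3.]
```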

2. Colleague Matrix

When a polynomial is expressed in the Chebyshev basis, the roots can be obtained via the eigenvalues of the colleague matrix, which is the sum of a tridiagonal matrix and a rank-one correction matrix built from the polynomial coefficients [2, 3].

For a polynomial defined by

$
p(x) = \sum_{k=0}^{n} a_{k} T_{k}(x), \quad a_n \ne 0,
$

the roots are the eigenvalues of:

$C =
\left(
\begin{array}{cccccc}
0 & 1 & & & & \\
\frac{1}{2} & 0 & \frac{1}{2} & & & \\
& \frac{1}{2} & 0 & \frac{1}{2} & & \\
& & \ddots & \ddots & \ddots & \\
& & & \frac{1}{2} & 0 & \frac{1}{2} \\
& & & & \frac{1}{2} & 0 \\
\end{array} \right)_{n\times n} -
\frac{1}{2a_n}
\left(
\begin{array}{ccccc}
& & & & \\
& & & & \\
& & & & \\
& & & & \\
a_0 & a_1 & a_2 & \dots & a_{n-1} \\
\end{array} \right)_{n\times n}
$

Here $C$ is made up of two parts: the first relates to the recurrence coefficients of the Chebyshev polynomials, and the second to the coefficients of the polynomial.
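A direct NumPy sketch of this construction (the helper name and test polynomial are my own; degree $n \ge 2$ is assumed), cross-checked against `numpy.polynomial.chebyshev.chebroots`:

```python
import numpy as np

def colleague(a):
    """Colleague matrix of p(x) = sum_k a[k] T_k(x), with a[-1] != 0.
    Assumes degree n = len(a) - 1 >= 2; eigenvalues are the roots of p."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    C = np.zeros((n, n))
    C[0, 1] = 1.0                      # x T_0 = T_1
    for k in range(1, n):
        C[k, k - 1] = 0.5              # x T_k = (T_{k-1} + T_{k+1}) / 2
        if k + 1 < n:
            C[k, k + 1] = 0.5
    C[-1, :] -= a[:n] / (2.0 * a[n])   # rank-one correction in the last row
    return C

# p = T_3 - T_1 = 4x^3 - 4x, with roots -1, 0, 1
roots = np.sort(np.linalg.eigvals(colleague([0.0, -1.0, 0.0, 1.0])))
print(roots)  # ≈ [-1.  0.  1.]
```

NumPy itself ships `numpy.polynomial.chebyshev.chebcompanion`, which builds a symmetrized variant of this matrix with better numerical behaviour.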

3. Comrade Matrix

A more generalised form of the colleague matrix, applicable to other families of orthogonal polynomials, is known as a comrade matrix. Here the first (tridiagonal) part is built from the three-term recurrence relation that the orthogonal polynomial family satisfies.
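As an illustrative sketch for the Legendre basis (helper name and example are my own; NumPy's `legroots` serves only as a cross-check), using the recurrence $x P_k = \frac{k}{2k+1} P_{k-1} + \frac{k+1}{2k+1} P_{k+1}$:

```python
import numpy as np

def comrade_legendre(a):
    """Comrade matrix of p(x) = sum_k a[k] P_k(x) in the Legendre basis.
    Rows encode x P_k = (k/(2k+1)) P_{k-1} + ((k+1)/(2k+1)) P_{k+1};
    P_n is eliminated in the last row using p(x) = 0. Eigenvalues are the roots."""
    a = np.asarray(a, dtype=float)
    n = len(a) - 1
    C = np.zeros((n, n))
    for k in range(n - 1):
        C[k, k + 1] = (k + 1) / (2 * k + 1)
        if k > 0:
            C[k, k - 1] = k / (2 * k + 1)
    if n > 1:
        C[n - 1, n - 2] = (n - 1) / (2 * n - 1)
    # substitute P_n = -(1/a_n) * sum_j a[j] P_j (mod p) into the last row
    C[n - 1, :] -= (n / (2 * n - 1)) * a[:n] / a[n]
    return C

# p = P_2(x) = (3x^2 - 1)/2, with roots +/- 1/sqrt(3)
roots = np.sort(np.linalg.eigvals(comrade_legendre([0.0, 0.0, 1.0])))
print(roots)  # ≈ [-0.57735  0.57735]
```

As with the colleague matrix, NumPy's own `numpy.polynomial.legendre.legcompanion` constructs a symmetrized version of this matrix.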

An initial implementation of root finding for the Chebyshev basis can be found in the following commit:

References

[1] Barnett, S. (1975) "A companion matrix analogue for orthogonal polynomials", Linear Algebra and its Applications, 12(3), pp. 197–202.

[2] Good, I. J. (1961) "The colleague matrix, a Chebyshev analogue of the companion matrix", The Quarterly Journal of Mathematics, 12(1), pp. 61–68.

[3] Trefethen, L. N. (2019) Approximation Theory and Approximation Practice, Extended Edition. Philadelphia, PA: SIAM.
