# Interlacing via rational functions and spectral decomposition

First of all, I’d like to express my sympathies to everyone who is enduring hardships due to COVID-19. Stay well and be strong.

In this previous post, I discussed two important classical results giving examples of polynomials whose roots interlace:

Theorem 1: The roots of a real-rooted polynomial and its derivative interlace.

Theorem 2: (Cauchy’s interlacing theorem) The eigenvalues of an $n \times n$ real symmetric matrix interlace with those of any $(n-1) \times (n-1)$ principal submatrix.

In this post, I’d like to explain a general method, based on partial fraction expansions of rational functions, which gives a unified approach to proving Theorems 1 and 2 and deserves to be better known.

## Interlacing and rational functions

Suppose $p(x) = \prod_{j=1}^d (x - \alpha_j)$ and $q(x) = \prod_{j=1}^{d-1} (x - \beta_j)$ are real-rooted polynomials. We say that the roots of $p(x)$ and $q(x)$ interlace if $\alpha_1 \leq \beta_1 \leq \alpha_2 \leq \cdots \leq \beta_{d-1} \leq \alpha_d,$ and that they strictly interlace if strict inequalities hold throughout.

Theorem 3: Suppose $\gamma > 0$ and $f(x) = \gamma \frac{q(x)}{p(x)}$, where $p(x) = \prod_{j=1}^d (x - \alpha_j)$ and $q(x)$ are real-rooted polynomials of degree $d$ and $d-1$, respectively. Then the roots of $p(x)$ and $q(x)$ interlace (resp. strictly interlace) if and only if $f(x) = \sum_j \frac{\lambda_j}{x - \alpha_j}$ with $\lambda_j \geq 0$ (resp. $\lambda_j > 0$) for all $j$.

Proof (Sketch): We prove the strict case, assuming that the zeros of $p(x)$ are simple; the general case follows by a continuity argument. Let $\sum_j \frac{\lambda_j}{x - \alpha_j}$ be the partial fraction decomposition of $f(x)$, and assume that $\alpha_1 < \alpha_2 < \cdots < \alpha_d$. If each $\lambda_j > 0$, then $f'(x) = -\sum_j \frac{\lambda_j}{(x - \alpha_j)^2} < 0$ for all $x \not\in \{ \alpha_1,\ldots,\alpha_d \}$. Since $\lim_{x \to \alpha_j^+} f(x) = +\infty$ and $\lim_{x \to \alpha_j^-} f(x) = -\infty$ for all $j$, and $\lim_{x \to -\infty} f(x)=\lim_{x \to +\infty} f(x) = 0,$ it follows that $f(x)$ has a unique root in $(\alpha_{j-1},\alpha_j)$ for each $j=2,\ldots,d$ and no other roots. Hence the zeros of $p(x)$ and $q(x)$ strictly interlace. The converse is proved similarly.
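One can check Theorem 3 numerically. The following sketch (the particular roots and weights are my own arbitrary choices, not from the post) builds $f(x) = \sum_j \frac{\lambda_j}{x - \alpha_j}$ with positive weights, clears denominators to recover $q(x)$, and verifies that the roots of $q(x)$ strictly interlace with the $\alpha_j$:

```python
import numpy as np

# Arbitrary example data: simple roots of p(x) and positive weights.
alpha = np.array([0.0, 2.0, 5.0])   # roots of p(x)
lam = np.array([1.0, 2.0, 3.0])     # partial-fraction weights, all > 0

# Clearing denominators in f = sum_j lam_j/(x - alpha_j) gives
# q(x) = sum_j lam_j * prod_{k != j} (x - alpha_k), with f = q/p.
q = sum(l * np.poly(np.delete(alpha, j)) for j, l in enumerate(lam))
beta = np.sort(np.roots(q))

# Theorem 3 predicts alpha_1 < beta_1 < alpha_2 < beta_2 < alpha_3.
interlaces = np.all(alpha[:-1] < beta) and np.all(beta < alpha[1:])
print(interlaces)  # True
```

Here `np.poly` converts a list of roots into polynomial coefficients, so the sum assembles the numerator $q(x)$ directly from the partial fraction data.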

Figure 1: In the left-hand plot, the zeros of p(x) and q(x) interlace. In the right-hand plot, they do not.

## Interlacing of the roots of $p(x)$ and $p'(x)$

Theorem 1, which asserts that the roots of $p(x)$ and its derivative $p'(x)$ interlace, is usually proved using Rolle’s theorem. We establish it here as a consequence of Theorem 3:

Proof of Theorem 1: Let $f(x) = \frac{p'(x)}{p(x)}.$ Then $f(x) = \frac{d}{dx} \log p(x)$, so if $p(x) = \prod_{j=1}^r (x - \alpha_j)^{m_j}$ with each $m_j$ a positive integer, then $f(x) = \sum_{j=1}^r \frac{m_j}{x - \alpha_j}$. By Theorem 3, the zeros of $p(x)$ and $p'(x)$ interlace.

## Spectral decomposition and the functional calculus

If $A$ is an $n \times n$ real symmetric matrix and $\sigma(A)$ is the set of eigenvalues of $A$, then $A$ has the well-known spectral decomposition $A = \sum_{\lambda \in \sigma(A)} \lambda E_\lambda,$ where $E_\lambda$ is the principal idempotent corresponding to $\lambda$, i.e., the orthogonal projection onto the eigenspace of $\lambda$.
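The spectral decomposition is easy to exhibit concretely. In the sketch below (a random symmetric matrix of my own choosing, whose eigenvalues are almost surely simple), the idempotents $E_\lambda$ are built as rank-one projections $vv^T$ from an orthonormal eigenbasis, and we check both $A = \sum_\lambda \lambda E_\lambda$ and $E_\lambda^2 = E_\lambda$:

```python
import numpy as np

# A random symmetric matrix (generic, so eigenvalues are simple).
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2

vals, vecs = np.linalg.eigh(A)        # columns of vecs are orthonormal
# For a simple eigenvalue, E_lambda = v v^T for the unit eigenvector v.
E = [np.outer(v, v) for v in vecs.T]

A_rebuilt = sum(lam * P for lam, P in zip(vals, E))
print(np.allclose(A, A_rebuilt))                  # True
print(all(np.allclose(P @ P, P) for P in E))      # True: idempotent
```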

More generally, we have the following result which is a special case of the so-called holomorphic functional calculus:

Theorem 4: If $f(x) = \frac{q(x)}{p(x)}$ is a rational function which does not have a pole at any eigenvalue of $A$ then $f(A) := \sum_{\lambda \in \sigma(A)} f(\lambda) E_\lambda$ satisfies $f(A) p(A) = q(A)$.

If $f(x)=q(x)$ is a polynomial then Theorem 4 is proved in a number of standard linear algebra texts, and the general case follows by a suitable limiting argument.

In particular, applying Theorem 4 to $f(x) = \frac{1}{c - x}$ we obtain the formula $(cI - A)^{-1} = \sum_\lambda \frac{E_\lambda}{c - \lambda}.$ Since this is true for all $c \not\in \sigma(A)$, we obtain the following identity:

Corollary 5 (Spectral decomposition of the resolvent): $(xI - A)^{-1} = \sum_\lambda \frac{E_\lambda}{x - \lambda}.$
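Corollary 5 can be tested numerically at a single point $c$ outside the spectrum (the matrix and the evaluation point below are arbitrary choices for illustration):

```python
import numpy as np

# Random symmetric matrix and its spectral data.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2
vals, vecs = np.linalg.eigh(A)
E = [np.outer(v, v) for v in vecs.T]

# Evaluate both sides of Corollary 5 at c = 10, safely outside sigma(A).
c = 10.0
lhs = np.linalg.inv(c * np.eye(4) - A)
rhs = sum(P / (c - lam) for lam, P in zip(vals, E))
print(np.allclose(lhs, rhs))  # True
```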

## Cauchy’s interlacing theorem

Let $A$ be an $n \times n$ real symmetric matrix, and let $B$ be the principal submatrix obtained by deleting row $i$ and column $i$ from $A$. Let $\phi_A(x)$ (resp. $\phi_B(x)$) denote the characteristic polynomial of $A$ (resp. $B$). By the cofactor formula for the inverse of a matrix, we have:

Lemma 6: $(xI - A)^{-1}_{ii} = \frac{\phi_B(x)}{\phi_A(x)}.$
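Lemma 6 is also easy to spot-check numerically, evaluating both sides at one point outside both spectra (the matrix, the index $i = 0$, and the evaluation point are arbitrary choices for illustration):

```python
import numpy as np

# Random symmetric A; B deletes row and column i = 0.
rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2
B = np.delete(np.delete(A, 0, axis=0), 0, axis=1)

# Evaluate both sides of Lemma 6 at x = 7, outside both spectra.
x = 7.0
lhs = np.linalg.inv(x * np.eye(4) - A)[0, 0]
phi_A = np.polyval(np.poly(A), x)   # np.poly(A) = char. poly coeffs
phi_B = np.polyval(np.poly(B), x)
print(np.isclose(lhs, phi_B / phi_A))  # True
```

Note that `np.poly`, applied to a square matrix, returns the coefficients of its characteristic polynomial.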

Proof of Theorem 2: By Corollary 5 and Lemma 6, we have

$(xI - A)^{-1}_{ii} = e_i^T (xI - A)^{-1} e_i = \sum_\lambda \frac{e_i^T E_\lambda e_i}{x - \lambda},$

where $e_i$ denotes the $i^{\rm th}$ standard basis vector. Since

$e_i^T E_\lambda e_i = \langle E_\lambda e_i, e_i \rangle = \langle E_\lambda^2 e_i, e_i \rangle = \langle E_\lambda e_i, E_\lambda e_i \rangle = \| E_\lambda e_i \|^2 \geq 0,$

the result now follows from Theorem 3 (applied with $p = \phi_A$ and $q = \phi_B$, of degrees $n$ and $n-1$).
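The conclusion of Theorem 2 can be confirmed numerically (a random symmetric matrix of my own choosing, deleting row and column $i = 0$):

```python
import numpy as np

# Random symmetric A and its principal submatrix B (delete row/col 0).
rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2
B = np.delete(np.delete(A, 0, axis=0), 0, axis=1)

a = np.linalg.eigvalsh(A)  # eigenvalues, sorted ascending
b = np.linalg.eigvalsh(B)

# Cauchy interlacing: a_1 <= b_1 <= a_2 <= ... <= b_{n-1} <= a_n.
print(np.all(a[:-1] <= b) and np.all(b <= a[1:]))  # True
```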

## Concluding remarks

1. I learned Theorem 3 from this paper by McKee and Smyth, and I learned the proof of Theorem 2 that we’ve given here from Section 8 of the book Algebraic Graph Theory by Godsil and Royle.

2. For more details on the holomorphic functional calculus (which can also be used, for example, to define ${\rm ln}(A)$ and $\sqrt{A}$ when $A$ has positive eigenvalues) and the spectral decomposition of the resolvent, see e.g. these notes by Rodica Costin or this Wikipedia page.