Analytic Combinatorics

# Chapter 13: Analytic Combinatorics #

If what one is doing looks out upon the infinite... then one works with more serenity.

— Vincent van Gogh

Put thyself into the trick of singularity.

— William Shakespeare (Twelfth Night)

In our study of C-finite sequences, we saw how properties of their rational generating functions as complex-valued functions (locations and multiplicities of the denominator roots) translated directly into asymptotic information. Using more advanced techniques from complex analysis, these observations can be greatly generalized.

## Radius of Convergence and Exponential Growth #

Let $F(z) = \sum_{n \geq 0} f_nz^n$ be a power series, which we now consider as a function from the complex numbers to the complex numbers, for all values of $z$ where the series converges. In order to characterize the values of $z$ where $F(z)$ converges, we must first introduce some notation.

A subsequence of $(f_n)$ is any sequence $(f_{k_n})$ with $0 \leq k_1 < k_2 < k_3 < \cdots$ (informally, a subsequence is obtained by selecting some terms of $(f_n)$ when running through the sequence). For instance, $(f_n)$ is a subsequence of itself – other subsequences include the terms $f_{2n}$ with even indices, and the terms with prime indices.

The limit superior $\limsup_{n\rightarrow\infty}x_n$ of a sequence $(x_n)$ with nonnegative real terms is the supremum of the limits of its convergent subsequences (where a subsequence that grows without bound is said to have limit $\infty$).

Unlike limits, which may not exist because sequences can oscillate between different values, limit superiors always exist (although they may equal infinity).

Example: If $f_n = 1$ when $n$ is even and $f_n = 2 - 2^{-n}$ when $n$ is odd, then the subsequences of $f_n$ that converge have limits equal to $1$ or $2,$ so the limit superior of $f_n$ is 2.
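As a quick numerical sketch of this example (Python, illustrative only; the function name `f` simply mirrors the sequence above), we can watch the two convergent subsequences approach their limits:

```python
# Sequence from the example: f_n = 1 (n even), f_n = 2 - 2^{-n} (n odd)
def f(n):
    return 1.0 if n % 2 == 0 else 2.0 - 2.0 ** (-n)

evens = [f(n) for n in range(0, 50, 2)]  # even-index subsequence, limit 1
odds = [f(n) for n in range(1, 50, 2)]   # odd-index subsequence, limit 2
print(evens[-1])  # 1.0
print(odds[-1])   # just below 2
# limsup = supremum of all subsequence limits = 2
```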

The following result is usually presented in a first analysis course.

Theorem (Root Test): Let $\displaystyle\rho = \limsup_{n \rightarrow \infty} |f_n|^{1/n}$ and define $$R = \begin{cases} 1/\rho &\text{ if } 0 < \rho < \infty \\[+2mm] \infty &\text{ if } \rho = 0 \\[+2mm] 0 &\text{ if } \rho = \infty \end{cases}$$ Then $F(z)$ converges whenever $|z|<R$ and diverges whenever $|z|>R.$

We call the value $R$ computed with the root test the radius of convergence of $F(z).$ The root test is closely related to the ratio test, which is commonly seen in a first calculus course.

Theorem (Ratio Test): If $\lim_{n \rightarrow\infty} \left|\frac{f_{n+1}}{f_n}\right|$ exists then it equals the value $\rho$ computed in the root test.

Examples: $\bullet$ Since $\lim_{n\rightarrow\infty} (2^n)^{1/n}=2,$ the radius of convergence of $$\frac{1}{1-2z} = \sum_{n \geq 0} 2^nz^n$$ is $1/2.$

$\bullet$ Since $\lim_{n\rightarrow\infty} \frac{(n+1)!}{n!}=\infty,$ the radius of convergence of $$\sum_{n \geq 0} n!z^n$$ is $0.$

$\bullet$ Since $\lim_{n\rightarrow\infty} \frac{1/(n+1)}{1/n}=1,$ the radius of convergence of $$-\log(1-z) = \sum_{n \geq 1} \frac{z^n}{n}$$ is $1.$

$\bullet$ Since $\lim_{n\rightarrow\infty} \frac{1/(n+1)!}{1/n!}=0,$ the radius of convergence of $$e^z = \sum_{n \geq 0} \frac{z^n}{n!}$$ is infinity.
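These four limits are easy to probe numerically. The sketch below (a Python illustration, not part of the formal development) evaluates $|f_n|^{1/n}$ at $n=60$ for each of the sequences above:

```python
import math

# Root-test estimates |f_n|^(1/n) at n = 60 for the four example series
n = 60
est_geom = (2.0 ** n) ** (1.0 / n)                # -> 2,         so R = 1/2
est_fact = math.factorial(n) ** (1.0 / n)         # -> infinity,  so R = 0
est_harm = (1.0 / n) ** (1.0 / n)                 # -> 1,         so R = 1
est_exp = (1.0 / math.factorial(n)) ** (1.0 / n)  # -> 0,         so R = infinity
print(est_geom, est_fact, est_harm, est_exp)
```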

We now restrict to the case of a series $F(z)$ with finite positive radius of convergence $R = 1/\rho.$ In this case we can write $$f_n = \rho^n \cdot \theta(n) + O(\alpha^n)$$ for some function $\theta$ that grows slower than any exponential function and $0 < \alpha < \rho.$

We call $\rho^n$ the exponential growth of $(f_n)$ and $\theta(n)$ the subexponential growth of $(f_n).$ Under our conditions, the exponential growth is the dominant factor in the asymptotic behaviour of $f_n.$
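This decomposition can be seen numerically. A Python sketch with the Catalan numbers (whose generating function appears later in this chapter): $C_n^{1/n} \to 4,$ but the convergence is visibly slow because of the subexponential factor.

```python
import math

def catalan(n):
    # closed form C_n = binom(2n, n) / (n + 1)
    return math.comb(2 * n, n) // (n + 1)

# C_n^(1/n) -> 4; take logs since C_1000 is too large to convert to a float
ests = {n: math.exp(math.log(catalan(n)) / n) for n in (10, 100, 1000)}
print(ests)  # creeps up toward 4; the gap reflects the subexponential factor
```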

## Singularities #

We have now linked the exponential growth of the sequence $(f_n)$ to the radius of convergence of its generating function $F(z).$ In the examples above it was possible to determine the radius of convergence directly from a closed expression for $(f_n),$ but what if such an expression is not known?

The key to analytic combinatorics is that the radius of convergence (and much more) can be deduced directly from the properties of $F(z)$ as a complex-valued function. In fact, we have already seen this in a special case.

Example: If $F(z) = \frac{G(z)}{H(z)}$ is a rational function defined by the ratio of coprime polynomials $G$ and $H$ then the C-finite Coefficient Theorem implies that the exponential growth of $(f_n)$ is $|w|^{-n},$ where $w$ is a root of the denominator $H$ of minimum modulus (equivalently, the exponential growth rate is the maximum value of $|u|^{-1}$ as $u$ ranges over the roots of $H$).
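A concrete rational example of our own choosing (not from the text above): take $F(z) = \frac{1}{(1-2z)(1+3z)},$ whose partial fraction expansion gives $f_n = \frac{2}{5}\,2^n + \frac{3}{5}\,(-3)^n.$ The denominator root of minimum modulus is $z=-1/3,$ so the exponential growth is $3^n$:

```python
import math

def f(n):
    # partial fractions: f_n = (2/5)*2^n + (3/5)*(-3)^n, an integer for all n
    return (2 * 2 ** n + 3 * (-3) ** n) // 5

# the same coefficients satisfy f_n = -f_{n-1} + 6 f_{n-2}, read off from the
# expanded denominator (1 - 2z)(1 + 3z) = 1 + z - 6z^2
assert all(f(n) == -f(n - 1) + 6 * f(n - 2) for n in range(2, 20))

growth = math.exp(math.log(abs(f(50))) / 50)
print(growth)  # close to 3 = |(-1/3)|^{-1}
```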

Generalizing beyond the rational case requires defining the singularities of a complex function. Roughly, singularities are points where “something goes wrong” in a function, such as division by zero, putting zero into a root, or putting zero into a logarithm. A rigorous discussion of singularities requires discussing analytic continuation of complex functions, which we do not get into here.

Examples: $\bullet$ The only singularity of $F(z) = 1/(1-2z)$ is $z=1/2.$

$\bullet$ The singularities of $F(z) = 1/(1+z^2)$ are $z=i$ and $z=-i.$

$\bullet$ The EGF of permutations with no fixed points is $F(z) = e^{-z}/(1-z)$ which has its only singularity at $z=1.$

$\bullet$ The Catalan generating function $C(z) = \frac{1-\sqrt{1-4z}}{2z}$ has a singularity at $z=1/4$ (where there is a zero inside the square-root). Note that $z=0$ might appear to be a singularity due to a division by zero, but the fact that the numerator of $C$ also vanishes at $z=0$ implies that this “apparent singularity” can be “removed”.

The following (somewhat amazing) result follows directly from the Cauchy Integral Formula, a standard result in complex analysis.

Theorem (Cauchy): If $F(z) = \sum_{n \geq 0}f_nz^n$ has a finite positive radius of convergence $R$ then $R$ equals the minimum modulus of the singularities of $F$ over the complex numbers.

Because we deal with generating functions, which have non-negative coefficients, we can be more specific about where to find singularities.

Vivanti-Pringsheim Theorem: If $F(z) = \sum_{n \geq 0}f_nz^n$ has radius of convergence $R$ and $f_n \geq 0$ for all $n$ then the point $z=R$ is a singularity of $F(z)$ of minimal modulus.

Thus, the exponential growth of $(f_n)$ can be deduced immediately from the first positive real singularity of $F(z).$

Examples: $\bullet$ If $F(z) = 1/(1-2z)$ then $f_n$ has exponential growth $2^n.$

$\bullet$ If $F(z) = e^{-z}/(1-z)$ then $f_n$ has exponential growth $1^n = 1.$

$\bullet$ If $F(z) = \frac{1-\sqrt{1-4z}}{2z}$ then $f_n$ has exponential growth $4^n.$ In fact, using our closed formula for the $n$th Catalan number and Stirling’s approximation, it is possible to show that $f_n \sim 4^n/\sqrt{n^3\pi}.$

$\bullet$ Recall that the EGF of surjections is $F(z)=1/(2-e^z).$ Since $z=\log 2$ is the first (and only) positive real singularity of $F$ the exponential growth of $f_n$ is $(\log 2)^{-n} = (1.442\dots)^n.$ Note that $F$ has an infinite number of singularities $\{\log 2 + k(2\pi i) : k \in \mathbb{Z} \}$ in the complex plane.
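This last growth rate can be checked numerically. The sketch below (Python, variable names ours) computes the surjection numbers with the standard recurrence $s_n = \sum_{k=1}^n \binom{n}{k} s_{n-k}$ (choose the $k$-element preimage of the first value, then surject the rest) and compares $(s_n/n!)^{1/n}$ with $1/\log 2$:

```python
import math

# surjection numbers: s_0 = 1, s_n = sum_{k=1}^n C(n,k) s_{n-k}
N = 40
s = [1] + [0] * N
for n in range(1, N + 1):
    s[n] = sum(math.comb(n, k) * s[n - k] for k in range(1, n + 1))

print(s[1], s[2], s[3])  # 1 3 13

# EGF coefficient f_N = s_N / N! has exponential growth (1/log 2)^N
f_N = s[N] / math.factorial(N)
est = math.exp(math.log(f_N) / N)
print(est, 1 / math.log(2))  # both near 1.44
```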

The singularities of $F$ closest to the origin are called its dominant singularities, as they are the ones that dictate the dominant asymptotic behaviour of its coefficient sequence.

## Meromorphic Asymptotics #

Although a general discussion of singularities is out of the scope of these notes, there is one important case where we can be more explicit. We say that a complex function $F(z)$ is analytic at a point $z=a$ if its Taylor series $$F(z) = \sum_{n \geq 0} \frac{F^{(n)}(a)}{n!}(z-a)^n$$ around $z=a$ exists and has a positive radius of convergence. A singularity of $F$ is roughly a point where $F$ is not analytic, however as mentioned above a fully rigorous definition of singularities also requires the notion of analytic continuation.

Examples: $\bullet$ The functions $e^z, \sin(z),$ and $\cos(z),$ and all polynomials, are analytic in the entire complex plane.

$\bullet$ If $p(z)$ is analytic at $z=a$ and $p(a) \neq 0$ then $\sqrt{p(z)},$ $1/p(z),$ and $\log p(z)$ are analytic at $z=a.$

$\bullet$ If $f(z)$ and $g(z)$ are analytic at $z=a$ then $f(z)g(z)$ and $f(z) + g(z)$ are analytic at $z=a.$

$\bullet$ If $g(z)$ is analytic at $z=a$ and $f(z)$ is analytic at $z=g(a)$ then $f(g(z))$ is analytic at $z=a.$

A ratio of analytic functions behaves “like” a rational function near each of its singularities, which means that its coefficient sequence has similar asymptotic behaviour. The following result follows from the Cauchy Residue Theorem in complex analysis.

Meromorphic Asymptotic Theorem: Suppose that $F(z) = G(z)/H(z)$ is the ratio of complex functions $G$ and $H$ that are analytic in $|z| \leq B$ for some $B > 0.$ If

$\bullet$ $H(z)$ has a single zero $z=w$ with $|w|<B$ and no zeroes with $|z|=B,$ and

$\bullet$ both $H^\prime(w) \neq 0$ and $G(w) \neq 0,$

then $$[z^n]F(z) = w^{-n} \frac{-G(w)}{wH^\prime(w)} + O(B^{-n}).$$

Notes: $\bullet$ When $F$ is a rational function then this result matches the C-finite Asymptotic Theorem (where the denominator has a unique root of minimal modulus, with multiplicity one).

$\bullet$ It is possible to define the multiplicity of a zero of an analytic function using derivatives, and generalize the Meromorphic Asymptotic Theorem beyond simple zeroes.

$\bullet$ If $H$ has a finite set of zeroes with $|z|<B$ and no zeroes with $|z|=B$ then asymptotics are obtained by adding up contributions from the zeroes with $|z|<B.$

Example: Find asymptotics for the number $s_n$ of surjections of size $n.$

As recalled above the EGF for surjections is $F(z) = 1/(2-e^z).$ Both of the functions $G(z)=1$ and $H(z)=2-e^z$ are analytic everywhere in the complex plane, and when $|z| \leq 1$ the only root of $H(z)$ is $z=\log 2.$ Since $G(\log 2) = 1$ and $H^\prime(\log 2) = -e^{\log 2} = -2$ are non-zero, the Meromorphic Asymptotic Theorem implies $$[z^n] F(z) = \frac{1}{2\log 2} (\log 2)^{-n} + O(1),$$ so that $$s_n \sim \frac{n!}{2(\log 2)^{n+1}}.$$
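The asymptotic formula can be sanity-checked against exact values (a Python sketch; the recurrence $s_n = \sum_{k=1}^n \binom{n}{k} s_{n-k}$ used here is the standard one obtained by choosing the preimage of the first value of the surjection):

```python
import math

# exact surjection numbers via the recurrence
N = 20
s = [1] + [0] * N
for n in range(1, N + 1):
    s[n] = sum(math.comb(n, k) * s[n - k] for k in range(1, n + 1))

# ratio of exact s_n to the asymptotic prediction n! / (2 (log 2)^{n+1})
ratios = {n: s[n] / (math.factorial(n) / (2 * math.log(2) ** (n + 1)))
          for n in (5, 10, 20)}
print(ratios)  # ratios rapidly approach 1
```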

Example: Fix a positive integer $r$ and find asymptotics for the probability that a permutation of size $n$ has no cycles of length $\leq r.$

As seen in previous chapters, the labelled specification $\mathcal{P} = \text{SET}(\text{CYC}_{>r}(\mathcal{Z}))$ leads to the exponential generating function $$P(z) = \frac{e^{-z-z^2/2-\cdots-z^r/r}}{1-z}$$ for the counting sequence $p_n$ of permutations with no cycles of length $\leq r.$ Applying the Meromorphic Asymptotic Theorem gives that the desired probability is $$\frac{n![z^n] P(z)}{n!} = [z^n] P(z) \sim e^{-1-1/2-\cdots-1/r} .$$
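This limit can be verified with series arithmetic (a Python sketch; variable names ours). For $E(z)=e^{S(z)}$ the identity $E' = S'E$ gives $n\,e_n = \sum_k k\,s_k\,e_{n-k}$; here $s_k = -1/k$ for $k \leq r,$ so each term contributes $-e_{n-k},$ and dividing by $1-z$ turns coefficients into partial sums:

```python
import math

# coefficients e_n of exp(-z - z^2/2 - ... - z^r/r), here with r = 2,
# via the recurrence n*e_n = -(e_{n-1} + ... + e_{n-r})
r, N = 2, 50
e = [1.0] + [0.0] * N
for n in range(1, N + 1):
    e[n] = -sum(e[n - k] for k in range(1, min(n, r) + 1)) / n

# dividing by (1 - z) takes partial sums: [z^N]P = e_0 + ... + e_N
prob = sum(e)
limit = math.exp(-sum(1.0 / k for k in range(1, r + 1)))
print(prob, limit)  # probability at n = 50 vs the limit e^{-(1 + 1/2)}
```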

Example: The labelled class $\mathcal{A}$ of alignments has the labelled specification $\mathcal{A} = \text{SEQ}(\text{CYC}(\mathcal{Z})).$ Find asymptotics for the number of alignments of size $n.$

This specification implies $$A(z) = \frac{1}{1+\log(1-z)},$$ which has singularities when $z=1$ (0 in a logarithm) and $\log(1-z)=-1$ (divide by zero). The equation $\log(1-z)=-1$ has solution $z=1-e^{-1} =0.632\dots$ which is the only dominant singularity of $A(z).$ Applying the Meromorphic Asymptotic Theorem then gives $$a_n = n![z^n]A(z) \sim \frac{n!}{e(1-e^{-1})^{n+1}}.$$
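As a numerical check (Python sketch), the coefficients $a_n = [z^n]A(z)$ follow from $A(z)\,(1+\log(1-z)) = 1$: since $\log(1-z) = -\sum_{k\geq 1} z^k/k,$ extracting coefficients gives $a_n = \sum_{k=1}^n a_{n-k}/k$ with $a_0 = 1,$ and the Meromorphic Asymptotic Theorem predicts $[z^n]A(z) \sim (1-e^{-1})^{-(n+1)}/e$:

```python
import math

# EGF coefficients of A(z) = 1/(1 + log(1-z)) via a_n = sum_{k=1}^n a_{n-k}/k
N = 40
a = [1.0] + [0.0] * N
for n in range(1, N + 1):
    a[n] = sum(a[n - k] / k for k in range(1, n + 1))

rho = 1.0 / (1.0 - math.exp(-1.0))  # reciprocal of the singularity 1 - 1/e
approx = rho ** (N + 1) / math.e    # predicted [z^N]A
print(a[N] / approx)  # ratio near 1
```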

Exercise: Recall that the exponential generating function of alternating permutations is $T(z) = \tan(z).$ Use the Meromorphic Asymptotic Theorem to find asymptotics for the number of alternating permutations.

## The Principles of Analytic Combinatorics #

The two basic principles of analytic combinatorics are

• First Principle of Analytic Combinatorics: The locations of the singularities of a generating function determine the exponential growth of its coefficient sequence.

• Second Principle of Analytic Combinatorics: The type of the singularities of a generating function determines the sub-exponential growth of its coefficient sequence.

The type of a singularity includes dividing by zero (and with what multiplicity), putting zero in a root, zero in a logarithm, etc.

As with the C-finite Coefficient Theorem and the Meromorphic Asymptotic Theorem, one “transfers” the singular behaviour of $F(z)$ near its dominant singularities into the dominant asymptotic behaviour of its coefficient sequence.

Very general results are possible, and the interested reader is steered towards the textbooks Analytic Combinatorics (Flajolet and Sedgewick) and An Invitation to Analytic Combinatorics (Melczer) for more details.