
11.6: The Probabilistic Method


At the outset of this chapter, we presented Erdős' original counting proof of the lower bound for the Ramsey number \(R(n,n)\). Later, we recast the proof in a probabilistic setting. History has shown that this second perspective is the right one. To illustrate the power of this approach, we present a classic theorem, also due to Erdős, showing that there are graphs with large girth and large chromatic number.

The girth of a graph \(G\) is the smallest integer \(g\) for which \(G\) contains a cycle on \(g\) vertices. The girth of a forest is taken to be infinite, while the girth of a graph is three if and only if it contains a triangle. You can check that the families of triangle-free graphs with large chromatic number constructed in Chapter 5 each have girth four.

Theorem 11.7. Erdős

For every pair \(g,t\) of integers with \(g \geq 3\), there exists a graph \(G\) with \(\chi(G) > t\) and girth greater than \(g\).

    Proof

Before proceeding with the details of the argument, let's pause to get the general idea behind the proof. We choose integers \(n\) and \(s\) with \(n>s\), and it will eventually be clear how large they need to be in terms of \(g\) and \(t\). We will then consider a random graph on vertex set \(\{1,2,\dots,n\}\), and just as before, for each \(i\) and \(j\) with \(1 \leq i < j \leq n\), the probability that the pair \(ij\) is an edge is \(p\), but now \(p\) will depend on \(n\). Of course, whether any given pair is an edge is completely independent of all other pairs.

Our first goal is to choose the values of \(n\), \(s\) and \(p\) so that with high probability, a random graph does not have an independent set of size \(s\). You might think that as a second goal, we would try to get a random graph without small cycles. But this goal is too restrictive. Instead, we just try to get a graph in which there are relatively few small cycles. In fact, we want the number of small cycles to be less than \(n/2\). Then we will remove one vertex from each small cycle, resulting in a graph with at least \(n/2\) vertices, having no small cycles and no independent set of size \(s\). The chromatic number of this graph is at least \(n/(2s)\), so we will want to have the inequality \(n>2st\).
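To spell out the last step: in any proper coloring of a graph with no independent set of size \(s\), each color class has at most \(s-1\) vertices, so a graph \(H\) obtained in this way satisfies

\(\chi(H) \geq \dfrac{|V(H)|}{s-1} \geq \dfrac{n/2}{s-1} > \dfrac{n}{2s}\),

and the inequality \(n>2st\) then gives \(\chi(H)>t\).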

    Now for some details. Let \(X_1\) be the random variable that counts the number of \(s\)-element independent sets. Then

\(E(X_1) = \dbinom{n}{s} (1-p)^{\binom{s}{2}}\)

Now we want \(E(X_1)<1/4\). Since \(\dbinom{n}{s} \leq n^s = e^{s \ln n}\) and \((1-p)^{\binom{s}{2}} \leq e^{-ps^2/2}\), it suffices to set \(s = 2\ln n/p\). By Markov's Inequality, the probability that \(X_1\) is at least \(1/2\) is less than \(1/2\).
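Written out, the Markov step is

\(P(X_1 \geq 1/2) \leq \dfrac{E(X_1)}{1/2} = 2E(X_1) < \dfrac{1}{2}\),

and since \(X_1\) takes only integer values, \(X_1 \geq 1/2\) is the same event as \(X_1 \geq 1\). So with probability greater than \(1/2\), the random graph has no independent set of size \(s\).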

    Now let \(X_2\) count the number of cycles in \(G\) of size at most \(g\). Then

\(E(X_2) \leq \displaystyle \sum_{i=3}^g n(n-1)(n-2)\cdots(n-i+1)\,p^i \leq g(pn)^g\).
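The second inequality holds because \(np \geq 1\) for the choice of \(p\) made below (and \(n\) large), so each term satisfies

\(n(n-1)(n-2)\cdots(n-i+1)\,p^i \leq (np)^i \leq (np)^g\),

and there are fewer than \(g\) terms in the sum.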

Now, we want \(E(X_2) \leq n/4\), and an easy calculation shows that \(g(np)^g \leq n/4\) when \(p = n^{\frac{1}{g}-1}/10\). Again by Markov's Inequality, since \(n/2 \geq 2E(X_2)\), the probability that \(X_2\) exceeds \(n/2\) is at most \(1/2\).
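To check the easy calculation: with \(p = n^{\frac{1}{g}-1}/10\) we have \(np = n^{1/g}/10\), so

\(g(np)^g = \dfrac{gn}{10^g} \leq \dfrac{n}{4}\),

since \(10^g \geq 4g\) for every \(g \geq 3\).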

We conclude that there is a graph \(G\) for which \(X_1=0\) and \(X_2 \leq n/2\). Remove a vertex from each of the small cycles in \(G\) and let \(H\) be the graph that remains. Clearly, \(H\) has at least \(n/2\) vertices, no cycle of size at most \(g\), and no independent set of size \(s\). Finally, the inequality \(n>2st\) holds provided \(n^{1/g}/(40 \ln n) > t\), which is true once \(n\) is sufficiently large.
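Unwinding the substitutions makes the last claim transparent: \(s = 2\ln n/p = 20\, n^{1-\frac{1}{g}} \ln n\), so

\(n > 2st \iff n > 40\, t\, n^{1-\frac{1}{g}} \ln n \iff \dfrac{n^{1/g}}{40 \ln n} > t\).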

    11.6.1 Gaining Intuition with the Probabilistic Method

Experienced researchers are able to simplify the calculations in an argument of this type, as they know what can safely be discarded and what cannot. Here's a quick tour of the essential steps. We want \(E(X_1)\) to be small, so we set \(n^s e^{-ps^2}=1\) and get \(s = \ln n/p\). We want the number of small cycles to be about \(n\), so we set \((np)^g=n\) and get \(p = n^{\frac{1}{g}-1}\). Finally, we want \(n=st\), which (ignoring the logarithmic factor) requires \(n^{1/g}=t\). The rest is just paying attention to details.
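As a rough numerical sanity check (not part of the original argument), the short Python sketch below takes \(g=3\) and a hypothetical value \(n=10^{12}\), computes \(p = n^{\frac{1}{g}-1}/10\) and \(s = 2\ln n/p\) as in the proof, and evaluates \(\log E(X_1)\), the bound \(g(np)^g\) on \(E(X_2)\), and the resulting bound \(n^{1/g}/(40\ln n)\) on \(t\). The function name check and the specific parameters are illustrative choices only.

import math

def check(n, g):
    # Edge probability and independent-set size, as chosen in the proof.
    p = n ** (1.0 / g - 1.0) / 10.0
    s = math.ceil(2.0 * math.log(n) / p)
    # log E(X_1) = log C(n, s) + C(s, 2) * log(1 - p), via log-gamma to avoid overflow.
    log_binom = math.lgamma(n + 1) - math.lgamma(s + 1) - math.lgamma(n - s + 1)
    log_ex1 = log_binom + (s * (s - 1) / 2.0) * math.log1p(-p)
    # Upper bound g * (n p)^g on E(X_2), also in logs.
    log_ex2_bound = math.log(g) + g * math.log(n * p)
    # The final requirement n > 2 s t is equivalent to t < n^(1/g) / (40 ln n).
    t_bound = n ** (1.0 / g) / (40.0 * math.log(n))
    return log_ex1, log_ex2_bound, math.log(n / 4.0), t_bound

log_ex1, log_ex2_bound, log_n_over_4, t_bound = check(n=10 ** 12, g=3)
print("log E(X1):", log_ex1)
print("log of cycle bound:", log_ex2_bound, "vs log(n/4):", log_n_over_4)
print("chromatic number exceeds any t below:", t_bound)

For these parameters, \(\log E(X_1)\) comes out astronomically negative, the cycle bound sits far below \(n/4\), and the final bound forces the chromatic number above any \(t\) up to about \(9\); of course, the theorem concerns what happens as \(n \to \infty\), so the numbers merely illustrate the bookkeeping.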

