Expected number of cycles of permutation equals harmonic number of degree
Statement
Suppose $n$ is a natural number. Consider the uniform distribution on the symmetric group $S_n$ of degree $n$. Then, the expected number of cycles in the cycle decomposition of a permutation chosen according to the uniform distribution is equal to $H_n$, the harmonic number of $n$, where:

$$H_n = 1 + \frac{1}{2} + \frac{1}{3} + \dots + \frac{1}{n} = \sum_{k=1}^n \frac{1}{k}$$
Note that $\lim_{n \to \infty} (H_n - \ln n) = \gamma$, where $\gamma$ is the Euler-Mascheroni constant, and its value is approximately $0.5772$ (or very close to $0.577$). Also, $\lim_{n \to \infty} H_n/\ln n = 1$. Thus, for $n$ large enough, $H_n$ can be approximated additively as $\ln n + \gamma$ and multiplicatively as $\ln n$.
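As a quick numerical illustration of the additive approximation (the helper `harmonic` below is ours, not part of the article):

```python
import math

def harmonic(n):
    """Compute the harmonic number H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

# Euler-Mascheroni constant (truncated)
gamma = 0.5772156649015329

n = 1000
print(harmonic(n))           # ~7.4855
print(math.log(n) + gamma)   # ~7.4850, within about 1/(2n) of H_n
```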
See also probability distribution of number of cycles of permutations.
Particular cases
$n$ | $H_n$ (equals expected number of cycles) | $\ln n$
---|---|---
1 | 1 | 0
2 | 1.5 = 3/2 | 0.6931...
3 | 1.8333... = 11/6 | 1.0986...
4 | 2.0833... = 25/12 | 1.3862...
5 | 2.2833... = 137/60 | 1.6094...
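The values in the table can be checked by brute force over all permutations of a small set; the following sketch (function names are ours) counts cycles directly:

```python
from fractions import Fraction
from itertools import permutations

def num_cycles(perm):
    """Count the cycles of a permutation given in one-line notation
    as a 0-indexed tuple (perm[i] is the image of i)."""
    seen = [False] * len(perm)
    count = 0
    for i in range(len(perm)):
        if not seen[i]:
            count += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return count

def expected_cycles(n):
    """Average number of cycles over all n! permutations."""
    perms = list(permutations(range(n)))
    return Fraction(sum(num_cycles(p) for p in perms), len(perms))

def harmonic(n):
    return sum(Fraction(1, k) for k in range(1, n + 1))

for n in range(1, 6):
    assert expected_cycles(n) == harmonic(n)
print(expected_cycles(5))  # 137/60
```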
Relation with Stirling numbers of the first kind
Denote by $c(n,k)$ the unsigned Stirling number of the first kind, defined as the number of permutations of $\{1,2,\dots,n\}$ with exactly $k$ cycles. The expected number of cycles of a permutation can then be computed as:

$$\frac{1}{n!} \sum_{k=1}^n k \, c(n,k)$$

The result of this page therefore says that:

$$\frac{1}{n!} \sum_{k=1}^n k \, c(n,k) = H_n$$
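This identity can be verified for small degrees with a sketch that builds the unsigned Stirling numbers from the standard recurrence $c(n,k) = c(n-1,k-1) + (n-1)\,c(n-1,k)$ (helper names are ours):

```python
import math
from fractions import Fraction

def stirling_first_unsigned(n):
    """Row of unsigned Stirling numbers of the first kind c(n, k), k = 0..n,
    via the recurrence c(n, k) = c(n-1, k-1) + (n-1) * c(n-1, k)."""
    row = [1]  # c(0, 0) = 1
    for m in range(1, n + 1):
        new = [0] * (m + 1)
        for k in range(1, m + 1):
            new[k] = row[k - 1] + (m - 1) * (row[k] if k < m else 0)
        row = new
    return row

n = 6
c = stirling_first_unsigned(n)
expected = Fraction(sum(k * c[k] for k in range(1, n + 1)), math.factorial(n))
harmonic = sum(Fraction(1, k) for k in range(1, n + 1))
print(expected, harmonic)  # both equal 49/20
```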
Proof
For the proof, we will work with the symmetric group on the set $\{1,2,\dots,n\}$ for concreteness. However, the result applies to the expected number of cycles for the symmetric group on any set of size $n$.
Review of the Foata transformation
The standard proof of the statement uses the Foata transformation. The Foata transformation is a bijection from $S_n$ to itself that relies on the "pun" between cycle decompositions and one-line notation. The steps are as follows:
- Write the permutation in canonical cycle decomposition notation: By the cycle decomposition theorem, the permutation has a cycle decomposition. Write the cycle decomposition in canonical notation, making sure to include fixed points as cycles of size one. Explicitly, this means:
- Cyclically rearrange each cycle so that it begins with its largest element.
- Arrange the cycles in increasing order of their largest elements.
- Reinterpret in one-line notation: Now, drop the commas and parentheses and you get a string of length $n$ that is the one-line notation for a permutation of $\{1,2,\dots,n\}$ (more explicitly, it is the second line of the two-line notation of the permutation, where the first line is $1, 2, \dots, n$). This is the new permutation that is the image under the Foata transformation.
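The forward direction of the transformation can be sketched as follows (the function name `foata` and the 1-indexed tuple convention are ours):

```python
def foata(perm):
    """Foata transformation: map a permutation of {1,...,n} (one-line tuple,
    perm[i-1] is the image of i) to another one-line permutation,
    by writing its canonical cycle notation and dropping the parentheses."""
    n = len(perm)
    seen = set()
    cycles = []
    for start in range(1, n + 1):
        if start not in seen:
            cycle = [start]
            seen.add(start)
            j = perm[start - 1]
            while j != start:
                cycle.append(j)
                seen.add(j)
                j = perm[j - 1]
            # rotate so the cycle begins with its largest element
            m = cycle.index(max(cycle))
            cycles.append(cycle[m:] + cycle[:m])
    # arrange cycles in increasing order of their largest (= first) elements
    cycles.sort(key=lambda c: c[0])
    # drop the parentheses: concatenate into one-line notation
    return tuple(x for c in cycles for x in c)

# Example: the permutation 1->3, 2->2, 3->1 has cycles (3 1)(2);
# canonical form lists (2) then (3 1), giving one-line notation (2, 3, 1).
print(foata((3, 2, 1)))  # (2, 3, 1)
```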
The inverse of this bijection is as follows:
- Write the permutation in one-line notation (i.e., write it as a string whose $k^{\text{th}}$ element is the image of $k$ under the permutation).
- Traverse the string from left to right and identify all the left-to-right maxima, i.e., all the elements that are larger than every element before them. The substrings starting at a given left-to-right maximum and ending just before the next one form the strings for the cycles in the canonical notation for the cycle decomposition of the original permutation.
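The inverse direction, splitting at left-to-right maxima, might look like this (again a sketch with our own naming conventions):

```python
def foata_inverse(word):
    """Inverse Foata transformation: split a one-line word at its
    left-to-right maxima; each segment is a cycle of the output permutation."""
    n = len(word)
    cycles = []
    for x in word:
        if not cycles or x > cycles[-1][0]:
            cycles.append([x])  # x is a left-to-right maximum: start a new cycle
        else:
            cycles[-1].append(x)
    # turn the recovered cycles back into one-line notation
    image = [0] * n
    for c in cycles:
        for a, b in zip(c, c[1:] + c[:1]):
            image[a - 1] = b
    return tuple(image)

# Example: (2, 3, 1) has left-to-right maxima 2 and 3,
# giving cycles (2)(3 1), i.e., the permutation 1->3, 2->2, 3->1.
print(foata_inverse((2, 3, 1)))  # (3, 2, 1)
```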
Counting the number of cycles using the Foata transformation
From the above, we obtain that:
Number of cycles of a permutation = Number of left-to-right maxima in the image of that permutation under the Foata transformation
Applying this to the uniform probability distribution over $S_n$, we obtain:
Expected number of cycles of a permutation = Expected number of left-to-right maxima of a permutation in one-line notation
By the additivity of expectation, the right side can be simplified as:
$\sum_{k=1}^n$ (Expected number of left-to-right maxima at the $k^{\text{th}}$ position)
The summands are now simply probabilities. So the summation becomes:
$\sum_{k=1}^n$ (Probability that the entry in the $k^{\text{th}}$ position of a permutation is a left-to-right maximum)
This probability is $1/k$: intuitively, what's happening is that, once we decide which elements occupy the first $k$ positions, exactly $1/k$ of the arrangements of those elements will have the largest among them in the last ($k^{\text{th}}$) position. Thus, we get that the expected number of cycles in a permutation is:

$$\sum_{k=1}^n \frac{1}{k} = H_n$$
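A brute-force check of the $1/k$ probability and of the resulting sum, here for $n = 5$ (helper names are ours):

```python
from fractions import Fraction
from itertools import permutations

def is_lr_max(word, k):
    """Is the entry at (1-indexed) position k a left-to-right maximum?"""
    return all(word[j] < word[k - 1] for j in range(k - 1))

n = 5
perms = list(permutations(range(1, n + 1)))
total = Fraction(0)
for k in range(1, n + 1):
    p = Fraction(sum(is_lr_max(w, k) for w in perms), len(perms))
    assert p == Fraction(1, k)  # probability of a left-to-right maximum at position k
    total += p
print(total)  # 137/60, the harmonic number H_5
```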
Comment on independence of probabilities
The expected number of left-to-right maxima is the sum, over all positions, of the probability that the entry in that position is a left-to-right maximum. To make this observation, we do not need the events to be independent of one another: linearity of expectation holds regardless. However, it turns out that the events are indeed independent of one another.
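For instance, pairwise independence of these events can be confirmed by exhaustive enumeration for small $n$ (a sketch with our own helper names):

```python
from fractions import Fraction
from itertools import permutations, combinations

n = 4
perms = list(permutations(range(1, n + 1)))

def lr_max_at(w, k):
    """Is the entry at (1-indexed) position k a left-to-right maximum?"""
    return all(w[j] < w[k - 1] for j in range(k - 1))

# Check that the events "position j is a left-to-right maximum" and
# "position k is a left-to-right maximum" are independent:
for j, k in combinations(range(1, n + 1), 2):
    p_j = Fraction(sum(lr_max_at(w, j) for w in perms), len(perms))
    p_k = Fraction(sum(lr_max_at(w, k) for w in perms), len(perms))
    p_both = Fraction(sum(lr_max_at(w, j) and lr_max_at(w, k) for w in perms),
                      len(perms))
    assert p_both == p_j * p_k
print("pairwise independence verified for n =", n)
```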