Kolmogorov's Three-Series Theorem

Last updated: 2026-04-12

Kolmogorov's Three-Series Theorem is a fundamental result in probability theory that provides conditions under which an infinite series of independent random variables converges almost surely.
Suppose that $X_1, X_2, \ldots$ is a sequence of independent random variables, $S_n = X_1 + \ldots + X_n$, and let $A$ be the set of sample points $\omega$ for which $\sum_{i > 0} X_i(\omega)$ converges to a finite limit. It follows from Kolmogorov's zero-one law that $\mathrm{P}(A) = 0$ or $1$, i.e. the series $\sum_{i > 0} X_i(\omega)$ either converges almost surely (a.s.) or diverges a.s. The aim of this article is to give criteria that determine whether a sum of independent random variables converges or diverges.
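For example, suppose the $X_n$ are independent symmetric signs, $\mathrm{P}(X_n = \pm 1) = \tfrac{1}{2}$. At the two extremes the dichotomy is easy to see:
\begin{equation*}\sum_n X_n \text{ diverges a.s. (its terms do not tend to } 0\text{)}, \qquad \sum_n \frac{X_n}{n^2} \text{ converges a.s. (absolutely, since } \sum_n n^{-2}<\infty\text{)} .\end{equation*}
The theorems below settle the intermediate cases, such as $\sum_n X_n / n$.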

Theorem on the convergence of sums of independent centered random variables

The following theorem is Theorem 1, page 6, in [Shiryaev].

This result is due to Kolmogorov and Khinchin.
Theorem (Kolmogorov and Khinchin)
Suppose that $X_1, X_2, \ldots$ is a sequence of independent random variables with $\mathrm{E} X_n=0$, $n \geq 1$. If
nEXn2<,\begin{equation}\sum_n\mathrm{E} X_n^2<\infty,\end{equation}
then the series $\sum_n X_n$ converges a.s.
Moreover, if the random variables $\{ X_n, n \geq 1 \}$ are uniformly bounded (i.e. $\mathrm{P}\left(\left|X_n\right| \leq c\right)=1$ for some $c < \infty$),
the converse is true: the convergence of $\sum_n X_n$ a.s. implies (1).
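For example, let $X_n = \varepsilon_n / n$, where the $\varepsilon_n$ are independent and $\mathrm{P}(\varepsilon_n = \pm 1) = \tfrac{1}{2}$. Then $\mathrm{E} X_n = 0$ and
\begin{equation*}\sum_n \mathrm{E} X_n^2 = \sum_n \frac{1}{n^2} < \infty ,\end{equation*}
so the harmonic series with random signs, $\sum_n \varepsilon_n / n$, converges a.s., although $\sum_n 1/n = \infty$.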
Proof. Sufficiency. The sequence $\{ S_n , n \geq 1 \}$ converges a.s. if and only if it is fundamental a.s. The sequence $\{ S_n , n \geq 1 \}$ is fundamental a.s. if and only if
P{supk1Sn+kSnε}0,n.\begin{equation}\mathrm{P}\left\{\sup _{k \geq 1}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\} \rightarrow 0, \quad n \rightarrow \infty .\end{equation}
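Recall Kolmogorov's inequality: if $Y_1, \ldots, Y_N$ are independent random variables with $\mathrm{E} Y_k = 0$ and $\mathrm{E} Y_k^2 < \infty$, and $T_k = Y_1 + \cdots + Y_k$, then for every $\varepsilon > 0$
\begin{equation*}\mathrm{P}\left\{\max _{1 \leq k \leq N}\left|T_k\right| \geq \varepsilon\right\} \leq \frac{1}{\varepsilon^2} \sum_{k=1}^N \mathrm{E} Y_k^2 .\end{equation*}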
By Kolmogorov's inequality, applied to $X_{n+1}, \ldots, X_{n+N}$, we get
\begin{align*}\mathrm{P}\left\{\sup _{k \geq 1}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\} & =\lim _{N \rightarrow \infty} \mathrm{P}\left\{\max _{1 \leq k \leq N}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\} \\& \leq \lim _{N \rightarrow \infty} \frac{\sum_{k=n+1}^{n+N} \mathrm{E} X_k^2}{\varepsilon^2}=\frac{\sum_{k=n+1}^{\infty} \mathrm{E} X_k^2}{\varepsilon^2} .\end{align*}
Therefore (2) is satisfied if $\sum_{k=1}^{\infty} \mathrm{E} X_k^2<\infty$; consequently $(S_n)$ is a Cauchy sequence a.s., and hence $\lim_{n \rightarrow \infty} S_n(\omega)$ exists a.s.
Necessity. Now let $\sum_k X_k$ converge a.s. Then, by (2), for sufficiently large $n$,
P{supk1Sn+kSnε}<12.\begin{equation}\mathrm{P}\left\{\sup _{k \geq 1}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\}<\frac{1}{2} .\end{equation}
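Recall also the second part of Kolmogorov's inequality, a lower bound available when the summands are uniformly bounded: if in addition $\mathrm{P}(|Y_k| \leq c) = 1$ for all $k$, then for every $\varepsilon > 0$
\begin{equation*}\mathrm{P}\left\{\max _{1 \leq k \leq N}\left|T_k\right| \geq \varepsilon\right\} \geq 1-\frac{(c+\varepsilon)^2}{\sum_{k=1}^N \mathrm{E} Y_k^2} .\end{equation*}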
By the second part of Kolmogorov's inequality, applied to $X_{n+1}, \ldots, X_{n+N}$ and letting $N \rightarrow \infty$,
\begin{equation*}\mathrm{P}\left\{\sup _{k \geq 1}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\} \geq 1-\frac{(c+\varepsilon)^2}{\sum_{k=n+1}^{\infty} \mathrm{E} X_k^2} .\end{equation*}
Therefore if we suppose that $\sum_{k=1}^{\infty} \mathrm{E} X_k^2=\infty$, we obtain
P{supk1Sn+kSnε}=1,\begin{equation*}\mathrm{P}\left\{\sup _{k \geq 1}\left|S_{n+k}-S_n\right| \geq \varepsilon\right\}=1,\end{equation*}
which contradicts (3).
This completes the proof of the theorem. \Box
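The uniform boundedness assumption in the converse cannot be dropped. For example, let the $X_n$ be independent with
\begin{equation*}\mathrm{P}\left(X_n = n\right) = \mathrm{P}\left(X_n = -n\right) = \frac{1}{2 n^3}, \qquad \mathrm{P}\left(X_n = 0\right) = 1 - \frac{1}{n^3} .\end{equation*}
Then $\mathrm{E} X_n = 0$ and $\sum_n \mathrm{E} X_n^2 = \sum_n 1/n = \infty$, yet $\sum_n \mathrm{P}\left(X_n \neq 0\right) = \sum_n 1/n^3 < \infty$, so by the Borel-Cantelli lemma only finitely many of the $X_n$ are nonzero a.s., and $\sum_n X_n$ converges a.s.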

Kolmogorov's Three-Series Theorem

The following theorem is Theorem 3, page 9, in [Shiryaev].

Let $c$ be a constant and
Xc={X,Xc,0,X>c.\begin{equation*}X^c= \begin{cases}X, & |X| \leq c, \\ 0, & |X|>c .\end{cases}\end{equation*}
Theorem (Kolmogorov's Three-Series Theorem)
Let $X_1, X_2, \ldots$ be a sequence of independent random variables. A necessary and sufficient condition for the convergence of $\sum X_n$ a.s. is that the series
EXnc,VXnc,P(Xnc)\begin{equation}\sum \mathrm{E} X_n^c, \quad \sum \mathrm{V} X_n^c, \quad \sum \mathrm{P}\left(\left|X_n\right| \geq c\right)\end{equation}
converge for some $c>0$.
Let's prove this theorem.

Proof of Sufficiency

Proof. Let $\mu_n = \mathrm{E} X_n^c$. The convergence of $\sum \mathrm{V} X_n^c$ and the theorem of Kolmogorov and Khinchin imply that $\sum_{n=1}^{\infty} \left( X_n^c - \mu_n \right)$ converges a.s. The convergence of $\sum \mu_n$ then gives that $\sum X_n^c$ converges a.s.
But if $\sum \mathrm{P}\left(\left|X_n\right| \geq c\right)<\infty$, then by the Borel-Cantelli lemma, with probability 1 only finitely many of the events $\left\{\left|X_n\right| \geq c\right\}$ occur; hence for almost every $\omega$ there is an $M = M(\omega)$ such that $\left|X_m(\omega)\right| < c$ for all $m > M$. Therefore $X_n = X_n^c$ for all sufficiently large $n$, a.s., and so $\sum X_n$ also converges a.s.

Proof of Necessity

If $\sum X_n$ converges a.s., then $X_n \rightarrow 0$ a.s., and therefore, for every $c>0$, at most a finite number of the events $\left\{\left|X_n\right| \geq c\right\}$ can occur, a.s. Hence $\sum I\left(\left|X_n\right| \geq c\right)<\infty$ a.s., and, by the second part of the Borel-Cantelli lemma (the events $\left\{\left|X_n\right| \geq c\right\}$ are independent), $\sum \mathrm{P}\left(\left|X_n\right| \geq c\right)<\infty$. Moreover, since a.s. $X_n = X_n^c$ for all sufficiently large $n$, the convergence of $\sum X_n$ implies the convergence of $\sum X_n^c$. To prove the convergence of both series $\sum \mathrm{E} X_n^c$ and $\sum \mathrm{V} X_n^c$ we use the symmetrization method. In addition to the sequence
$X_1^c, X_2^c, \ldots$, we consider another sequence $\hat{X}_1^c, \hat{X}_2^c, \ldots$ of independent random variables, independent of the first sequence, such that $\hat{X}_n^c$ has the same distribution as $X_n^c$, $n \geq 1$.
Then if $\sum_n X_n^c$ converges a.s., the series $\sum_n \hat{X}_n^c$ also converges a.s. (the sequence $(\hat{X}_n^c)$ has the same joint distribution as $(X_n^c)$), and hence so does $\sum_n \left(X_n^c-\hat{X}_n^c\right)$. But $\mathrm{E}\left(X_n^c-\hat{X}_n^c\right)=0$ and $\mathrm{P} \left( \left| X_n^c-\hat{X}_n^c \right| \leq 2 c\right)=1$. Therefore $\sum_n\mathrm{V}\left(X_n^c-\hat{X}_n^c\right)<\infty$ by the second part of the theorem of Kolmogorov and Khinchin. In addition, since $X_n^c$ and $\hat{X}_n^c$ are independent and identically distributed,
\begin{equation}\sum_n\mathrm{V} X_n^c=\frac{1}{2} \sum_n\mathrm{V}\left(X_n^c-\hat{X}_n^c\right)<\infty .\end{equation}
Consequently, by the theorem of Kolmogorov and Khinchin, $\sum\left(X_n^c-\mathrm{E} X_n^c\right)$ converges a.s., and since $\sum_n X_n^c$ converges a.s., it follows that $\sum_n\mathrm{E} X_n^c$ converges.
Thus if $\sum_n X_n^c$ converges a.s. (and $\mathrm{P}\left(\left|X_n^c\right| \leq c\right)=1$, $n \geq 1$), it follows that both $\sum_n\mathrm{E} X_n^c$ and $\sum_n\mathrm{V} X_n^c$ converge. \Box
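As an illustration of the theorem, take $X_n = \varepsilon_n / n^{\alpha}$ with $\alpha > 0$ and independent signs, $\mathrm{P}(\varepsilon_n = \pm 1) = \tfrac{1}{2}$. With $c = 1$ we have $X_n^c = X_n$ for every $n$, and the three series in (4) become
\begin{equation*}\sum_n \mathrm{E} X_n^c = 0, \qquad \sum_n \mathrm{V} X_n^c = \sum_n \frac{1}{n^{2 \alpha}}, \qquad \sum_n \mathrm{P}\left(\left|X_n\right| \geq 1\right) = 1 .\end{equation*}
The first and third series always converge, and the second converges if and only if $\alpha > 1/2$. Hence $\sum_n \varepsilon_n / n^{\alpha}$ converges a.s. for $\alpha > 1/2$ and diverges a.s. for $0 < \alpha \leq 1/2$.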

References
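[Shiryaev] A. N. Shiryaev, Probability, Graduate Texts in Mathematics, vol. 95, Springer, New York.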