Convergence
Convergence in Probability:
A sequence of random variables \(X_1, X_2, \dots, X_n\) is said to converge in probability
to a known random variable \(X\),
if for every number \(\epsilon > 0\), the following is true:
\[ \lim_{n \rightarrow \infty} P(|X_n - X| \ge \epsilon) = 0 \]
where,
\(X_n\): is the estimator or sample-based random variable.
\(X\): is the known, limiting, or target random variable.
\(\epsilon\): is the tolerance level or margin of error.
For example:
- Toss a fair coin:
Estimator:
\[ X_n = \begin{cases} \frac{n}{n+1} & \text{, if Head} \\ \frac{1}{n} & \text{, if Tail} \end{cases} \]
Known random variable (Bernoulli):
\[ X = \begin{cases} 1 & \text{, if Head} \\ 0 & \text{, if Tail} \end{cases} \]
Say, tolerance level \(\epsilon = 0.1\).
Then, note that \(|X_n - X| = \frac{1}{n+1}\) on a Head and \(|X_n - X| = \frac{1}{n}\) on a Tail.
If \(n=5\): \(|X_5 - X|\) is \(\frac{1}{6} \approx 0.17\) (Head) or \(\frac{1}{5} = 0.2\) (Tail).
So, if \(n=5\), then \(|X_n - X| > (\epsilon = 0.1)\) in both cases.
=> \(P(|X_n - X| \ge (\epsilon=0.1)) = 1\).
If \(n=20\): \(|X_{20} - X|\) is \(\frac{1}{21} \approx 0.048\) (Head) or \(\frac{1}{20} = 0.05\) (Tail).
So, if \(n=20\), then \(|X_n - X| < (\epsilon=0.1)\) in both cases.
=> \(P(|X_n - X| \ge (\epsilon=0.1)) = 0\).
=> \(P(|X_n - X| \ge (\epsilon=0.1)) = 0 ~\forall ~n > 10\), since \(\frac{1}{n} < 0.1\) and \(\frac{1}{n+1} < 0.1\) whenever \(n > 10\).
Therefore,
\[ \lim_{n \rightarrow \infty} P(|X_n - X| \ge 0.1) = 0 \]
Similarly, we can prove that if \(\epsilon = 0.01\), then the probability will be equal to 0 for \(n > 100\).
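To spell that step out (the Tail case is the binding one, since \(\frac{1}{n} > \frac{1}{n+1}\)):
\[ P(|X_n - X| \ge 0.01) = 0 \iff \frac{1}{n} < 0.01 \iff n > 100 \]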
Note: The task is to check whether the sequence of random variables \(X_1, X_2, \dots, X_n\) converges in probability to the known random variable \(X\) as \(n \rightarrow \infty\).
So, we can conclude that, if \(n > \frac{1}{\epsilon}\), then:
\[ P(|X_n - X| \ge \epsilon) = 0 \quad\Rightarrow\quad \lim_{n \rightarrow \infty} P(|X_n - X| \ge \epsilon) = 0 \quad\Rightarrow\quad X_n \xrightarrow{Probability} X \]
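As a quick numerical check (a minimal sketch of my own, not part of the original derivation), the snippet below replays the coin example for a few values of \(n\) and \(\epsilon\):

```python
# Numerical check of the coin example: X_n = n/(n+1) on a Head and 1/n on a Tail,
# while X = 1 on a Head and 0 on a Tail, so |X_n - X| is 1/(n+1) (Head) or 1/n (Tail).
def prob_deviation(n, eps):
    """Return P(|X_n - X| >= eps); Head and Tail each occur with probability 1/2."""
    deviations = [1 / (n + 1), 1 / n]   # Head case, Tail case
    return sum(d >= eps for d in deviations) / 2

for eps in (0.1, 0.01):
    for n in (5, 20, round(1 / eps), round(1 / eps) + 1):
        print(f"eps={eps}, n={n}: P(|X_n - X| >= eps) = {prob_deviation(n, eps)}")
```

Including \(n = \frac{1}{\epsilon}\) itself shows the boundary case where the Tail branch sits exactly at \(\epsilon\), which is why the strict inequality \(n > \frac{1}{\epsilon}\) is needed.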
Almost Sure Convergence:
A sequence of random variables \(X_1, X_2, \dots, X_n\) is said to almost surely converge to a known random variable \(X\),
for \(n \ge 1\), if the following is true:
\[ P\left( \lim_{n \rightarrow \infty} X_n = X \right) = 1 \]
where,
\(X_n\): is the estimator or sample-based random variable.
\(X\): is the known, limiting, or target random variable.
If \(X_n \xrightarrow{Almost ~ Sure} X \), then \(X_n \xrightarrow{Probability} X \).
But the converse is NOT true.
*Note: Almost sure convergence is the hardest to satisfy amongst the modes of convergence, such as convergence in probability,
convergence in distribution, etc.*
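For instance, in the coin example above (reading one toss as fixing the whole sequence \(X_1, X_2, \dots\) together with \(X\), which is my assumption, not spelled out earlier), every outcome \(\omega\) satisfies
\[ |X_n(\omega) - X(\omega)| \le \frac{1}{n} \rightarrow 0 \quad\Rightarrow\quad P\left( \lim_{n \rightarrow \infty} X_n = X \right) = 1, \]
so \(X_n \xrightarrow{Almost ~ Sure} X\), which is consistent with the implication above, since we already showed \(X_n \xrightarrow{Probability} X\).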
Read more about Limits
- \(X\) is a random variable such that \(X = \frac{1}{2}\), a constant, i.e., \(X_1 = X_2 = \dots = X_n = \frac{1}{2}\).
\(Y_1, Y_2, \dots, Y_n\) form another sequence of random variables, such that (a quick numerical sketch follows the display below):
\[ Y_1 = X_1 \\[10pt] Y_2 = \frac{X_1 + X_2}{2} \\[10pt] Y_3 = \frac{X_1 + X_2 + X_3}{3} \\[10pt] \dots \\ Y_n = \frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{Almost ~ Sure} \frac{1}{2} \]
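A tiny simulation of this running-average sequence (my own sketch: with the \(X_i\) fixed at \(\frac{1}{2}\) as stated, every \(Y_n\) is exactly \(\frac{1}{2}\), so as an extra assumption, not in the example above, it also uses fair-coin \(X_i\) with mean \(\frac{1}{2}\) to make the almost-sure convergence visible):

```python
import random

def running_means(xs):
    """Y_n = (1/n) * sum_{i=1}^{n} X_i for n = 1..len(xs)."""
    total, means = 0.0, []
    for n, x in enumerate(xs, start=1):
        total += x
        means.append(total / n)
    return means

random.seed(42)
N = 10_000

# Case stated above: every X_i is the constant 1/2, so every Y_n is exactly 1/2.
constant_xs = [0.5] * N
print("constant X_i, Y_N =", running_means(constant_xs)[-1])

# Extra illustration (an assumption, not in the post): X_i are fair-coin tosses
# (Bernoulli with mean 1/2); by the strong law of large numbers Y_n -> 1/2 almost surely.
coin_xs = [random.randint(0, 1) for _ in range(N)]
print("fair-coin X_i, Y_N =", running_means(coin_xs)[-1])
```

In the fair-coin case the final running mean lands near \(\frac{1}{2}\), and rerunning with different seeds gives paths that all settle there, which is what the almost-sure statement in the display above expresses.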
End of Section