So far we have defined the notion of a probability space and the probability of an event. But most often, we do not calculate probabilities directly from the definition. This is analogous to integration, where one defines the integral of a function as a limit of Riemann sums, but uses that definition only to find the integrals of $x^{n}$, $\sin(x)$ and a few such functions. Instead, integrals of complicated expressions such as $x\sin(x)+2\cos^{2}(x)\tan(x)$ are calculated by various rules, such as the substitution rule, integration by parts, etc. In probability we need similar rules relating the probabilities of various combinations of events to the individual probabilities.
Notation : Let $A\subseteq \Omega$ be an event. Then, we define a function ${\mathbf 1}_{A}:\Omega\rightarrow \mathbb{R}$, called the indicator function of $A$, as follows. $$ {\mathbf 1}_{A}(\omega) = \begin{cases} 1 & \mbox{ if }\omega\in A,\\ 0 & \mbox{ if }\omega\not\in A. \end{cases} $$ Since a function from $\Omega$ to $\mathbb{R}$ is called a random variable, the indicator of any event is a random variable. All information about the event $A$ is in its indicator function (meaning, if we know the value of ${\mathbf 1}_{A}(\omega)$, we know whether or not $\omega$ belongs to $A$). For example, we can write $\mathbf{P}(A)=\sum_{\omega\in \Omega}{\mathbf 1}_{A}(\omega)p_{\omega}$.
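The identity $\mathbf{P}(A)=\sum_{\omega\in \Omega}{\mathbf 1}_{A}(\omega)p_{\omega}$ can be checked directly on a small finite probability space. The sketch below (the die-rolling space and the event "even outcome" are illustrative choices, not from the text) compares the indicator-function sum with the direct sum of $p_{\omega}$ over $\omega \in A$:

```python
# Finite probability space: a fair six-sided die.
Omega = {1, 2, 3, 4, 5, 6}
p = {w: 1 / 6 for w in Omega}          # elementary probabilities p_omega

def indicator(A):
    """Return the indicator function 1_A : Omega -> {0, 1}."""
    return lambda w: 1 if w in A else 0

A = {2, 4, 6}                           # the event "outcome is even"
one_A = indicator(A)

# P(A) computed two ways: via the indicator, and directly.
prob_via_indicator = sum(one_A(w) * p[w] for w in Omega)
prob_direct = sum(p[w] for w in A)

assert abs(prob_via_indicator - prob_direct) < 1e-12   # both equal 1/2
```

Knowing $\mathbf 1_{A}$ is the same as knowing $A$ itself: the event can be recovered as $A = \{\omega : \mathbf 1_{A}(\omega) = 1\}$.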
Now we prove the proposition.
Two important special cases are increasing and decreasing sequences of events, meaning $A_{1}\subseteq A_{2}\subseteq A_{3}\subseteq \ldots$ and $A_{1}\supseteq A_{2}\supseteq A_{3}\supseteq \ldots$, respectively. In these cases, the limsup and liminf coincide (so we refer to it as the limit of the sequence of sets). The limit is $\cup_{k}A_{k}$ for an increasing sequence and $\cap_{k}A_{k}$ for a decreasing sequence.
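These identities can be verified computationally for sequences that eventually stabilize, since then a finite truncation faithfully represents the infinite sequence. The sketch below (the specific sequences $A_k$ and $B_k$ are illustrative choices, not from the text) computes $\limsup_k A_k = \cap_{n}\cup_{k\geq n}A_{k}$ and $\liminf_k A_k = \cup_{n}\cap_{k\geq n}A_{k}$ for a finite list of sets and checks that, for increasing (resp. decreasing) sequences, both equal the union (resp. intersection):

```python
def limsup(events):
    """limsup of a finite list of sets: intersection over n of the tail unions."""
    tails = [set().union(*events[n:]) for n in range(len(events))]
    return set.intersection(*tails)

def liminf(events):
    """liminf of a finite list of sets: union over n of the tail intersections."""
    tails = [set(events[n]).intersection(*events[n + 1:]) for n in range(len(events))]
    return set.union(*tails)

# Increasing sequence stabilizing at {1,...,5}: A_k = {1, ..., min(k, 5)}.
A = [set(range(1, min(k, 5) + 1)) for k in range(1, 11)]
assert limsup(A) == liminf(A) == set.union(*A)          # limit = union

# Decreasing sequence stabilizing at {6,...,10}: B_k = {min(k, 5)+1, ..., 10}.
B = [set(range(min(k, 5) + 1, 11)) for k in range(1, 11)]
assert limsup(B) == liminf(B) == set.intersection(*B)   # limit = intersection
```

For an increasing sequence every tail union is the last set and every tail intersection is the current set, which is why both limits collapse to $\cup_{k}A_{k}$; the decreasing case is symmetric.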
The basic rules of probability :