We remarked in the context of inclusion-exclusion formulas that often the probabilities of intersections of events are easy to find, and then we can use them to find probabilities of unions, etc. In many contexts, this is related to one of the most important notions in probability.

 

Definition 61
Let $A,B$ be events in a common probability space. We say that $A$ and $B$ are independent if $\mathbf{P}(A\cap B)=\mathbf{P}(A)\mathbf{P}(B)$.

 

Example 62
Toss a fair coin $n$ times. Then $\Omega=\{\underline{\omega}{\; : \;} \underline{\omega}=(\omega_{1},\ldots ,\omega_{n}), \omega_{i}\mbox{ is }0\mbox{ or }1\}$ and $p_{\underline{\omega}}=2^{-n}$ for each $\underline{\omega}$. Let $A=\{\underline{\omega}{\; : \;} \omega_{1}=0\}$ and let $B=\{\underline{\omega}{\; : \;} \omega_{2}=0\}$. Then, from the definition of probabilities, we see that $\mathbf{P}(A)=1/2$ and $\mathbf{P}(B)=1/2$ (because the elementary probabilities are equal, and each of the sets $A$ and $B$ contains exactly $2^{n-1}$ elements). Further, $A\cap B=\{\underline{\omega}{\; : \;} \omega_{1}=0, \omega_{2}=0\}$ has $2^{n-2}$ elements, whence $\mathbf{P}(A\cap B)=1/4$. Thus, $\mathbf{P}(A\cap B)=\mathbf{P}(A)\mathbf{P}(B)$ and hence $A$ and $B$ are independent.
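The counting argument of Example 62 can be checked by brute-force enumeration for a small $n$. The following sketch (the names `A`, `B`, `prob` are illustrative, not from the text) lists all $2^{n}$ equally likely outcomes and verifies that $\mathbf{P}(A\cap B)=\mathbf{P}(A)\mathbf{P}(B)$:

```python
from itertools import product

n = 4
# All 2**n outcomes, each a tuple of 0s and 1s with probability 2**-n.
outcomes = list(product([0, 1], repeat=n))

def prob(event):
    """Probability of an event under uniform elementary probabilities."""
    return sum(1 for w in outcomes if event(w)) / 2**n

A = lambda w: w[0] == 0            # first toss is 0
B = lambda w: w[1] == 0            # second toss is 0
AB = lambda w: A(w) and B(w)       # both

print(prob(A), prob(B), prob(AB))  # 0.5 0.5 0.25
print(prob(AB) == prob(A) * prob(B))  # True
```

Since the sets $A$, $B$, and $A\cap B$ have $2^{n-1}$, $2^{n-1}$, and $2^{n-2}$ elements respectively, the printed probabilities are exactly $1/2$, $1/2$, and $1/4$ for any $n\ge 2$.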
If two events are independent, then the probability of their intersection can be found from the individual probabilities. How do we check whether two events are independent? By checking whether the probability of their intersection equals the product of the individual probabilities! It seems totally circular and useless! There are several reasons why it is not an empty notion, as we shall see.

Firstly, in physical situations, independence is related to a basic intuition we have about whether two events are related or not. For example, suppose you are thinking of betting Rs.1000 on a particular horse in a race. If you get the news that your cousin is getting married, it will perhaps not affect the amount you plan to bet. However, if you get the news that one of the other horses has been injected with undetectable drugs, it might well affect the bet you want to place. In other words, certain events (like the marriage of a cousin) have no bearing on the probability of the event of interest (the event that our horse wins), while other events (like the injection of drugs) do have an impact. This intuition is often built into the very definition of the probability space that we use.

For example, in the above example of tossing a fair coin $n$ times, it is our intuition that a coin does not remember how it fell previous times, and that the chance of its falling heads in any toss is just $1/2$, irrespective of how many heads or tails occurred before. It may be better to attribute this to experience rather than intuition. There have been reasonable people in history who believed that if a coin shows heads in ten tosses in a row, then on the next toss it is more likely to show tails (to `compensate' for the overabundance of heads)! Clearly this is also someone's intuition, just different from ours. Only experiment can decide which is correct, and any number of experiments with real coins show that our intuition is right: coins have no memory. This intuition was used in defining the elementary probabilities as $2^{-n}$ each. Since we started with the intuitive notion of independence and put it into the definition of the probability space, it is only to be expected that the event that the first toss is a head is independent of the event that the second toss is a tail. That is the calculation shown above.
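One can play the role of the experimenter here with a simulated coin. The following sketch (a Monte Carlo experiment, not from the text) looks at every occurrence of three heads in a row in a long sequence of simulated fair tosses and asks how often the very next toss is a head; if the `compensation' belief were correct, the fraction would be visibly below $1/2$:

```python
import random

random.seed(0)  # fixed seed so the experiment is reproducible
tosses = [random.randint(0, 1) for _ in range(10**6)]  # 1 = head

# For each run of three consecutive heads, record the following toss.
after_run = [tosses[i + 3] for i in range(len(tosses) - 3)
             if tosses[i] == tosses[i + 1] == tosses[i + 2] == 1]

frac = sum(after_run) / len(after_run)
print(frac)  # close to 0.5: the coin does not "compensate"
```

The simulated coin, of course, has the no-memory property built in, just as the definition $p_{\underline{\omega}}=2^{-n}$ builds it into the probability space; the point of the experiment is that real coins behave the same way.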

But how is independence useful mathematically if the conditions to check independence are the very conclusions we want?! The answer lies in the following fact (to be explained later): when certain events are independent, many other collections of events made out of them also turn out to be independent. For example, if $A,B,C,D$ are independent (we have not yet defined what this means!), then $A\cup B$ and $C\cup D$ are also independent. Thus, starting from the independence of certain events, we get the independence of many other events. For example, any event depending on the first four tosses is independent of any event depending on the next five tosses.
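The last claim can be tested by enumeration for one particular pair of events. The sketch below (the events `E` and `F` are illustrative choices, not from the text) takes $n=9$ tosses, an event depending only on tosses 1–4, and an event depending only on tosses 5–9, and checks the product rule exactly:

```python
from itertools import product

# All 2**9 equally likely outcomes of 9 fair tosses (1 = head).
outcomes = list(product([0, 1], repeat=9))

def prob(event):
    """Probability of an event under uniform elementary probabilities."""
    return sum(1 for w in outcomes if event(w)) / len(outcomes)

E = lambda w: 1 in w[:4]           # at least one head among tosses 1-4
F = lambda w: sum(w[4:9]) == 2     # exactly two heads among tosses 5-9
EF = lambda w: E(w) and F(w)

print(prob(EF) == prob(E) * prob(F))  # True
```

Here $\mathbf{P}(E)=15/16$ and $\mathbf{P}(F)=10/32$, and the enumeration confirms $\mathbf{P}(E\cap F)=\mathbf{P}(E)\mathbf{P}(F)$; any other choice of events depending on the two disjoint blocks of tosses would pass the same check.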

Chapter 10. Conditional probability and independence