
Intersection and Union of Events

6 Intersection of events

We often want to calculate the joint probability of events; how we do so depends on whether or not the events are independent:

Definition 8: Two events A and B are independent provided that their joint probability is the product of their marginal probabilities:

P(A \cap B) = P(A)P(B)

We denote that A and B are independent as follows: A \perp\!\!\!\perp B

Another way to see this is to rearrange the terms in the formula for conditional probability. Since the probability of event A is not affected by event B, we have

P(A|B) = P(A)

P(A|B)P(B) = P(A \cap B)

P(A|B)P(B) = P(A)P(B) = P(A \cap B)

Note: The probability of the intersection of mutually independent events is the product of their individual probabilities.

Example 1: We toss a fair coin three times. Consider the following events:

  • A_1: Event of obtaining Tails in the first toss.
  • A_2: Event of obtaining Tails in the second toss.
  • A_3: Event of obtaining Heads in the third toss.
  1. What is the probability of A_1, A_2, and A_3 happening?

P(A_1 \cap A_2 \cap A_3) = P(A_1) \cdot P(A_2) \cdot P(A_3)

Since it is a fair coin:

P(A_1 \cap A_2 \cap A_3) = \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{8}
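As a sanity check, this calculation can be reproduced by brute-force enumeration of the eight equally likely outcomes (a minimal Python sketch; the event definitions follow the example above):

```python
from itertools import product
from fractions import Fraction

# Enumerate all 2^3 equally likely outcomes of three fair coin tosses.
outcomes = list(product("HT", repeat=3))

# A1: Tails in the first toss, A2: Tails in the second, A3: Heads in the third.
favorable = [o for o in outcomes if o[0] == "T" and o[1] == "T" and o[2] == "H"]

# Probability = favorable outcomes / total outcomes.
p_joint = Fraction(len(favorable), len(outcomes))
print(p_joint)  # 1/8, matching (1/2)^3
```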

Checking independence

To check whether two events are independent, verify that their joint probability equals the product of their marginal probabilities.

For instance: P(\text{hot}, \text{sun}) = 0.4, P(\text{hot}) = 0.5, P(\text{sun}) = 0.5,

P(\text{hot}, \text{sun}) \neq P(\text{hot}) \times P(\text{sun}), since 0.4 \neq 0.25, so the events are not independent.

If the events are not mutually independent, how do we calculate their joint probability? From the conditional probability formula, we have:

P(A \cap B) = P(A)P(B|A)

We can extend it to three events, as follows:

P(A \cap B \cap C) = P(A \cap B)P(C|A \cap B) = P(A)P(B|A)P(C|A \cap B)

This formula is called the chain rule; it is very useful for estimating the joint probability of dependent events in experiments that involve a sequence of choices. We generalize the rule as follows:

Definition 9 (Chain rule): For any events A_1, A_2, ..., A_n:

P(A_1 \cap A_2 \cap ... \cap A_n) = P(A_1)P(A_2|A_1)P(A_3|A_1 \cap A_2) \cdots P(A_n|\bigcap_{i=1}^{n-1}A_i)

An alternative way to write it is:

P(A_1 \cap A_2 \cap ... \cap A_n) = \prod_{i=1}^n P(A_i|A_1 \cap ... \cap A_{i-1})
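To make the chain rule concrete, consider a hypothetical sequential experiment (not from the text above): drawing three cards from a standard 52-card deck without replacement, where each draw changes the conditional probabilities of the next. The probability that all three cards are aces multiplies the conditional probabilities along the sequence:

```python
from fractions import Fraction

# Chain rule: P(A1 ∩ A2 ∩ A3) = P(A1) * P(A2|A1) * P(A3|A1 ∩ A2).
# A_i: the i-th card drawn is an ace; draws are without replacement,
# so the events are not independent.
p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
print(p)  # 1/5525
```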

Note:

The intersection of events is equivalent to the notion of conjunction. In other words, A_1 \cap A_2 means event A_1 happened and event A_2 happened. To express the disjunction between events, we will use the union, as defined in the next section.

7 Union of events

Events are sets. So, it follows that theorems or principles that apply to sets, such as the principle of inclusion-exclusion (PIE), which we used to calculate the size of unions of sets, also apply to events.

Here we are measuring the size of the union relative to the size of the sample space. For three events:

P(A_1 \cup A_2 \cup A_3) = P(A_1) + P(A_2) + P(A_3) - P(A_1 \cap A_2) - P(A_1 \cap A_3) - P(A_2 \cap A_3) + P(A_1 \cap A_2 \cap A_3)

More generally:

P(\bigcup_{i=1}^n A_i) = \sum_{i=1}^n P(A_i) - \sum_{i<j}P(A_i \cap A_j) + \sum_{i<j<k}P(A_i \cap A_j \cap A_k) - ... + (-1)^{n-1}P(\bigcap_{i=1}^n A_i)
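Inclusion-exclusion is easy to verify by direct enumeration. A small sketch with a hypothetical sample space (one roll of a fair six-sided die; the three events below are our own illustration, not from the text):

```python
from fractions import Fraction

# Sample space: one roll of a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
A1 = {2, 4, 6}  # even
A2 = {4, 5, 6}  # greater than 3
A3 = {3, 6}     # divisible by 3

def P(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(omega))

# Left-hand side: probability of the union, computed directly.
lhs = P(A1 | A2 | A3)

# Right-hand side: inclusion-exclusion for three events.
rhs = (P(A1) + P(A2) + P(A3)
       - P(A1 & A2) - P(A1 & A3) - P(A2 & A3)
       + P(A1 & A2 & A3))

print(lhs == rhs)  # True
```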