Let's start this topic with the definition of dependent and independent events.
Dependence and independence of two events
Events $A$ and $B$ are independent if $P(A \mid B) = P(A)$. In other words, the occurrence of one does not affect the probability of the occurrence of the other.
If we unfold the conditional probability $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$ in this definition, we can rewrite it as $\frac{P(A \cap B)}{P(B)} = P(A)$ and get the equality $P(A \cap B) = P(A) \cdot P(B)$. This equality can be used as the definition of independent events.
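If you want to double-check this equality on a concrete finite sample space, here is a minimal Python sketch (the events are chosen purely for illustration) that enumerates all equally likely outcomes of two fair dice and compares $P(A \cap B)$ with $P(A) \cdot P(B)$:

```python
from fractions import Fraction
from itertools import product

# Sample space: all equally likely outcomes of rolling two fair 6-sided dice.
omega = list(product(range(1, 7), range(1, 7)))

def prob(event):
    """Probability of an event (a set of outcomes) under equal likelihood."""
    return Fraction(len(event), len(omega))

# Illustrative events: A = "the first die shows an even number",
#                      B = "the second die shows more than 4 points".
A = {w for w in omega if w[0] % 2 == 0}
B = {w for w in omega if w[1] > 4}

# Product form of independence: P(A ∩ B) == P(A) * P(B).
print(prob(A & B) == prob(A) * prob(B))      # True
# Equivalent conditional form: P(A | B) == P(A).
print(prob(A & B) / prob(B) == prob(A))      # True
```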
The following events are independent:
- rolling a 4 on the first roll of a 6-sided die and rolling a 1 on the second roll of the die;
- two dice were rolled: the number of points on the first one is greater than some fixed value, and the number of points on the second one is greater than another fixed value;
- the whole sample space and any event with zero probability.
We will see some interesting properties of independent events in the next sections.
Pairwise and mutual independence
Now we know what the phrase "events $A$ and $B$ are independent" means. But what does the phrase "events $A_1, A_2, \ldots, A_n$ are independent" mean? With many events, the phrase alone doesn't tell us which type of independence is meant.
Events $A_1, A_2, \ldots, A_n$ are pairwise independent if every pair of events is independent: $P(A_i \cap A_j) = P(A_i) \cdot P(A_j)$ for all $i \neq j$.
Events $A_1, A_2, \ldots, A_n$ are mutually independent if for every $k$ from $2$ to $n$ and for any subset of events $A_{i_1}, A_{i_2}, \ldots, A_{i_k}$ of size $k$ the following equality is true: $P(A_{i_1} \cap A_{i_2} \cap \ldots \cap A_{i_k}) = P(A_{i_1}) \cdot P(A_{i_2}) \cdot \ldots \cdot P(A_{i_k})$.
It's easy to understand that if events are mutually independent, then they are pairwise independent too (put $k = 2$), but the converse is not necessarily true.
Let's look at Bernstein's example, which shows the difference between these two definitions.
Suppose we have a regular tetrahedron which has one side colored red, the second side green, the third one blue, and the last one contains all three colors. The tetrahedron is dropped on the desk. There are three events: the lower side contains red ($R$), it contains blue ($B$), and it contains green ($G$).
It's clear that $P(R) = P(B) = P(G) = \frac{2}{4} = \frac{1}{2}$ (2 of the 4 sides are suitable for each color) and $P(R \cap B) = \frac{1}{4}$ (only the side with all three colors is suitable). So events $R$ and $B$ are independent because $P(R \cap B) = \frac{1}{4} = \frac{1}{2} \cdot \frac{1}{2} = P(R) \cdot P(B)$. For the other pairs, the reasoning is similar. So, the events are pairwise independent.
Let's check the definition of mutual independence: $P(R \cap B \cap G) = \frac{1}{4}$ (again, only the multicolored side is suitable), but $P(R) \cdot P(B) \cdot P(G) = \frac{1}{8}$. So these events are not mutually independent.
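If you like, you can verify both claims by enumerating the four equally likely sides, for example with a short Python sketch like this one (the color labels and the way the events are encoded are just one possible way to model the tetrahedron):

```python
from fractions import Fraction
from itertools import combinations

# The four equally likely lower sides of the tetrahedron, described by their colors.
sides = [{"red"}, {"green"}, {"blue"}, {"red", "green", "blue"}]

def prob(event):
    """Probability of an event given as a set of side indices."""
    return Fraction(len(event), len(sides))

# Event "the lower side contains the given color", as a set of suitable side indices.
events = {c: {i for i, s in enumerate(sides) if c in s} for c in ("red", "green", "blue")}

# Pairwise independence: every pair satisfies P(X ∩ Y) == P(X) * P(Y).
pairwise = all(
    prob(events[x] & events[y]) == prob(events[x]) * prob(events[y])
    for x, y in combinations(events, 2)
)
print(pairwise)  # True

# Mutual independence fails: P(R ∩ B ∩ G) = 1/4, but the product of probabilities is 1/8.
r, g, b = events["red"], events["blue"], events["green"]
print(prob(r & g & b), prob(r) * prob(g) * prob(b))  # 1/4 1/8
```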
Rules of sum, product, and complement
Rule of sum
The rule of sum is the following equality: $P(A \cup B) = P(A) + P(B) - P(A \cap B)$. We can understand this rule intuitively just by looking at the picture:
We need to subtract $P(A \cap B)$ because we added it twice in $P(A) + P(B)$.
But if the events $A$ and $B$ are mutually disjoint, which means $A \cap B = \emptyset$ and therefore $P(A \cap B) = 0$, the rule of sum looks simpler: $P(A \cup B) = P(A) + P(B)$.
Let's look at an example. A fair 6-sided die is rolled. What is the probability that the roll is an even number (event $A$) or a prime (event $B$) or both? It's clear that $P(A) = P(B) = \frac{3}{6} = \frac{1}{2}$ and $P(A \cap B) = \frac{1}{6}$ (only the roll of 2 points is both even and prime). So, the answer is $P(A \cup B) = \frac{1}{2} + \frac{1}{2} - \frac{1}{6} = \frac{5}{6}$.
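The same computation can be checked with a small Python sketch over the six equally likely outcomes (the sets below encode the events from this example):

```python
from fractions import Fraction

omega = set(range(1, 7))   # outcomes of a fair 6-sided die
A = {2, 4, 6}              # the roll is even
B = {2, 3, 5}              # the roll is prime

def prob(event):
    return Fraction(len(event), len(omega))

# Rule of sum: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
lhs = prob(A | B)
rhs = prob(A) + prob(B) - prob(A & B)
print(lhs, rhs)  # 5/6 5/6
```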
Summarizing, the rule of sum helps us calculate the probability that an event $A$, or an event $B$, or both will happen.
Rule of product
Actually, we saw this rule in the first paragraph, but let's write it one more time.
We know that $P(A \cap B) = P(A \mid B) \cdot P(B)$ in the general case. But if $A$ and $B$ are independent, then $P(A \cap B) = P(A) \cdot P(B)$. The last equality is called the rule of product. Using this rule, we can calculate the probability that both event $A$ and event $B$ will happen.
If you have read other topics related to events and probabilities, you have already seen examples with rolling dice. Before, you had to count all the pairs of outcomes, but now it's not necessary.
For example, what is the probability that each die will have a number of points greater than 3? For each die, this probability is $\frac{3}{6} = \frac{1}{2}$, so the answer is $\frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$.
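Here is a short Python sketch that computes this answer with the rule of product and, as a sanity check, also counts all the pairs directly (using the same threshold of 3 points as in the example above):

```python
from fractions import Fraction
from itertools import product

die = range(1, 7)

# Probability that a single fair die shows more than 3 points.
p_single = Fraction(len([x for x in die if x > 3]), 6)

# Rule of product: the two dice are independent, so the probabilities multiply.
by_product = p_single * p_single

# Brute-force check by counting all 36 pairs, which the rule lets us avoid.
pairs = [(a, b) for a, b in product(die, die) if a > 3 and b > 3]
by_counting = Fraction(len(pairs), 36)

print(by_product, by_counting)  # 1/4 1/4
```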
Notice once again: we can use this rule only when $A$ and $B$ are independent!
Complement rule
Suppose that we have some event $A$. We can consider the set of all outcomes not contained in $A$. This set is called the complement of $A$ and is denoted by $\bar{A}$. From the definition, it immediately follows that $\bar{A}$ doesn't intersect with $A$ and $A \cup \bar{A} = \Omega$. We can use the rule of sum to get $P(A) + P(\bar{A}) = P(A \cup \bar{A}) = P(\Omega) = 1$.
Rearranging this equality gives the complement rule: $P(\bar{A}) = 1 - P(A)$.
By this rule, we can calculate the probability that event $A$ won't happen. For example, if it's known that the probability of rain today is $0.3$, then the probability that it won't rain is $1 - 0.3 = 0.7$.
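A tiny Python sketch illustrating the complement rule, both with exact fractions on a die and with the rain example (the numeric value of the rain probability is just an illustration):

```python
from fractions import Fraction

omega = set(range(1, 7))     # outcomes of a fair 6-sided die
A = {2, 4, 6}                # some event, here "the roll is even"
A_complement = omega - A     # all outcomes not contained in A

def prob(event):
    return Fraction(len(event), len(omega))

print(prob(A) + prob(A_complement))          # 1
print(prob(A_complement), 1 - prob(A))       # 1/2 1/2

# The rain example: if the probability of rain is 0.3,
# the probability of no rain is 1 - 0.3.
print(1 - 0.3)  # 0.7
```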