"OR" or Unions
Two events are mutually exclusive if they cannot occur at the same time. Another word that means mutually exclusive is disjoint.
If two events are disjoint, then the probability of them both occurring at the same time is 0.
Disjoint: P(A and B) = 0
If two events are mutually exclusive, then the probability of either occurring is the sum of the probabilities of each occurring.
Only valid when the events are mutually exclusive.
P(A or B) = P(A) + P(B)
Given: P(A) = 0.20, P(B) = 0.70, A and B are disjoint
I like to use what's called a joint probability distribution. (Since disjoint means nothing in common, joint is what they have in common, so the values that go in the inside portion of the table are the intersections, or "and"s, of each pair of events.) "Marginal" is another word for total; the totals are called marginal because they appear in the margins of the table.
         | B    | B'   | Marginal
A        | 0.00 | 0.20 | 0.20
A'       | 0.70 | 0.10 | 0.80
Marginal | 0.70 | 0.30 | 1.00
The values given in the problem are entered first: the intersection P(A and B) = 0.00 (because A and B are disjoint) and the marginals P(A) = 0.20 and P(B) = 0.70. The grand total is always 1.00. The rest of the values are obtained by addition and subtraction.
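As a minimal sketch of that fill-in process (the function and variable names below are my own, not part of the notes), the table can be completed from the three given probabilities by subtraction:

```python
# Sketch: complete a 2x2 joint probability table from P(A), P(B), and P(A and B).
def joint_table(p_a, p_b, p_a_and_b):
    table = {
        ("A", "B"): p_a_and_b,           # the intersection (given)
        ("A", "B'"): p_a - p_a_and_b,    # row total for A minus the intersection
        ("A'", "B"): p_b - p_a_and_b,    # column total for B minus the intersection
    }
    # Whatever is left over goes in the last cell; the grand total is always 1.00.
    table[("A'", "B'")] = 1.0 - sum(table.values())
    return table

# Disjoint example: P(A) = 0.20, P(B) = 0.70, P(A and B) = 0
print(joint_table(0.20, 0.70, 0.0))
# {('A', 'B'): 0.0, ('A', "B'"): 0.2, ("A'", 'B'): 0.7, ("A'", "B'"): 0.1}
# (values may show tiny floating-point error)
```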
For events that aren't mutually exclusive, there is some overlap. When P(A) and P(B) are added, the probability of the intersection (and) is added twice. To compensate for that double addition, the intersection needs to be subtracted.
Always valid.
P(A or B) = P(A) + P(B) - P(A and B)
Given: P(A) = 0.20, P(B) = 0.70, P(A and B) = 0.15
         | B    | B'   | Marginal
A        | 0.15 | 0.05 | 0.20
A'       | 0.55 | 0.25 | 0.80
Marginal | 0.70 | 0.30 | 1.00
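A minimal sketch of the general addition rule, using the numbers from this example (variable names are my own):

```python
# Sketch: general addition rule (inclusion-exclusion).
p_a, p_b, p_a_and_b = 0.20, 0.70, 0.15

p_a_or_b = p_a + p_b - p_a_and_b   # subtract the overlap that was added twice
print(p_a_or_b)                    # 0.75 (up to floating-point rounding)

# The same given values fill in the rest of the joint table:
print(p_a - p_a_and_b)             # P(A and B')  = 0.05
print(p_b - p_a_and_b)             # P(A' and B)  = 0.55
print(1.0 - p_a_or_b)              # P(A' and B') = 0.25
```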
Certain things can be determined from the joint probability distribution. Mutually exclusive events will have an intersection (joint probability) of zero. All inclusive events will have a zero in the cell diagonally opposite the intersection. All inclusive means that there is nothing outside of those two events: P(A or B) = 1.
         | B                                                 | B'                                           | Marginal
A        | A and B are Mutually Exclusive if this value is 0 | .                                            | .
A'       | .                                                 | A and B are All Inclusive if this value is 0 | .
Marginal | .                                                 | .                                            | 1.00
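As a minimal sketch, these two checks can be read off a joint table stored as a dictionary keyed by (A-event, B-event), as in the earlier sketch (function names are my own):

```python
# Sketch: reading properties off a 2x2 joint probability table.
def is_mutually_exclusive(table):
    return table[("A", "B")] == 0      # no overlap: P(A and B) = 0

def is_all_inclusive(table):
    return table[("A'", "B'")] == 0    # nothing outside A or B: P(A or B) = 1

disjoint = {("A", "B"): 0.00, ("A", "B'"): 0.20,
            ("A'", "B"): 0.70, ("A'", "B'"): 0.10}
print(is_mutually_exclusive(disjoint))   # True
print(is_all_inclusive(disjoint))        # False
```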
"AND" or Intersections
Two events are independent if the occurrence of one does not change the probability of the other occurring.
An example would be rolling a 2 on a die and flipping a head on a coin. Rolling the 2 does not affect the probability of flipping the head.
If events are independent, then the probability of them both occurring is the product of the probabilities of each occurring.
Only valid for independent events.
P(A and B) = P(A) * P(B)
Given: P(A) = 0.20, P(B) = 0.70, A and B are independent.
         | B    | B'   | Marginal
A        | 0.14 | 0.06 | 0.20
A'       | 0.56 | 0.24 | 0.80
Marginal | 0.70 | 0.30 | 1.00
The 0.14 comes from the multiplication rule: the probability of A and B is the probability of A times the probability of B, or 0.20 * 0.70 = 0.14.
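A minimal sketch of the multiplication rule for independent events, using the numbers above (variable names are my own):

```python
# Sketch: multiplication rule for independent events.
p_a, p_b = 0.20, 0.70

print(p_a * p_b)                 # P(A and B)   = 0.14, valid only because A and B are independent
print(p_a * (1 - p_b))           # P(A and B')  = 0.06
print((1 - p_a) * p_b)           # P(A' and B)  = 0.56
print((1 - p_a) * (1 - p_b))     # P(A' and B') = 0.24
# (values may show tiny floating-point error)
```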
If the occurrence of one event does affect the probability of the other occurring, then the events are dependent.
The probability of event B occurring given that event A has already occurred is read "the probability of B given A" and is written: P(B|A)
Always works.
P(A and B) = P(A) * P(B|A)
Given: P(A) = 0.20, P(B) = 0.70, P(B|A) = 0.40
A good way to think of P(B|A) is that 40% of A is B. Taking 40% of the 20% that is in event A gives 8%; thus the intersection is 0.08.
         | B    | B'   | Marginal
A        | 0.08 | 0.12 | 0.20
A'       | 0.62 | 0.18 | 0.80
Marginal | 0.70 | 0.30 | 1.00
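A minimal sketch of the general multiplication rule, using the numbers above (variable names are my own):

```python
# Sketch: general multiplication rule with a conditional probability.
p_a, p_b = 0.20, 0.70
p_b_given_a = 0.40

p_a_and_b = p_a * p_b_given_a      # 0.20 * 0.40 = 0.08
print(p_a_and_b)

# The remaining cells come from subtraction, as in the earlier tables:
print(p_a - p_a_and_b)             # P(A and B')  = 0.12
print(p_b - p_a_and_b)             # P(A' and B)  = 0.62
print(1 - p_a - p_b + p_a_and_b)   # P(A' and B') = 0.18
# (values may show tiny floating-point error)
```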
The following four statements are equivalent:
1. A and B are independent events.
2. P(A and B) = P(A) * P(B)
3. P(A|B) = P(A)
4. P(B|A) = P(B)
The last two hold because if two events are independent, the occurrence of one doesn't change the probability of the other occurring. This means that the probability of B occurring, whether or not A has happened, is simply the probability of B occurring.
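As a minimal sketch, the last two statements can be checked numerically on the earlier independent example, using the conditional probability obtained by rearranging the multiplication rule, P(B|A) = P(A and B) / P(A) (variable names are my own):

```python
# Sketch: checking the equivalent statements of independence
# on the independent example (P(A) = 0.20, P(B) = 0.70).
p_a, p_b = 0.20, 0.70
p_a_and_b = p_a * p_b                  # multiplication rule for independent events

p_b_given_a = p_a_and_b / p_a          # P(B|A) = P(A and B) / P(A)
p_a_given_b = p_a_and_b / p_b          # P(A|B) = P(A and B) / P(B)

print(abs(p_b_given_a - p_b) < 1e-9)   # True: P(B|A) = P(B)
print(abs(p_a_given_b - p_a) < 1e-9)   # True: P(A|B) = P(A)
```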
Continue with conditional probabilities.