Random Experiments and Sample Space
Random Experiments: An experiment is called random if it has two or more possible outcomes and it is not possible to predict the outcome in advance.
Sample Space: The collection of all possible outcomes of a random experiment is called sample space.
Sample Space of throwing a die
Sample Space (S) = {1, 2, 3, 4, 5, 6}
Cardinal Number n(S) = 6
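As a quick illustration (a Python sketch, not part of the standard notes), sample spaces can be enumerated directly; the two-coin experiment below is an added example:

```python
from itertools import product

# Sample space of throwing a die
S_die = {1, 2, 3, 4, 5, 6}
print(len(S_die))  # n(S) = 6

# Sample space of tossing a coin twice, as ordered pairs (added example)
S_coins = set(product(["H", "T"], repeat=2))
print(S_coins)       # {('H','H'), ('H','T'), ('T','H'), ('T','T')} (order may vary)
print(len(S_coins))  # n(S) = 4
```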
Events
A subset of the sample space associated with a random experiment is called an event.
Types of events
- Compound event
- Sure event
- Impossible event
- Equally likely outcomes
Compound Event: A compound event is an event that consists of two or more simple events, also known as elementary events. These events are considered together as a single event.
For example, if you are rolling a six-sided die and you want to know the probability of getting either a 4 or a 5, the event “rolling a 4 or a 5” is a compound event.
Sure Event (Certain Event): A sure event, also known as a certain event, is an event that is guaranteed to happen. Its probability is 1, which means it will occur with certainty.
For example, if you toss a fair coin, the event “getting either heads or tails” is a sure event because one of these outcomes is guaranteed.
Impossible Event: An impossible event is an event that cannot happen. Its probability is 0, which means it will not occur under any circumstances.
For example, if you roll a six-sided die and want to know the probability of getting a 7, this is an impossible event because the die only has numbers from 1 to 6.
Equally Likely Outcomes: Equally likely outcomes refer to a situation where all possible outcomes of an event have the same probability of occurring.
For instance, when rolling a fair six-sided die, each of the six faces (1, 2, 3, 4, 5, and 6) has an equal probability of 1/6 of being rolled, assuming the die is fair. In this case, the outcomes are equally likely.
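A minimal Python sketch of the compound-event example above, treating events as subsets of the sample space (an illustration, assuming a fair die):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}           # sample space of a fair die
E = {4, 5}                       # compound event "rolling a 4 or a 5"
print(Fraction(len(E), len(S)))  # 1/3
```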
Algebra of events
- Complement of an event: The complement of an event E is the set of all outcomes of the sample space not belonging to E. It is denoted by E′ or Eᶜ.
- The event A or B (A ∪ B): The event A or B (A ∪ B) represents the union of events A and B, including all outcomes that are in either A, B, or both.
- The event A and B (A ∩ B): Event A and B (A ∩ B) represents the intersection of events A and B, consisting of outcomes that are common to both events.
- The event A but not B (A – B): “A but not B” consists of the outcomes that are in event A but not in event B. Equivalently, A – B = A ∩ B′.
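Because events are sets, the four operations above map directly onto Python’s set operators; a small sketch (fair-die events chosen for illustration):

```python
S = {1, 2, 3, 4, 5, 6}   # sample space of a fair die
A = {2, 4, 6}            # event "even number"
B = {4, 5, 6}            # event "number greater than 3"

print(A | B)   # A or B  (union)        -> {2, 4, 5, 6}
print(A & B)   # A and B (intersection) -> {4, 6}
print(A - B)   # A but not B            -> {2}, which equals A & (S - B)
print(S - A)   # complement A'          -> {1, 3, 5}
```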
Mutually exclusive events
Two or more events associated with a random experiment are called mutually exclusive if the occurrence of any one of them excludes the occurrence of the others,
i.e. A ∩ B = Ø
Exhaustive events
The events A and B associated with a random experiment with sample space S are called exhaustive events if (A ∪ B) = S
Mutually exclusive and exhaustive events
Let S be the sample space associated with a random experiment. Events A and B of S are called mutually exclusive and exhaustive if
- (A ∩ B) = Ø
- (A ∪ B) = S
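These two conditions are easy to check mechanically; a small sketch (the helper name is ours, not standard):

```python
def mutually_exclusive_and_exhaustive(S, A, B):
    """True when A ∩ B = Ø and A ∪ B = S, for events given as sets."""
    return A.isdisjoint(B) and (A | B) == S

S = {1, 2, 3, 4, 5, 6}
print(mutually_exclusive_and_exhaustive(S, {1, 3, 5}, {2, 4, 6}))  # True
print(mutually_exclusive_and_exhaustive(S, {1, 2, 3}, {3, 4, 5}))  # False
```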
Probability
Probability is a numerical measure of the uncertainty of an event. For a random experiment whose outcomes are equally likely,
P(A) = \frac{\text{Number of outcomes favourable to } A}{\text{Total number of possible outcomes}}
Note:
- 0 ≤ P(A) ≤ 1
- P(A) + P(A′) = 1, i.e. P(A) + P(not A) = 1
- P(Sure event) = 1
- P(Impossible event) = 0
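The formula and the notes above can be verified for equally likely outcomes with a small helper (an illustrative sketch; the function P is our own shorthand):

```python
from fractions import Fraction

def P(E, S):
    """Classical probability: favourable outcomes over total outcomes."""
    return Fraction(len(E & S), len(S))

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}                 # event "even number"
print(P(A, S))                # 1/2
print(P(A, S) + P(S - A, S))  # 1, i.e. P(A) + P(A') = 1
print(P(S, S), P(set(), S))   # 1 0: sure and impossible events
```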
Odds in favour and odds against
Odds in favour = \frac{\text{Number of favourable outcomes}}{\text{Number of unfavourable outcomes}}
Odds against = \frac{\text{Number of unfavourable outcomes}}{\text{Number of favourable outcomes}}
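For instance, the odds for the event “rolling a 4 or a 5” on a fair die (a worked example added here for illustration):

```python
from fractions import Fraction

favourable, total = 2, 6          # "rolling a 4 or a 5" on a fair die
unfavourable = total - favourable

print(Fraction(favourable, unfavourable))  # odds in favour = 2/4 = 1/2
print(Fraction(unfavourable, favourable))  # odds against   = 4/2 = 2
```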
Laws of Probability
Theorem 1
If A and B are events associated with a random experiment having Sample Space S and if A ⊂ B, then
- P (A) ≤ P(B)
- P(B – A) = P(B) – P(A)
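Both parts can be confirmed by enumeration for a concrete pair A ⊂ B (an illustrative check, not a proof):

```python
from fractions import Fraction

def P(E, S):
    return Fraction(len(E), len(S))

S = {1, 2, 3, 4, 5, 6}
A = {6}              # A is a subset of B
B = {2, 4, 6}

assert P(A, S) <= P(B, S)                # P(A) ≤ P(B)
assert P(B - A, S) == P(B, S) - P(A, S)  # P(B – A) = P(B) – P(A)
```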
Theorem 2
If A and B are events associated with a random experiment having Sample Space S, then
- P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
- P(A ∪ B) = P(A) + P(B) [for mutually exclusive events]
- P(A ∪ B) = P(A) + P(B) = 1 [for mutually exclusive and exhaustive events]
- P(A ∪ B ∪ C) = P(A) + P(B) + P(C) – P(A ∩ B) – P(B ∩ C) – P(A ∩ C) + P(A ∩ B ∩ C)
- P(A ∪ B ∪ C) = P(A) + P(B) + P(C) [for mutually exclusive events]
- P(A ∪ B ∪ C) = P(A) + P(B) + P(C) = 1 [for mutually exclusive and exhaustive events]
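The three-event inclusion–exclusion identity can likewise be checked by enumeration (events chosen arbitrarily for illustration):

```python
from fractions import Fraction

def P(E, S):
    return Fraction(len(E), len(S))

S = set(range(1, 7))
A, B, C = {1, 2, 3}, {2, 4, 6}, {3, 6}

lhs = P(A | B | C, S)
rhs = (P(A, S) + P(B, S) + P(C, S)
       - P(A & B, S) - P(B & C, S) - P(A & C, S)
       + P(A & B & C, S))
assert lhs == rhs   # both sides equal 5/6 here
```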
Theorem 3
If A and B are events associated with a random experiment having Sample Space S, then
- P(A ∪ B) = P(A – B) + P(B – A) + P(A ∩ B)
- P(A ∪ B) = P(A) + P(B – A) = P(B) + P(A – B)
- P(A) = P(A – B) + P(A ∩ B)
- P(B) = P(B – A) + P(A ∩ B)
- P(A – B) = P(A ∩ B′)
- P(B – A) = P(A′ ∩ B)
- P(A) + P(B) = P(A – B) + P(B – A) + 2P(A ∩ B)
Conditional Probability
If A and B are two events associated with the same sample space of a random experiment, the conditional probability of the event A given that B has occurred, written P(A|B), is given by
P(A|B) = \frac{P(A ∩ B)}{P(B)}, provided P(B) ≠ 0
Similarly, the conditional probability of the event B given that A has occurred, written P(B|A), is given by
P(B|A) = \frac{P(A ∩ B)}{P(A)}, provided P(A) ≠ 0
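Under equally likely outcomes the ratio of probabilities reduces to a ratio of counts, which makes a direct check easy (events picked for illustration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # "even number"
B = {4, 5, 6}   # "greater than 3"

# P(A|B) = P(A ∩ B) / P(B) = |A ∩ B| / |B| for equally likely outcomes
print(Fraction(len(A & B), len(B)))  # P(A|B) = 2/3
print(Fraction(len(A & B), len(A)))  # P(B|A) = 2/3
```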
Properties of conditional probability
Let A and B be events of a sample space S of an experiment, then we have
- P(S|B) = P(B|B) = 1
- P(A′|B) = 1 − P(A|B)
Multiplication Theorem on Probability
- P(A ∩ B) = P(A) P(B|A)
- P(A ∩ B) = P(B) P(A|B)
provided P(A) ≠ 0 and P(B) ≠ 0
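A classic worked example of the multiplication theorem (added here for illustration): drawing two cards from a standard deck without replacement.

```python
from fractions import Fraction

# P(both cards are aces)
#   = P(first is an ace) * P(second is an ace | first was an ace)
p_both_aces = Fraction(4, 52) * Fraction(3, 51)
print(p_both_aces)  # 1/221
```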
Independent Events
Two events A and B are said to be independent if
- P(B|A) = P(B), provided P(A) ≠ 0
- P(A|B) = P(A), provided P(B) ≠ 0
Equivalently, two events A and B associated with the same random experiment are independent if and only if
P(A ∩ B) = P(A) · P(B)
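For example, the two tosses of a fair coin are independent, which the product rule confirms (an illustrative sketch):

```python
from fractions import Fraction
from itertools import product

S = set(product("HT", repeat=2))   # toss a fair coin twice
A = {s for s in S if s[0] == "H"}  # first toss is heads
B = {s for s in S if s[1] == "H"}  # second toss is heads

def P(E):
    return Fraction(len(E), len(S))

print(P(A & B) == P(A) * P(B))  # True: A and B are independent
```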
Remark:
Three events A, B and C are said to be mutually independent, if
- P(A ∩ B) = P(A) P(B)
- P(A ∩ C) = P(A) P(C)
- P (B ∩ C) = P(B) P(C)
- P(A ∩ B ∩ C) = P(A) P (B) P(C)
Theorem of total probability
Let {E1, E2, …, En} be a partition of the sample space S, and suppose that each of the events E1, E2, …, En has a nonzero probability of occurrence. Let A be any event associated with S, then
P(A) = P(E1) P(A|E1) + P(E2) P(A|E2) + … + P(En) P(A|En)
or, P(A) = \sum_{i=1}^{n} P(E_i)\, P(A|E_i)
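A standard two-bag illustration of the theorem (the numbers below are a made-up example, not from the notes): bag 1 holds 3 red balls out of 5, bag 2 holds 2 red out of 6, and a bag is chosen at random.

```python
from fractions import Fraction

P_E = [Fraction(1, 2), Fraction(1, 2)]          # P(E1), P(E2): bag chosen at random
P_A_given_E = [Fraction(3, 5), Fraction(2, 6)]  # P(A|Ei) for A = "a red ball is drawn"

P_A = sum(p * q for p, q in zip(P_E, P_A_given_E))
print(P_A)  # 7/15
```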
Bayes’ Theorem
If E1, E2, …, En are n non-empty events which constitute a partition of the sample space S, i.e. E1, E2, …, En are pairwise disjoint and E1 ∪ E2 ∪ … ∪ En = S, and A is any event of nonzero probability, then
P(E_i|A) = \frac{P(E_i)\, P(A|E_i)}{\sum_{j=1}^{n} P(E_j)\, P(A|E_j)}, for i = 1, 2, …, n
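Continuing the same made-up two-bag example: given that a red ball was drawn, Bayes’ theorem gives the probability that it came from bag 1.

```python
from fractions import Fraction

P_E = [Fraction(1, 2), Fraction(1, 2)]          # P(E1), P(E2)
P_A_given_E = [Fraction(3, 5), Fraction(1, 3)]  # P(A|E1), P(A|E2)

P_A = sum(p * q for p, q in zip(P_E, P_A_given_E))  # total probability = 7/15
print(P_E[0] * P_A_given_E[0] / P_A)                # P(E1|A) = 9/14
```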