LECTURE 5
Probability
Learning Objectives
In this lecture, you learn:
Probability concepts
Conditional probability
Bayes’ Theorem
Counting rules
Random Experiments
A random experiment is an observational process
whose outcomes cannot be known in advance.
The set of all outcomes is the sample space for
the experiment.
A sample space with a countable number of
outcomes is discrete.
Example:
Flip a coin: the sample space consists of 2 outcomes, S = {H, T}
Roll a die: the sample space consists of 6 outcomes, S = {1, 2, 3, 4, 5, 6}
Events
An event is any subset of outcomes in the
sample space.
A simple event or elementary event is a single
outcome.
A discrete sample space S consists of all the
simple events (Ei): S = {E1, E2, …, En}
Examples
Flip a coin:
The sample space consists
of 2 elementary events: S = {H, T}
An event of getting one Head: E = {H}
Flip a coin twice:
The sample space consists of 4 elementary
events: S = {HH, HT, TH, TT}
An event of getting one Head is a compound
event: E = {HT, TH}
Example
Roll a die: the sample space consists of 6 elementary events:
S = {1, 2, 3, 4, 5, 6}
Roll 2 dice: the sample space consists of 36 elementary events:
S = {(1,1), (1,2), …, (6,5), (6,6)}
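A quick illustration, not part of the original slides: the two-dice sample space can be enumerated in Python to confirm the count of 36 elementary events.

```python
from itertools import product

# Enumerate the sample space for rolling two dice.
faces = range(1, 7)
sample_space = list(product(faces, repeat=2))  # (1, 1), (1, 2), ..., (6, 6)

print(len(sample_space))  # 36 elementary events
```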
Probability
The probability of an event is a number that
measures the relative likelihood that the event will
occur.
The probability of event A, denoted P(A), must lie
within the interval from 0 to 1:
0 ≤ P(A) ≤ 1
If P(A) = 0, the event cannot occur.
If P(A) = 1, the event is certain to occur.
Assessing Probability
There are three approaches to assessing the probability of an uncertain event:
A priori (classical) probability, based on equally likely outcomes
Empirical probability, based on observed relative frequencies
Subjective probability, based on judgment or experience
Complement of an Event
The complement of an event A is denoted by A′
(or Ā) and consists of everything in the sample
space except event A.
A and A′ together comprise
the entire sample space:
P(A) + P(A′ ) = 1 or
P(A′ ) = 1 – P(A)
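For example (illustrative numbers, not from the slides): if P(A) = 0.30, then P(A′) = 1 – 0.30 = 0.70.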
Intersection of Two Events
The intersection of two events A and B
(denoted A ∩ B or “A and B”) is the event consisting of all outcomes in the sample space that are contained in both event A and event B.
Example: A = the day is a Wednesday, B = the day is in January; then A ∩ B = the day is a Wednesday in January.
The symbol ∩ may be read as “and” since both events occur. P(A ∩ B) is a joint probability.
Union of Two Events
The union of two events consists of all outcomes
in the sample space that are contained either in
event A or in event B or both (denoted A ∪ B or “A or B”).
Example: A = a day in January, B = a day in February; then A ∪ B = a day in January or February.
The symbol ∪ may be read as “or” since one or the other or both events may occur.
General Law of Addition
The general law of addition states that the
probability of the union of two events A and B is:
P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
When you add P(A) and P(B) together, you count P(A ∩ B) twice, so you must subtract P(A ∩ B) to avoid overstating the probability.
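A minimal sketch, not from the slides, using an illustrative pair of events on a single die roll (A = even, B = greater than 4) to check the addition law by direct counting:

```python
from fractions import Fraction

# One roll of a fair die: A = "even", B = "greater than 4" (illustrative events).
sample_space = set(range(1, 7))
A = {2, 4, 6}
B = {5, 6}

def p(event):
    """Probability of an event under equally likely outcomes."""
    return Fraction(len(event), len(sample_space))

# General law of addition: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
print(p(A | B))                 # 2/3, counted directly from the union
print(p(A) + p(B) - p(A & B))   # 2/3, via the addition law
```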
Mutually Exclusive Events
Events that cannot occur simultaneously
Example:
Event A = a day in January; Event B = a day in February
Events A and B are mutually exclusive (or disjoint) if their intersection is empty (∅).
If A ∩ B = ∅, then P(A ∩ B) = 0
Special Law of Addition
In the case of mutually exclusive
events, the addition law reduces to:
P(A ∪ B) = P(A) + P(B)
Collectively Exhaustive Events
One of the events must occur
The set of events covers the entire sample
space
Example: Randomly choose a day from 2010
A = Weekday; B = Weekend;
C = January; D = Spring;
Events A, B, C and D are collectively exhaustive
(but not mutually exclusive – a weekday can be in
January or in Spring)
Events A and B are collectively exhaustive and
also mutually exclusive
Conditional Probability
The probability of event A given that
event B has occurred.
Denoted P(A | B). The vertical line “ | ” is
read as “given.”
P(A | B) = P(A ∩ B) / P(B)   for P(B) > 0, and undefined otherwise
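For example (illustrative numbers, not from the slides): if P(A ∩ B) = 0.20 and P(B) = 0.50, then P(A | B) = 0.20 / 0.50 = 0.40.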
Independent Events
Two events are independent if and only if:
P(A | B) = P(A)
Events A and B are independent when the
probability of one event is not affected by the
fact that the other event has occurred
Multiplication Rules
Multiplication rule for two events A and B:
P(A ∩ B) = P(A | B) P(B)
Note: If A and B are independent, then P(A | B) = P(A)
and the multiplication rule simplifies to
P(A ∩ B) = P(A) P(B)
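A small check, assuming two independent fair-coin flips as an illustrative example (not from the slides):

```python
from itertools import product

# Two independent fair-coin flips.
flips = list(product("HT", repeat=2))                          # HH, HT, TH, TT

p_first_h = sum(f[0] == "H" for f in flips) / len(flips)       # P(first is H) = 0.5
p_both_h = sum(f == ("H", "H") for f in flips) / len(flips)    # P(both H) = 0.25

# Independence: P(A ∩ B) = P(A) P(B)
print(p_both_h, p_first_h * p_first_h)                         # 0.25 0.25
```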
Odds of an Event
The odds in favor of event A = (number of outcomes in A) / (number of outcomes not in A)
The odds against event A = (number of outcomes not in A) / (number of outcomes in A)
Relationship with Probability
The odds in favor of event A occurring are
Odds = P(A) / P(A′) = P(A) / (1 – P(A))
The odds against event A occurring are
Odds = P(A′) / P(A) = (1 – P(A)) / P(A)
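For example (illustrative value, not from the slides): if P(A) = 0.25, the odds in favor of A are 0.25 / 0.75 = 1/3 (or 1:3), and the odds against A are 0.75 / 0.25 = 3 (or 3:1).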
Contingency Table
Collect data of 100 cars:
Each car either has AC or no AC
Each car either has GPS or no GPS
GPS No GPS Total
AC 35 55 90
No AC 5 5 10
Total 40 60 100
Contingency Table
Of the 100 cars studied, 90% have air conditioning (AC)
and 40% have a GPS.
35% of the cars have both.
GPS No GPS Total
AC 0.35 0.55 0.90
No AC 0.05 0.05 0.10
Total 0.40 0.60 1.00
Conditional probability
P(GPS | AC) = P(GPS ∩ AC) / P(AC) = 0.35 / 0.90 ≈ 0.3889
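The same calculation can be sketched in Python using the joint probabilities from the table above; the dictionary layout is just one convenient representation, not a prescribed one:

```python
# Joint probabilities from the contingency table above.
joint = {
    ("AC", "GPS"): 0.35, ("AC", "No GPS"): 0.55,
    ("No AC", "GPS"): 0.05, ("No AC", "No GPS"): 0.05,
}

# Marginal probability of AC, then the conditional probability of GPS given AC.
p_ac = sum(p for (ac, _), p in joint.items() if ac == "AC")   # 0.90
p_gps_given_ac = joint[("AC", "GPS")] / p_ac                  # 0.35 / 0.90
print(round(p_gps_given_ac, 4))                               # 0.3889
```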
Decision Trees
Tree 1: split all cars on AC first (probabilities from the contingency table).
All Cars
  Has AC: P(AC) = 0.90
    Has GPS (0.35 / 0.90): P(AC ∩ GPS) = 0.35
    Does not have GPS (0.55 / 0.90): P(AC ∩ GPS′) = 0.55
  Does not have AC: P(AC′) = 0.10
    Has GPS (0.05 / 0.10): P(AC′ ∩ GPS) = 0.05
    Does not have GPS (0.05 / 0.10): P(AC′ ∩ GPS′) = 0.05
The ratios on the second set of branches are the conditional probabilities of GPS given AC (or AC′); the value at the end of each path is the joint probability.
Decision Trees
Tree 2: split all cars on GPS first.
All Cars
  Has GPS: P(GPS) = 0.40
    Has AC (0.35 / 0.40): P(GPS ∩ AC) = 0.35
    Does not have AC (0.05 / 0.40): P(GPS ∩ AC′) = 0.05
  Does not have GPS: P(GPS′) = 0.60
    Has AC (0.55 / 0.60): P(GPS′ ∩ AC) = 0.55
    Does not have AC (0.05 / 0.60): P(GPS′ ∩ AC′) = 0.05
Again, the ratios on the second set of branches are conditional probabilities and the end values are joint probabilities.
Bayes’ Theorem
Bayes’ Theorem is used to revise
previously calculated probabilities based
on new information.
Developed by Thomas Bayes in the 18th
Century.
It is an extension of conditional
probability.
Bayes’ Theorem
The prior (marginal) probability of an
event B is revised after event A has been
considered to yield a posterior
(conditional) probability.
Bayes’ formula is:
P(B | A) = P(A | B) P(B) / P(A)
Bayes’ Theorem
In situations where P(A) is not given, the
form of Bayes’ Theorem is:
P(B | A) = P(A | B) P(B) / [P(A | B) P(B) + P(A | B′) P(B′)]
General Forms of Bayes’ Theorem
P(Bi | A) = P(A | Bi) P(Bi) / [P(A | B1) P(B1) + P(A | B2) P(B2) + … + P(A | Bk) P(Bk)]
where:
Bi = the ith of k mutually exclusive and collectively exhaustive events
A = new event that might impact P(Bi)
Bayes’ Theorem Example
The entire output of a factory is produced on two machines, which account for 60% and 40% of the output, respectively.
The fraction of defective items is 5% for the 1st machine and 3% for the 2nd machine.
One product is randomly selected and found to be defective. What is the probability that it was produced by the 1st machine?
Bayes’ Theorem Example
(continued)
Denote:
A1: item was made by the 1st machine
A2: item was made by the 2nd machine
B: item was defective
Probability:
P(A1) = 0.6 , P(A2) = 0.4
P(B|A1) = 0.05 , P(B|A2) = 0.03
Goal is to find P(A1|B)
Bayes’ Theorem Example
(continued)
P(B) = P(B|A1)P(A1) + P(B|A2) P(A2)
= (0.05)(0.6) + (0.03)(0.4)
= 0.03 + 0.012 = 0.042
4.2% of the factory output is defective
Bayes’ Theorem Example
(continued)
Apply Bayes’ Theorem:
P(A1 | B) = P(B | A1) P(A1) / P(B) = (0.05)(0.6) / 0.042 ≈ 0.7143
So the revised probability that the item was made by the
1st machine, given that this item was defective, is 0.7143
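A minimal Python sketch of the same computation, following the general form of Bayes’ Theorem above; the function name bayes and the list layout are illustrative choices, not part of the lecture:

```python
def bayes(priors, likelihoods, i):
    """P(B_i | A) for mutually exclusive, collectively exhaustive events B_1..B_k."""
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(A), by total probability
    return priors[i] * likelihoods[i] / total

# Factory example: P(A1) = 0.6, P(A2) = 0.4; P(B|A1) = 0.05, P(B|A2) = 0.03.
priors = [0.6, 0.4]
likelihoods = [0.05, 0.03]
print(round(bayes(priors, likelihoods, 0), 4))  # 0.7143
```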
Counting Rules
Rules for counting the number of possible
outcomes
Counting Rule 1:
If any one of k different mutually exclusive and
collectively exhaustive events can occur on each of
n trials, the number of possible outcomes is equal to
kⁿ
Example: If you roll a fair die 3 times, then there are
6³ = 216 possible outcomes
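A one-line check in Python, not part of the slides:

```python
# Counting Rule 1: k mutually exclusive outcomes per trial, n trials -> k**n outcomes.
print(6 ** 3)  # 216 possible outcomes for three rolls of a die
```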
Counting Rules
(continued)
Counting Rule 2:
If there are k1 events on the first trial, k2 events on
the second trial, … and kn events on the nth trial, the
number of possible outcomes is
(k1)(k2)…(kn)
Example:
You want to go to a park, eat at a restaurant, and see a
movie. There are 3 parks, 4 restaurants, and 6 movie
choices. How many different possible combinations are
there?
Answer: (3)(4)(6) = 72 different possibilities
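A quick check in Python (math.prod requires Python 3.8 or newer); the numbers are those of the example above:

```python
import math

# Counting Rule 2: multiply the number of choices at each step.
print(math.prod([3, 4, 6]))  # 72 park/restaurant/movie combinations
```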
Counting Rules
(continued)
Counting Rule 3:
The number of ways that n items can be arranged in
order is
n! = (n)(n – 1)…(1)
Example:
You have five books to put on a bookshelf. How many
different ways can these books be placed on the shelf?
Answer: 5! = (5)(4)(3)(2)(1) = 120 different possibilities
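The same count in Python, for reference only:

```python
import math

# Counting Rule 3: n distinct items can be ordered in n! ways.
print(math.factorial(5))  # 120 arrangements of five books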
Counting Rules
(continued)
Counting Rule 4:
Permutations: The number of ways of arranging X
objects selected from n objects in order is
nPX = n! / (n – X)!
Example:
You have five books and are going to put three on a
bookshelf. How many different ways can the books be
ordered on the bookshelf?
Answer: nPX = n! / (n – X)! = 5! / (5 – 3)! = 120 / 2 = 60 different possibilities
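A quick check in Python (math.perm is available in Python 3.8 or newer):

```python
import math

# Counting Rule 4: permutations of X items chosen from n, where order matters.
print(math.perm(5, 3))  # 60, the same as 5! / (5 - 3)!
```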
Counting Rules
(continued)
Counting Rule 5:
Combinations: The number of ways of selecting X
objects from n objects, irrespective of order, is
nCX = n! / [X! (n – X)!]
Example:
You have five books and are going to randomly select three
to read. How many different combinations of books might
you select?
Answer: nCX = n! / [X! (n – X)!] = 5! / [3! (5 – 3)!] = 120 / [(6)(2)] = 10 different possibilities
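A quick check in Python (math.comb is available in Python 3.8 or newer):

```python
import math

# Counting Rule 5: combinations of X items chosen from n, order ignored.
print(math.comb(5, 3))  # 10, the same as 5! / (3! * (5 - 3)!)
```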
Chapter Summary
Discussed basic probability concepts
Sample spaces and events, simple probability, and joint
probability
Examined basic probability rules
General addition rule, addition rule for mutually exclusive events,
rule for collectively exhaustive events
Defined conditional probability
Statistical independence, marginal probability, decision trees,
and the multiplication rule
Discussed Bayes’ theorem
Discussed various counting rules
Homework
Ebook: Chapter 5
5.78
5.82
5.88
5.93
5.98