Basic Probability
By
Dr. Abu Hamja
Objectives
The objectives for this chapter are:
◼ To understand basic probability concepts.
◼ To understand conditional probability
◼ To be able to use Bayes’ Theorem
◼ To learn various counting rules
Basic Probability Concepts
◼ Probability – the chance that an uncertain event
will occur (always between 0 and 1)
◼ Impossible Event – an event that has no
chance of occurring (probability = 0)
◼ Certain Event – an event that is sure to occur
(probability = 1)
Assessing Probability
There are three approaches to assessing
the probability of an uncertain event:
1. a priori -- based on prior knowledge of the process, assuming all outcomes are equally likely:

   probability of occurrence = X / T
                             = (number of ways in which the event occurs) / (total number of possible outcomes)

2. empirical probability:

   probability of occurrence = (number of ways in which the event occurs) / (total number of possible outcomes)
3. subjective probability
based on a combination of an individual’s past experience,
personal opinion, and analysis of a particular situation
Example of a priori probability
When randomly selecting a day from the year 2015
what is the probability the day is in January?
Probability of Day In January = X / T
                              = (number of days in January) / (total number of days in 2015)
                              = 31 / 365
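A quick way to check this a priori calculation is to reproduce it in a few lines of Python (a minimal sketch; Fraction is used only to keep the exact ratio):

```python
from fractions import Fraction

# a priori probability: X ways the event can occur out of T equally likely outcomes
days_in_january = 31   # X
days_in_2015 = 365     # T

p_january = Fraction(days_in_january, days_in_2015)
print(p_january, float(p_january))   # 31/365 ≈ 0.0849
```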
Example of empirical probability
Find the probability of selecting a male taking statistics
from the population described in the following table:
          Taking Stats   Not Taking Stats   Total
Male            84              145           229
Female          76              134           210
Total          160              279           439

Probability of male taking stats = (number of males taking stats) / (total number of people)
                                 = 84 / 439 = 0.191
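The same empirical calculation can be reproduced directly from the table counts; below is a minimal Python sketch (the variable names are illustrative, not from the source):

```python
# Contingency table counts from the slide
counts = {
    ("Male", "Taking Stats"): 84,
    ("Male", "Not Taking Stats"): 145,
    ("Female", "Taking Stats"): 76,
    ("Female", "Not Taking Stats"): 134,
}

total_people = sum(counts.values())                    # 439
males_taking_stats = counts[("Male", "Taking Stats")]  # 84

# empirical probability = number of ways the event occurs / total number of outcomes
p_male_taking_stats = males_taking_stats / total_people
print(round(p_male_taking_stats, 3))   # 0.191
```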
Subjective probability
◼ Subjective probability may differ from person to person
◼ A media development team assigns a 60%
probability of success to its new ad campaign.
◼ The chief media officer of the company is less
optimistic and assigns a 40% probability of success to
the same campaign
◼ The assignment of a subjective probability is based on a
person’s experiences, opinions, and analysis of a
particular situation
◼ Subjective probability is useful in situations when an
empirical or a priori probability cannot be computed
Events
Each possible outcome of a variable is an event.
◼ Simple event
◼ An event described by a single characteristic
◼ e.g., A day in January from all days in 2015
◼ Joint event
◼ An event described by two or more characteristics
◼ e.g. A day in January that is also a Wednesday from all days in
2015
◼ Complement of an event A (denoted A’)
◼ All events that are not part of event A
◼ e.g., All days from 2015 that are not in January
Sample Space
The Sample Space is the collection of all
possible events
e.g. All 6 faces of a die
e.g. All 52 cards of a bridge deck
Organizing & Visualizing Events
◼ Venn Diagram For All Days In 2015
[Venn diagram: the sample space is all days in 2015; one circle contains the January days, another contains the Wednesdays, and their overlap contains the days that are in January and are also Wednesdays.]
Organizing & Visualizing Events
(continued)
◼ Contingency Tables -- For All Days in 2015
◼ Decision Trees -- For All Days in 2015
[Decision tree: the sample space of all days in 2015 branches by January / Not January and then Wednesday / Not Wednesday into 4, 27, 48, and 286 outcomes; total number of sample space outcomes = 365.]
Definition: Simple Probability
◼ Simple Probability refers to the probability of a
simple event.
◼ ex. P(Jan.)
◼ ex. P(Wed.)
           Jan.   Not Jan.   Total
Wed.         4       48        52
Not Wed.    27      286       313
Total       31      334       365

P(Jan.) = 31 / 365
P(Wed.) = 52 / 365
Definition: Joint Probability
◼ Joint Probability refers to the probability of an
occurrence of two or more events (joint event).
◼ ex. P(Jan. and Wed.)
◼ ex. P(Not Jan. and Not Wed.)
           Jan.   Not Jan.   Total
Wed.         4       48        52
Not Wed.    27      286       313
Total       31      334       365

P(Jan. and Wed.) = 4 / 365
P(Not Jan. and Not Wed.) = 286 / 365
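Both the simple and the joint probabilities on these last two slides come straight from the same table of counts; here is a short Python sketch (the dictionary layout is an assumption for illustration):

```python
# Counts of days in 2015, classified by Wednesday/Not Wednesday and January/Not January
table = {
    ("Wed", "Jan"): 4,
    ("Wed", "Not Jan"): 48,
    ("Not Wed", "Jan"): 27,
    ("Not Wed", "Not Jan"): 286,
}
total_days = sum(table.values())   # 365

# Simple (marginal) probabilities
p_wed = (table[("Wed", "Jan")] + table[("Wed", "Not Jan")]) / total_days   # 52/365
p_jan = (table[("Wed", "Jan")] + table[("Not Wed", "Jan")]) / total_days   # 31/365

# Joint probabilities
p_jan_and_wed = table[("Wed", "Jan")] / total_days                         # 4/365
p_not_jan_and_not_wed = table[("Not Wed", "Not Jan")] / total_days         # 286/365

print(p_wed, p_jan, p_jan_and_wed, p_not_jan_and_not_wed)
```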
Mutually Exclusive Events
◼ Mutually exclusive events
◼ Events that cannot occur simultaneously
Example: Randomly choosing a day from 2015
A = day in January; B = day in February
◼ Events A and B are mutually exclusive
Collectively Exhaustive Events
◼ Collectively exhaustive events
◼ One of the events must occur
◼ The set of events covers the entire sample space
Example: Randomly choose a day from 2015
A = Weekday; B = Weekend;
C = January; D = Spring;
◼ Events A, B, C and D are collectively exhaustive (but
not mutually exclusive – a weekday can be in
January or in Spring)
◼ Events A and B are collectively exhaustive and also
mutually exclusive
Computing Joint and
Marginal Probabilities
◼ The probability of a joint event, A and B:
P(A and B) = (number of outcomes satisfying A and B) / (total number of elementary outcomes)
◼ Computing a marginal (or simple) probability:
P(A) = P(A and B1) + P(A and B2) + … + P(A and Bk)
◼ Where B1, B2, …, Bk are k mutually exclusive and collectively
exhaustive events
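A short sketch of this marginal-probability rule, reusing the 2015 days example where B1 = January and B2 = Not January are mutually exclusive and collectively exhaustive:

```python
from fractions import Fraction

total_days = 365

# Joint probabilities of "Wednesday" with each B_i
p_wed_and_jan = Fraction(4, total_days)
p_wed_and_not_jan = Fraction(48, total_days)

# P(Wed.) = P(Wed. and Jan.) + P(Wed. and Not Jan.)
p_wed = p_wed_and_jan + p_wed_and_not_jan
print(p_wed)   # 52/365, matching the row total in the contingency table
```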
Joint Probability Example
           Jan.   Not Jan.   Total
Wed.         4       48        52
Not Wed.    27      286       313
Total       31      334       365

P(Jan. and Wed.) = 4 / 365
Marginal Probability Example
           Jan.   Not Jan.   Total
Wed.         4       48        52
Not Wed.    27      286       313
Total       31      334       365

P(Wed.) = P(Jan. and Wed.) + P(Not Jan. and Wed.) = 4/365 + 48/365 = 52/365
Marginal & Joint Probabilities In A
Contingency Table
Event       B1              B2              Total
A1          P(A1 and B1)    P(A1 and B2)    P(A1)
A2          P(A2 and B1)    P(A2 and B2)    P(A2)
Total       P(B1)           P(B2)           1

Joint probabilities appear in the body of the table; marginal (simple) probabilities appear in the row and column totals.
Probability Summary So Far
◼ Probability is the numerical measure of the likelihood that an event will occur
◼ The probability of any event must be between 0 and 1, inclusively:
0 ≤ P(A) ≤ 1   For any event A
◼ The sum of the probabilities of all mutually exclusive and collectively exhaustive events is 1:
P(A) + P(B) + P(C) = 1   If A, B, and C are mutually exclusive and collectively exhaustive
[Probability scale: 0 = impossible, 1 = certain]
General Addition Rule
General Addition Rule:
P(A or B) = P(A) + P(B) - P(A and B)
If A and B are mutually exclusive, then
P(A and B) = 0, so the rule can be simplified:
P(A or B) = P(A) + P(B)
For mutually exclusive events A and B
General Addition Rule Example
P(Jan. or Wed.) = P(Jan.) + P(Wed.) - P(Jan. and Wed.)
= 31/365 + 52/365 - 4/365 = 79/365
(Don’t count the four Wednesdays in January twice!)

           Jan.   Not Jan.   Total
Wed.         4       48        52
Not Wed.    27      286       313
Total       31      334       365
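The addition-rule arithmetic on this slide can be verified with exact fractions (a minimal sketch):

```python
from fractions import Fraction

p_jan = Fraction(31, 365)
p_wed = Fraction(52, 365)
p_jan_and_wed = Fraction(4, 365)

# General addition rule: subtract the overlap so the four
# Wednesdays in January are not counted twice
p_jan_or_wed = p_jan + p_wed - p_jan_and_wed
print(p_jan_or_wed)   # 79/365
```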
Computing Conditional
Probabilities
◼ A conditional probability is the probability of one
event, given that another event has occurred:
P(A | B) = P(A and B) / P(B)   -- the conditional probability of A given that B has occurred

P(B | A) = P(A and B) / P(A)   -- the conditional probability of B given that A has occurred

Where P(A and B) = joint probability of A and B
      P(A) = marginal or simple probability of A
      P(B) = marginal or simple probability of B
Conditional Probability Example
◼ Of the cars on a used car lot, 70% have air
conditioning (AC) and 40% have a GPS. 20%
of the cars have both.
◼ What is the probability that a car has a GPS,
given that it has AC ?
i.e., we want to find P(GPS | AC)
Conditional Probability Example
(continued)
◼ Of the cars on a used car lot, 70% have air conditioning
(AC), 40% have a GPS, and 20% of the cars have both.
           GPS    No GPS   Total
AC         0.2     0.5      0.7
No AC      0.2     0.1      0.3
Total      0.4     0.6      1.0

P(GPS | AC) = P(GPS and AC) / P(AC) = 0.2 / 0.7 = 0.2857
Conditional Probability Example
(continued)
◼ Given AC, we only consider the top row (70% of the cars). Within that row, the cars that also have a GPS account for 20% of all cars, and 0.2 / 0.7 ≈ 0.2857, so about 28.57% of the cars with AC have a GPS.

           GPS    No GPS   Total
AC         0.2     0.5      0.7
No AC      0.2     0.1      0.3
Total      0.4     0.6      1.0

P(GPS | AC) = P(GPS and AC) / P(AC) = 0.2 / 0.7 = 0.2857
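The same conditional-probability calculation in a minimal Python sketch (names are illustrative):

```python
# Joint and marginal probabilities taken from the table above
p_gps_and_ac = 0.2
p_ac = 0.7

# P(GPS | AC) = P(GPS and AC) / P(AC)
p_gps_given_ac = p_gps_and_ac / p_ac
print(round(p_gps_given_ac, 4))   # 0.2857
```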
Independence
◼ Two events are independent if and only if:
P(A | B) = P(A)
◼ Events A and B are independent when the probability of
one event is not affected by the fact that the other event
has occurred
Multiplication Rules
◼ Multiplication rule for two events A and B:
P(A and B) = P(A | B) P(B)
Note: If A and B are independent, then P(A | B) = P(A)
and the multiplication rule simplifies to
P(A and B) = P(A) P(B)
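Using the used-car numbers from the previous example, a quick check (a small sketch) shows that GPS and AC are not independent, and that the multiplication rule recovers the joint probability:

```python
p_ac = 0.7
p_gps = 0.4
p_gps_given_ac = 0.2 / 0.7   # ≈ 0.2857 from the conditional-probability example

# Independence would require P(GPS | AC) == P(GPS); here they differ
print(abs(p_gps_given_ac - p_gps) < 1e-9)   # False, so GPS and AC are not independent

# Multiplication rule: P(GPS and AC) = P(GPS | AC) * P(AC)
print(round(p_gps_given_ac * p_ac, 1))      # 0.2
```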
Marginal Probability
◼ Marginal probability for event A:
P(A) = P(A | B1) P(B1) + P(A | B2) P(B2) + … + P(A | Bk) P(Bk)
◼ Where B1, B2, …, Bk are k mutually exclusive and
collectively exhaustive events
Bayes Theorem
◼ Suppose the sample space has been partitioned into "n" mutually exclusive events A1, A2, …, An. B is an event that intersects each of those mutually exclusive events.
◼ If you need to determine the probability that something occurs given that another condition exists that can influence its occurrence, you would use Bayes' theorem.
[Diagram: event B overlapping each of the partition events A1, A2, A3, A4, …, An]
Applications of Bayesian Model
◼ Business and Commerce - pricing decisions, retail and wholesale prices,
the size of the market, and market share.
◼ New product development - assessing project risk by weighing uncertainties and determining whether the project is worthwhile. Online shopping giants like Amazon use it to make product ratings appear natural in search results, rather than displaying rankings in the default order.
◼ Marketing
◼ Stock Markets
◼ Weather Prediction
◼ Disease Risk
◼ Medical Diagnosis - Most doctors use Bayesian inference without realizing it. They see a sick patient and look at the patient's history, lifestyle, and other factors to determine what problem the patient may have. Bayesian analysis can even be used to fill in incomplete medical records based on the history and trends of the individual.
Bayes Theorem
P(Ai | B) = P(Ai) P(B | Ai) / [ P(A1) P(B | A1) + P(A2) P(B | A2) + ... + P(An) P(B | An) ]
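This formula translates directly into a short Python function (a minimal sketch; the lists pair each prior P(Ai) with its corresponding likelihood P(B | Ai)):

```python
def bayes(priors, likelihoods, i):
    """Return P(A_i | B) for a partition A_1..A_n of the sample space.

    priors[k] = P(A_k), likelihoods[k] = P(B | A_k), i is a zero-based index.
    """
    # Denominator: total probability of B across the whole partition
    p_b = sum(p * l for p, l in zip(priors, likelihoods))
    return priors[i] * likelihoods[i] / p_b
```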
Case Study of Bayes Theorem
◼ A lot of products is produced by three machines A1, A2, A3. One
unit from the lot is selected at random and found defective.
❑ What is the probability that the unit was produced by M/C A2?
❑ What is the probability that a randomly selected product is
defective?
M/C    % of Production   Defective Items
A1           40%                5%
A2           45%               10%
A3           15%                1%
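Working the case-study numbers through the total probability rule and Bayes' theorem (a self-contained sketch; the percentages come straight from the table above):

```python
priors = [0.40, 0.45, 0.15]        # share of production from A1, A2, A3
likelihoods = [0.05, 0.10, 0.01]   # defective rate on each machine

# P(defective) by the total probability rule
p_defective = sum(p * l for p, l in zip(priors, likelihoods))
print(round(p_defective, 4))   # 0.0665

# P(A2 | defective) by Bayes' theorem
p_a2_given_defective = priors[1] * likelihoods[1] / p_defective
print(round(p_a2_given_defective, 4))   # ≈ 0.6767
```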
Counting Rules Are Often Useful In
Computing Probabilities
◼ In many cases, there are a large number of
possible outcomes.
◼ Counting rules can be used in these cases to
help compute probabilities.
Counting Rules
◼ Rules for counting the number of possible
outcomes
◼ Counting Rule 1:
◼ If any one of k different mutually exclusive and
collectively exhaustive events can occur on each of
n trials, the number of possible outcomes is equal to
k^n
◼ Example
◼ If you roll a fair die 3 times then there are 6^3 = 216 possible outcomes
Counting Rules (continued)
◼ Counting Rule 2:
◼ If there are k1 events on the first trial, k2 events on
the second trial, … and kn events on the nth trial, the
number of possible outcomes is
(k1)(k2)…(kn)
◼ Example:
◼ You want to go to a park, eat at a restaurant, and see a
movie. There are 3 parks, 4 restaurants, and 6 movie
choices. How many different possible combinations are
there?
◼ Answer: (3)(4)(6) = 72 different possibilities
Counting Rules (continued)
◼ Counting Rule 3:
◼ The number of ways that n items can be arranged in
order is
n! = (n)(n – 1)…(1)
◼ Example:
◼ You have five books to put on a bookshelf. How many
different ways can these books be placed on the shelf?
◼ Answer: 5! = (5)(4)(3)(2)(1) = 120 different possibilities
Counting Rules (continued)
◼ Counting Rule 4:
◼ Permutations: The number of ways of arranging X
objects selected from n objects in order is
nPx = n! / (n − X)!
◼ Example:
◼ You have five books and are going to put three on a
bookshelf. How many different ways can the books be
ordered on the bookshelf?
◼ Answer: nPx = n! / (n − X)! = 5! / (5 − 3)! = 120 / 2 = 60 different possibilities
Counting Rules (continued)
◼ Counting Rule 5:
◼ Combinations: The number of ways of selecting X
objects from n objects, irrespective of order, is
nCx = n! / [X! (n − X)!]
◼ Example:
◼ You have five books and are going to select three to
read. How many different combinations are there, ignoring
the order in which they are selected?
◼ Answer: nCx = n! / [X! (n − X)!] = 5! / [3! (5 − 3)!] = 120 / [(6)(2)] = 10 different possibilities
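Python's standard library exposes these counting rules directly (a quick sketch, assuming Python 3.8+ where math.perm and math.comb are available), confirming the answers for Rules 3, 4, and 5:

```python
import math

print(math.factorial(5))   # 120 ways to arrange 5 books in order (Rule 3)
print(math.perm(5, 3))     # 60 ordered arrangements of 3 of the 5 books (Rule 4)
print(math.comb(5, 3))     # 10 unordered selections of 3 of the 5 books (Rule 5)
```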
Chapter Summary
In this chapter we covered:
◼ Understanding basic probability concepts.
◼ Understanding conditional probability
◼ Bayes’ Theorem
◼ Various counting rules
Any Questions?
Thank YOU