Probability Lecture 8

The document outlines the concepts of probability laws, including the Binomial and Geometric Probability Laws, and discusses random variables and their probability mass functions. It provides examples of calculating probabilities using these laws, particularly in the context of Bernoulli trials and independent experiments. Additionally, it explains how to use tree diagrams to visualize outcomes and probabilities in various scenarios.


Probability and Random Variables
Dr. Sadiq Ali
Lecture Outline
• Binomial Probability Law
• Geometric Probability Law
• Random Variables
• Probability Mass Function
Binomial Probability Law
Sequential Experiments
1. Independent Experiments
   i. Binomial Probability Law
   ii. Multinomial Probability Law
   iii. Geometric Probability Law
2. Dependent Experiments
   i. Markov Chains
Multinomial Law: generalizes the binomial law to multiple outcomes per trial.
Bernoulli Trials
Bernoulli trial (or binomial trial):
• A random experiment with independent trials, each having exactly two possible outcomes: success or failure
• The same probability of success in each trial
• Example: a coin toss
Binomial Probability Law
• If we have 𝑛 independent Bernoulli trials, then the
probability of 𝑘 successes is given by the binomial
probability law,
pn(k) = C(n, k) p^k (1 − p)^(n−k)   for k = 0, …, n

‒ pn(k) is the probability of k successes in n trials
‒ C(n, k) = n! / ((n − k)! k!) is the binomial coefficient

Note: If success happens k times, failure occurs (n − k) times.
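As a sketch, the law above can be evaluated with Python's standard library; the helper name `binomial_pmf` is an illustrative choice, not from the lecture.

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """Probability of exactly k successes in n independent Bernoulli trials,
    each with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: the pmf over k = 0..n must sum to 1.
assert abs(sum(binomial_pmf(5, k, 0.3) for k in range(6)) - 1.0) < 1e-12
```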
Example 1: Binomial Probability Law
• A coin is tossed 3 times. Suppose the probability of heads is 1/3. What is the probability of getting exactly 1 head in the 3 tosses?

n = ?
p = ?
k = ?
Example 1: Binomial Probability Law
• Number of trials: n = 3
• Number of successes required: k = 1
• Let heads be success: P(success) = p = 1/3
• Tails is failure: P(failure) = 1 − p = 2/3

p3(1) = C(3, 1) p^1 (1 − p)^2 = 3p(1 − p)^2
p3(1) = 3 × (1/3) × (2/3)^2 = 12/27 ≈ 0.44
Example 1: Tree Diagram
Recall that we previously solved this problem using a tree diagram.

P[1 head] = (1/3 × 2/3 × 2/3) + (2/3 × 1/3 × 2/3) + (2/3 × 2/3 × 1/3) = 12/27 ≈ 0.44
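The tree-diagram sum above can be reproduced by brute-force enumeration of all 2^3 paths; a minimal sketch:

```python
from itertools import product

# Walk every path of 3 tosses (P(heads) = 1/3) and add up the
# probabilities of the paths containing exactly 1 head.
p_heads, p_tails = 1/3, 2/3
total = 0.0
for outcome in product("HT", repeat=3):
    if outcome.count("H") == 1:
        prob = 1.0
        for toss in outcome:
            prob *= p_heads if toss == "H" else p_tails
        total += prob

print(total)  # 12/27 ≈ 0.444
```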
Example 2: Binomial Probability Law
• The binomial probability law makes finding the probability in more complicated cases easier.
• Example: obtaining 2 heads in 5 tosses of a fair coin.

p5(2) = C(5, 2) (0.5)^2 (1 − 0.5)^(5−2)
      = (5! / ((5 − 2)! 2!)) (0.5)^2 (0.5)^3
      = ((5 × 4 × 3!) / (3! 2!)) (0.5)^5
      = 10 × 0.03125 = 0.3125
Binomial Prob. Law & Tree Diagram
The binomial probability law applies only to Bernoulli trials and follows naturally from the tree diagram.
Example: Consider obtaining 20 heads in 50 tosses (Bernoulli trials) of a biased coin with probability of heads 1/3. If we could visualize the tree diagram of this scenario, we would notice that in all 50 layers of the tree, the probability of the branches corresponding to the outcome "heads" remains 1/3 throughout.
Binomial Prob. Law & Tree Diagram
• Similarly, the probability of the "tails" branch remains 2/3 throughout the tree.
• Thus, at every layer the tree diagram has the same two outcomes, success (heads) and failure (tails), with constant probabilities.
• Every path with 20 heads passes through the head branches 20 times and through the tail branches 30 times.
Binomial Prob. Law & Tree Diagram
• Each of these possible outcomes has exactly the same probability, because each of them passes through the head branches 20 times and through the tail branches 30 times.
• In other words, the probability of each outcome is the probability of heads to the power 20 multiplied by the probability of tails to the power 30.
Binomial Coefficient
• Bernoulli trials are like partitioning a set of n distinct objects into two sets:
  – a set B containing the k objects that are selected
  – a set B^c containing the remaining n − k objects
• If success happens k times, failure occurs (n − k) times.

C(n, k) = C(n, n − k)
Binomial Coefficient

• Recall: getting 20 heads in 50 tosses is the same as getting 30 (= 50 − 20) tails in the 50 tosses.

C(50, 20) = 50! / (20! (50 − 20)!) = 50! / (20! × 30!)
C(50, 30) = 50! / (30! (50 − 30)!) = 50! / (30! × 20!)
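The symmetry can be confirmed in one line with the standard-library `math.comb`:

```python
from math import comb

# Symmetry of the binomial coefficient: choosing which 20 of the 50
# tosses are heads is the same as choosing which 30 are tails.
assert comb(50, 20) == comb(50, 30)
print(comb(50, 20))
```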
Quiz (10 mins)
A manufacturing company produces high-precision power
amplifiers. Each amplifier has a 97% chance of
functioning correctly. A quality control engineer
randomly selects 10 amplifiers from a day’s production.
What is the probability that exactly 8 out of the 10
amplifiers function correctly?
Quiz Solution

P(k) = (n! / (k! (n − k)!)) p^k (1 − p)^(n−k)

⇒ P(8) = (10! / (8! (10 − 8)!)) (0.97)^8 (0.03)^2

⇒ P(8) ≈ 0.0317
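The arithmetic can be double-checked with a short script (a sketch, reusing the binomial formula directly):

```python
from math import comb

# P(exactly 8 of 10 amplifiers work), each working with probability 0.97.
n, k, p = 10, 8, 0.97
prob = comb(n, k) * p**k * (1 - p)**(n - k)
print(round(prob, 4))  # ≈ 0.0317
```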
Example 3
Question 1: The probability of error of a certain communication channel is 1 − p = 0.001. The transmitter transmits each information bit three times. At the receiver, a decoder takes a majority vote of the received bits to decide what the transmitted bit was.
Find the probability that the receiver will make an incorrect decision.
Solution: Example 3
• The receiver makes an incorrect decision when none (k = 0) or only one (k = 1) of the three transmissions is correct, considering a correct bit in a trial as a success.
  – n = 3 (number of trials/transmissions)
  – k = 0 or 1

P(Receiver makes incorrect decision) = p3(0) + p3(1)
= C(3, 0) (0.999)^0 (0.001)^3 + C(3, 1) (0.999)^1 (0.001)^2
≈ 3 × 10^−6
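The sum above can be sketched in a couple of lines (the variable names here are illustrative):

```python
from math import comb

p = 0.999  # probability that a single transmitted copy of the bit is correct
# The majority-vote decoder errs when only k = 0 or k = 1 of the 3 copies
# are received correctly.
p_error = sum(comb(3, k) * p**k * (1 - p)**(3 - k) for k in (0, 1))
print(p_error)  # ≈ 3.0e-06
```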
Example 4 (Part A)
• Question 2: A block of 100 bits is transmitted over a
binary communication channel with a probability of
bit error 1 − 𝑝 = 0.01.
a) If the block has 1 or fewer errors, then the
receiver accepts the block. Find the probability
that the block is accepted.
b) If the block has more than 1 error, then the block
is retransmitted. Find the probability that M
retransmissions are required.
Solution Example 4 (Part A)
• The block is accepted if 99 or all 100 bits are correct.

P(Block Accepted) = p100(100) + p100(99)
= C(100, 100) P(Success)^100 P(Failure)^0 + C(100, 99) P(Success)^99 P(Failure)^1

• Here we have used the binomial probability law, assuming that success refers to having a correct bit in the block.
Solution: Example 4 (Part A)
• Probability of success: p = 0.99
• Probability of failure: 1 − p = 0.01

P(Block Accepted) = C(100, 100) p^100 (1 − p)^0 + C(100, 99) p^99 (1 − p)^1
= p^100 + 100 p^99 (1 − p)
= 0.7357
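A quick numerical check of this result (note the exact value rounds to 0.7358; the slides truncate it to 0.7357):

```python
from math import comb

p = 0.99  # probability that an individual bit is correct
# Block accepted if all 100 bits, or exactly 99 of them, are correct.
p_accept = comb(100, 100) * p**100 + comb(100, 99) * p**99 * (1 - p)
print(round(p_accept, 4))  # ≈ 0.7358 (truncated to 0.7357 in the slides)
```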
Geometric Probability Law
Geometric Probability Law
The geometric probability law results from a random experiment that meets all of the following requirements.
• Independent trials with a constant success probability p are repeated until the first success. Each trial results in either success (probability p) or failure (probability q = 1 − p). The random variable X counts the number of trials up to and including the first success.
• The probability that the first success occurs on trial M (i.e., after M − 1 failures) is

P(M) = (1 − p)^(M−1) p
Geometric Probability Law
• Example: A coin is tossed until the first heads (success) appears, so the number of Bernoulli trials ranges from 1 to ∞. What is the probability that the first success (heads) occurs in the Mth trial?

P(M) = P(Failure in first M − 1 trials) × P(Success in trial M) = (1 − p)^(M−1) p

• Since the trials are independent, the probabilities of success and failure remain the same from trial to trial.
Geometric Probability Law

P(M = 3) = P(tails) P(tails) P(heads)
P(M = 3) = (1 − p)^2 p
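The geometric law can be sketched as a small helper (the name `geometric_pmf` is an illustrative choice):

```python
def geometric_pmf(m: int, p: float) -> float:
    """Probability that the first success occurs on trial m (m = 1, 2, ...)."""
    return (1 - p) ** (m - 1) * p

# First heads on the 3rd toss of a coin with P(heads) = 1/3:
print(geometric_pmf(3, 1/3))  # (2/3)**2 * (1/3) = 4/27 ≈ 0.148
```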
Example 4 Part B
• Question 2: A block of 100 bits is transmitted over a
binary communication channel with probability of bit
error 1 − 𝑝 = 0.01.
• (a) If the block has 1 or fewer errors then the receiver
accepts the block. Find the probability that the block
is accepted.
• (b) If the block has more than 1 error, then the
block is retransmitted. Find the probability that M
retransmissions are required.
Example 4 Part B
• If the block is rejected in the first M transmissions and then accepted in transmission number M + 1, we will have M retransmissions.
Example 4 Part B
• If block accepted is considered success and block rejected is taken as failure, then the probability of failure in the first M trials and success in trial number M + 1 is given by the geometric probability law.

P(Block is accepted in transmission number M + 1) = (1 − p)^((M+1)−1) p = (1 − p)^M p
Example 4 Part B

• P(Number of retransmissions is M) = (1 − p)^M p
• where the probability of success p denotes the probability that the block is accepted, which is 0.7357 (see Part A)
• P(M retransmissions) = (1 − 0.7357)^M × 0.7357
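Putting the two parts together, a sketch of the retransmission probability (helper name is illustrative):

```python
p_accept = 0.7357  # probability the block is accepted (from Part A)

def p_retransmissions(m: int) -> float:
    """Probability of exactly m retransmissions: the block is rejected
    m times, then accepted on transmission m + 1 (geometric law)."""
    return (1 - p_accept) ** m * p_accept

print(round(p_retransmissions(1), 4))  # (1 - 0.7357) * 0.7357 ≈ 0.1944
```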
Random Variables
Function
• A function is a relationship between the members of one set and the members of another set.
• Example: the function f(x) = x^2 maps a member of the set X to a member of the set Y: 1 → 1, 2 → 4, 3 → 9, …
Function
• The domain of the function f(x) is the set X: the set from which an element is mapped.
• The range of the function f(x) is the set Y: the set onto which an element is mapped.
Random Variable
• A random variable is a function that maps elements from the sample space to another set.
• The random variable X(s) maps outcomes of the sample space S to the set SX, i.e., X(sj) = xj.
• SX = {x0, x1, …, xn} gives the range of values that the random variable X can take.
Random Variable – Example 1
• Let the random variable, X, be defined as the number
of heads in 3 tosses of a fair coin. Show the mapping
from S to SX.

X = Head count in all possible outcomes


Random Variable – Example 1
X = head count in each possible outcome, mapping S to SX:
HHH → 3;  HHT, HTH, THH → 2;  THT, TTH, HTT → 1;  TTT → 0
SX = {0, 1, 2, 3}
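This mapping is literally a function on the sample space, which can be sketched as a Python dictionary:

```python
from itertools import product

# A random variable is just a function on the sample space: map each
# outcome of 3 coin tosses to its number of heads.
X = {outcome: outcome.count("H")
     for outcome in map("".join, product("HT", repeat=3))}

print(X["HHT"])                 # 2
print(sorted(set(X.values())))  # S_X = [0, 1, 2, 3]
```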
Random Variable – Example 2
• Let the random variable, X, be defined as the number
of tails in 3 tosses of a fair coin. Show the mapping
from S to SX.

X = Tails count in all possible outcomes


Random Variable – Example 2
X = tails count in each possible outcome, mapping S to SX:
HHH → 0;  HHT, HTH, THH → 1;  THT, TTH, HTT → 2;  TTT → 3
SX = {0, 1, 2, 3}
Probability Mass Function
Probability Mass Function (pmf)
• The probability that a discrete random variable X will exactly equal a given value xj is expressed by the pmf as:

pX(xj) = P[X = xj]

• For example, if the random variable X is defined as the number of heads in the 3 tosses, then the pmf of X is given by the probability that the random variable takes on the values 0, 1, 2, 3.
Probability Mass Function (pmf)
pX(xj) = P[X = xj]

• pX(0) = P[X = 0] = P[{TTT}] = 1/8
• pX(1) = P[X = 1] = P[{HTT, THT, TTH}] = 3/8
• pX(2) = P[X = 2] = P[{HHT, HTH, THH}] = 3/8
• pX(3) = P[X = 3] = P[{HHH}] = 1/8

(Curly brackets denote sets of outcomes.)
Probability Mass Function (pmf)
• The pmf values are defined as probabilities; therefore, pmf values must be non-negative:

pX(xj) ≥ 0 for all j

• Also, the pmf values sum to 1:

Σ (over all j) pX(xj) = 1
Probability Mass Function (pmf)
• We can check this last equation by adding the pmf values of the 3-coin-toss example:

Σ (over all j) pX(xj) = pX(0) + pX(1) + pX(2) + pX(3)
= 1/8 + 3/8 + 3/8 + 1/8
= 8/8 = 1
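This check can be reproduced by enumerating the 8 equally likely outcomes; exact fractions avoid any rounding (a minimal sketch):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# pmf of X = number of heads in 3 tosses of a fair coin:
# each of the 8 equally likely outcomes has probability 1/8.
counts = Counter(outcome.count("H") for outcome in product("HT", repeat=3))
pmf = {x: Fraction(c, 8) for x, c in counts.items()}

print(pmf[1])                # 3/8
assert sum(pmf.values()) == 1
```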
Graph of pmf
• Usually, when we refer to the pmf, we are referring to a graph of the pmf.
• The graph of the pmf plots the probabilities pX(x) versus x (the values that the random variable can take).
• Draw the pmf of the random variable X, where X is defined as the number of heads in 3 tosses.
Graph of pmf
[Graph of the pmf of X: pX(x) versus x for x = 0, 1, 2, 3, with pX(0) = pX(3) = 1/8 and pX(1) = pX(2) = 3/8.]
