Binomial and Poisson Probability Distributions

1) The document discusses the binomial and Poisson probability distributions. The binomial distribution gives the probability of m successes in a fixed number of trials, while the Poisson distribution gives the probability of m occurrences in a fixed interval of time or space. 2) As the number of trials grows large and the probability of success shrinks in such a way that the expected number of successes Np stays finite, the binomial distribution approaches the Poisson distribution. 3) For the Poisson distribution, the mean and variance are both equal to the expected number of occurrences μ = Np; unlike the count m, the mean μ is not required to be an integer.

K.K. Gan, L2: Binomial and Poisson


Lecture 2
Binomial and Poisson Probability Distributions

Binomial Probability Distribution
- Consider a situation where there are only two possible outcomes (a Bernoulli trial)
  - Example: flipping a coin → head or tail
  - Example: rolling a die → 6 or not 6 (i.e. 1, 2, 3, 4, 5)
  - Label the probability of a success as p
    → the probability of a failure is then q = 1 - p
- Suppose we have N trials (e.g. we flip a coin N times)
  → what is the probability of getting m successes (= heads)?
- Consider tossing a coin twice. The possible outcomes are:
  - no heads:  P(m = 0) = q^2
  - one head:  P(m = 1) = qp + pq = 2pq
    (toss 1 is a tail and toss 2 is a head, or toss 1 is a head and toss 2 is a tail)
  - two heads: P(m = 2) = p^2
  - P(0) + P(1) + P(2) = q^2 + 2pq + p^2 = (q + p)^2 = 1
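The two-toss bookkeeping above can be checked by brute-force enumeration; a minimal sketch (variable and loop names are my own, not from the lecture):

```python
from itertools import product

p = 0.5          # probability of a head on one toss
q = 1 - p        # probability of a tail

# Enumerate all 2^2 ordered outcomes of two tosses and bin them by
# the number of heads m.
prob = {0: 0.0, 1: 0.0, 2: 0.0}
for tosses in product("HT", repeat=2):
    m = tosses.count("H")
    prob[m] += p**tosses.count("H") * q**tosses.count("T")

assert abs(prob[0] - q**2) < 1e-12       # no heads:  q^2
assert abs(prob[1] - 2 * p * q) < 1e-12  # one head:  2pq
assert abs(prob[2] - p**2) < 1e-12       # two heads: p^2
assert abs(sum(prob.values()) - 1) < 1e-12
```

The same enumeration works for any p, since the ordered-outcome probabilities always sum to (p + q)^2 = 1.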
- We want the probability distribution function P(m, N, p) where:
  m = number of successes (e.g. number of heads in a coin toss)
  N = number of trials (e.g. number of coin tosses)
  p = probability of a success (e.g. 0.5 for a head)

[Portrait: James Bernoulli (Jacob I), born in Basel, Switzerland, Dec. 27, 1654 - Aug. 16, 1705. He is one of eight mathematicians in the Bernoulli family (from Wikipedia).]

(P(m = 1) above has two terms because we don't care which of the tosses is a head.)
- If we look at the three choices for the coin-flip example, each term is of the form:
      C_m p^m q^(N-m),   m = 0, 1, 2;  N = 2 for our example;  q = 1 - p always!
  - the coefficient C_m takes into account the number of ways an outcome can occur, regardless of order
  - for m = 0 or 2 there is only one way for the outcome (both tosses give heads, or both give tails): C_0 = C_2 = 1
  - for m = 1 (one head, two tosses) there are two ways this can occur: C_1 = 2
- Binomial coefficients: the number of ways of taking N things m at a time
  - 0! = 1! = 1,  2! = 1·2 = 2,  3! = 1·2·3 = 6,  m! = 1·2·3···m
  - Order of things is not important
    - e.g. 2 tosses, one head case (m = 1)
      - we don't care if toss 1 produced the head or if toss 2 produced the head
  - Unordered groups such as our example are called combinations
  - Ordered arrangements are called permutations
  - For N distinguishable objects grouped m at a time, the number of permutations is:
        P(N, m) = N! / (N - m)!
    and the number of combinations is:
        C(N, m) = (N choose m) = N! / [m! (N - m)!]
  - example: If we toss a coin twice (N = 2), there are two ways of getting one head (m = 1)
  - example: Suppose we have 3 balls, one white, one red, and one blue.
    - The number of possible pairs, keeping track of order, is 6 (rw, wr, rb, br, wb, bw):
          P(3, 2) = 3! / (3 - 2)! = 6
    - If order is not important (rw = wr), then the binomial formula gives the number of two-color combinations:
          C(3, 2) = 3! / [2!(3 - 2)!] = 3
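The counting formulas above are easy to verify; a small sketch (function names are my own) that reproduces the 3-ball example and cross-checks against the standard library:

```python
from math import comb, factorial, perm

# Permutations: ordered groups of m out of N distinguishable objects.
def permutations(N, m):
    return factorial(N) // factorial(N - m)

# Combinations: unordered groups (the binomial coefficient).
def combinations(N, m):
    return factorial(N) // (factorial(m) * factorial(N - m))

# The 3-ball example: pairs out of {white, red, blue}.
assert permutations(3, 2) == 6    # rw, wr, rb, br, wb, bw
assert combinations(3, 2) == 3    # rw, rb, wb
# Cross-check against math.perm / math.comb (Python 3.8+).
assert permutations(3, 2) == perm(3, 2)
assert combinations(3, 2) == comb(3, 2)
```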
- Binomial distribution: the probability of m successes out of N trials:
      P(m, N, p) = C(N, m) p^m q^(N-m) = (N choose m) p^m q^(N-m) = [N! / (m!(N-m)!)] p^m q^(N-m)
  - p is the probability of a success and q = 1 - p is the probability of a failure
- Example: Consider a game where a player with success probability p = 0.33 bats 4 times:
  - probability of 0/4 = (0.67)^4 ≈ 20%
  - probability of 1/4 = [4!/(3!1!)] (0.33)^1 (0.67)^3 ≈ 40%
  - probability of 2/4 = [4!/(2!2!)] (0.33)^2 (0.67)^2 ≈ 29%
  - probability of 3/4 = [4!/(1!3!)] (0.33)^3 (0.67)^1 ≈ 10%
  - probability of 4/4 = [4!/(0!4!)] (0.33)^4 (0.67)^0 ≈ 1%
  - probability of getting at least one hit = 1 - P(0) ≈ 0.80

[Figures: two binomial distributions with p = 1/3. Left: P(k, 50, 1/3) vs k, with expectation value = Np = 50 · 1/3 = 16.667... Right: P(k, 7, 1/3) vs k, with expectation value = Np = 7 · 1/3 = 2.333...]
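A short sketch (function name is my own) that reproduces the batting-example percentages from the binomial formula:

```python
from math import comb

def binomial_pmf(m, N, p):
    """P(m, N, p): probability of m successes in N trials."""
    return comb(N, m) * p**m * (1 - p)**(N - m)

# Batting example: p = 0.33, N = 4 at-bats.
probs = [binomial_pmf(m, 4, 0.33) for m in range(5)]

# Rounded to the percentages quoted above: 20%, 40%, 29%, 10%, 1%.
assert [round(100 * x) for x in probs] == [20, 40, 29, 10, 1]
assert abs(sum(probs) - 1) < 1e-12           # distribution is normalized
assert abs((1 - probs[0]) - 0.80) < 0.005    # at least one hit ≈ 0.80
```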
- To show that the binomial distribution is properly normalized, use the Binomial Theorem:
      (a + b)^k = Σ_{l=0}^{k} (k choose l) a^(k-l) b^l
      Σ_{m=0}^{N} P(m, N, p) = Σ_{m=0}^{N} (N choose m) p^m q^(N-m) = (p + q)^N = 1
  → binomial distribution is properly normalized
- Mean of binomial distribution:
      μ = [Σ_{m=0}^{N} m P(m, N, p)] / [Σ_{m=0}^{N} P(m, N, p)]
        = Σ_{m=0}^{N} m P(m, N, p)
        = Σ_{m=0}^{N} m (N choose m) p^m q^(N-m)
  - A cute way of evaluating the above sum is to take the derivative of the normalization condition with respect to p:
      ∂/∂p [ Σ_{m=0}^{N} (N choose m) p^m q^(N-m) ] = 0
      Σ_{m=0}^{N} m (N choose m) p^(m-1) q^(N-m) - Σ_{m=0}^{N} (N choose m) p^m (N-m)(1-p)^(N-m-1) = 0
      p^(-1) Σ m (N choose m) p^m q^(N-m) = N(1-p)^(-1) Σ (N choose m) p^m (1-p)^(N-m) - (1-p)^(-1) Σ m (N choose m) p^m (1-p)^(N-m)
      p^(-1) μ = N(1-p)^(-1) - (1-p)^(-1) μ
      → μ = Np
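Both results (normalization and μ = Np) can be checked numerically; a minimal sketch using the N = 50, p = 1/3 case plotted earlier:

```python
from math import comb

N, p = 50, 1 / 3
# Full binomial PMF, m = 0 .. N.
pmf = [comb(N, m) * p**m * (1 - p)**(N - m) for m in range(N + 1)]

# Normalization: the probabilities sum to (p + q)^N = 1.
assert abs(sum(pmf) - 1) < 1e-12

# Mean: sum of m * P(m, N, p) equals Np = 16.667...
mean = sum(m * P for m, P in enumerate(pmf))
assert abs(mean - N * p) < 1e-9
```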
- Variance of the binomial distribution (obtained using a similar trick):
      σ^2 = [Σ_{m=0}^{N} (m - μ)^2 P(m, N, p)] / [Σ_{m=0}^{N} P(m, N, p)] = Npq
  - Example: Suppose you observed m special events (successes) in a sample of N events.
    - measured probability ("efficiency") for a special event to occur:
          ε = m / N
    - error on the probability ("error on the efficiency"):
          σ_ε = σ_m / N = √(Npq) / N = √(Nε(1-ε)) / N = √(ε(1-ε)/N)
      → the sample (N) should be as large as possible to reduce the uncertainty in the probability measurement
  - Example: Suppose a baseball player's batting average is 0.33 (1 for 3 on average).
    - Consider the case where the player either gets a hit or makes an out (forget about walks here!):
          probability of a hit:  p = 0.33
          probability of no hit: q = 1 - p = 0.67
    - On average, how many hits does the player get in 100 at bats?
          μ = Np = 100 · 0.33 = 33 hits
    - What's the standard deviation for the number of hits in 100 at bats?
          σ = (Npq)^(1/2) = (100 · 0.33 · 0.67)^(1/2) ≈ 4.7 hits
      → we expect ≈ 33 ± 5 hits per 100 at bats
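The efficiency-error formula above is a one-liner; a sketch (function name is my own) that also shows the 1/√N shrinkage of the error:

```python
from math import sqrt

def efficiency_with_error(m, N):
    """eps = m/N with binomial error sigma_eps = sqrt(eps*(1-eps)/N)."""
    eps = m / N
    return eps, sqrt(eps * (1 - eps) / N)

# 33 hits in 100 at bats: eps = 0.33 with error ~0.047 (i.e. ~4.7 hits per 100).
eps, err = efficiency_with_error(33, 100)
assert abs(eps - 0.33) < 1e-12
assert abs(err - 0.047) < 1e-3

# Ten times the sample at the same efficiency shrinks the error by sqrt(10).
_, err10 = efficiency_with_error(330, 1000)
assert abs(err / err10 - sqrt(10)) < 1e-9
```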
Poisson Probability Distribution
- A widely used discrete probability distribution
- Consider the following conditions:
  - p is very small and approaches 0
    - example: a 100-sided die instead of a 6-sided die, p = 1/100 instead of 1/6
    - example: a 1000-sided die, p = 1/1000
  - N is very large and approaches ∞
    - example: throwing 100 or 1000 dice instead of 2 dice
  - the product Np is finite
- Example: radioactive decay
  - Suppose we have 25 mg of an element
    → very large number of atoms: N ≈ 10^20
  - Suppose the lifetime of this element is τ = 10^12 years ≈ 5 × 10^19 seconds
    → the probability of a given nucleus decaying in one second is very small: p = 1/τ = 2 × 10^-20 /sec
    → Np = 2/sec is finite!
    → the number of counts in a time interval is a Poisson process
- The Poisson distribution can be derived by taking the appropriate limits of the binomial distribution:
      P(m, N, p) = [N! / (m!(N-m)!)] p^m q^(N-m)
      N!/(N-m)! = N(N-1)···(N-m+1)(N-m)! / (N-m)! ≈ N^m
      q^(N-m) = (1-p)^(N-m) = 1 - p(N-m) + p^2 (N-m)(N-m-1)/2! - ··· ≈ 1 - pN + (pN)^2/2! - ··· ≈ e^(-pN)

[Portrait: Siméon Denis Poisson, June 21, 1781 - April 25, 1840]
- Taking these limits gives:
      P(m, N, p) = (N^m / m!) p^m e^(-pN)
  Let μ = Np:
      P(m, μ) = e^(-μ) μ^m / m!
  - m is always an integer ≥ 0
  - μ does not have to be an integer
- It is easy to show that:
      Σ_{m=0}^{∞} e^(-μ) μ^m / m! = e^(-μ) Σ_{m=0}^{∞} μ^m / m! = e^(-μ) e^(μ) = 1
      → Poisson distribution is normalized
      μ = Np = mean of a Poisson distribution
      σ^2 = Np = μ = variance of a Poisson distribution
      → mean and variance are the same number
- Radioactivity example with an average of 2 decays/sec:
  - What's the probability of zero decays in one second?
        P(0, 2) = e^(-2) 2^0 / 0! = e^(-2) · 1/1 = e^(-2) = 0.135 → 13.5%
  - What's the probability of more than one decay in one second?
        P(>1, 2) = 1 - P(0, 2) - P(1, 2) = 1 - e^(-2) 2^0/0! - e^(-2) 2^1/1! = 1 - e^(-2) - 2e^(-2) = 0.594 → 59.4%
  - Estimate the most probable number of decays/sec:
        ∂/∂m P(m, μ) |_(m = m*) = 0
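The decay numbers above, and the mean = variance = μ property, can be checked directly; a minimal sketch (function name is my own):

```python
from math import exp, factorial

def poisson_pmf(m, mu):
    """P(m, mu) = e^(-mu) mu^m / m!"""
    return exp(-mu) * mu**m / factorial(m)

mu = 2.0  # average of 2 decays/sec

# Probability of zero decays in one second: e^-2 ≈ 0.135.
assert abs(poisson_pmf(0, mu) - 0.135) < 5e-4
# Probability of more than one decay: 1 - P(0) - P(1) = 1 - 3e^-2 ≈ 0.594.
p_gt1 = 1 - poisson_pmf(0, mu) - poisson_pmf(1, mu)
assert abs(p_gt1 - 0.594) < 5e-4

# Mean and variance both equal mu (summing a long-enough tail).
pmf = [poisson_pmf(m, mu) for m in range(100)]
mean = sum(m * P for m, P in enumerate(pmf))
var = sum((m - mean) ** 2 * P for m, P in enumerate(pmf))
assert abs(mean - mu) < 1e-9 and abs(var - mu) < 1e-9
```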
  - To solve this problem it's convenient to maximize ln P(m, μ) instead of P(m, μ):
        ln P(m, μ) = ln[e^(-μ) μ^m / m!] = -μ + m ln μ - ln m!
  - In order to handle the factorial when taking the derivative, we use Stirling's Approximation:
        ln m! ≈ m ln m - m
        ∂/∂m ln P(m, μ) = ∂/∂m (-μ + m ln μ - ln m!)
                        ≈ ∂/∂m (-μ + m ln μ - m ln m + m)
                        = ln μ - ln m - m(1/m) + 1
                        = ln μ - ln m = 0
        → m* = μ
    → The most probable value of m is just the mean of the distribution
    → If you observed m events in an experiment, the error on m is σ = √μ = √m
  - This is only approximate since Stirling's Approximation is only valid for large m.
  - Strictly speaking, m can only take on integer values while μ is not restricted to be an integer.
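A quick numeric sanity check of m* ≈ μ (the μ value reuses the 7 · 1/3 = 2.333 case plotted earlier): since m is an integer while μ need not be, the PMF actually peaks at floor(μ).

```python
from math import exp, factorial, floor

def poisson_pmf(m, mu):
    return exp(-mu) * mu**m / factorial(m)

mu = 2.333
pmf = [poisson_pmf(m, mu) for m in range(30)]
mode = max(range(30), key=lambda m: pmf[m])

# Most probable integer count sits at the mean, rounded down.
assert mode == floor(mu) == 2
```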
[Figures: Comparison of binomial and Poisson distributions with mean μ = 1.
 Left: probability vs m for a binomial with N = 10, p = 0.1 overlaid on a Poisson with μ = 1.
 Right: probability vs m for a binomial with N = 3, p = 1/3 overlaid on a Poisson with μ = 1.]

Not much difference between them!
For large N: the binomial distribution looks like a Poisson distribution of the same mean.
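Without the plots, the same comparison can be made numerically; a sketch contrasting the two cases shown in the figures (thresholds are my own illustrative choices):

```python
from math import comb, exp, factorial

def binomial_pmf(m, N, p):
    return comb(N, m) * p**m * (1 - p)**(N - m)

def poisson_pmf(m, mu):
    return exp(-mu) * mu**m / factorial(m)

# Binomial with N = 10, p = 0.1 vs a Poisson with the same mean mu = Np = 1:
# the two agree to within ~2% absolute probability for every m.
diffs_large_N = [abs(binomial_pmf(m, 10, 0.1) - poisson_pmf(m, 1.0))
                 for m in range(6)]
assert max(diffs_large_N) < 0.025

# The cruder N = 3, p = 1/3 binomial (same mean) differs noticeably more.
diffs_small_N = [abs(binomial_pmf(m, 3, 1/3) - poisson_pmf(m, 1.0))
                 for m in range(4)]
assert max(diffs_small_N) > 0.05
```

This makes the slide's point quantitative: the approximation improves as N grows with Np held fixed.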