Chapter 1: Generating Functions
In Elements of Probability and Statistics, a first-year course unit, we found it very hard, or almost impossible, to determine the kth moment of a normal distribution. For that reason, we start this course unit, Probability Theory, with methods that can be employed to generate moments of random variables and probability distributions. In particular, we use these methods to determine the means and variances of random variables and probability distributions.
Chapter Objectives
At the end of this chapter, you should be able to:
• determine the moment generating function for a given random variable or probability distribution;
• use the moment generating function to determine the mean and variance of a given random variable or probability distribution;
• determine the probability generating function of an integer-valued random variable or probability distribution and use it to determine its mean and variance.
1.1 Moment Generating Functions

The moment generating function of a random variable X is defined as M_X(t) = E(e^{tX}), for all values of t for which this expectation exists. We note the following:

1. For every random variable X, M_X(0) = E(e^0) = 1.

2. If the random variable is bounded, then the moment generating function exists for all values of t ∈ R. If the random variable is not bounded, then the moment generating function may or may not exist for some values of t.

3. Suppose that the moment generating function of X exists in some interval containing 0. Then the derivative M_X'(t) exists at the point t = 0.
Under suitable conditions, it is possible to interchange the order of expectation and differentiation. Thus, we have
\[
\frac{d}{dt} M_X(t)\Big|_{t=0} = \frac{d}{dt} E\left(e^{tX}\right)\Big|_{t=0}
= E\left(\frac{d}{dt} e^{tX}\right)\Big|_{t=0}
= E\left(X e^{tX}\right)\Big|_{t=0}
= E(X).
\]
More generally, if the moment generating function of X exists for all values of t in an interval containing 0, then it is possible to differentiate M_X(t) an arbitrary number of times at the point t = 0.
It can be shown that
\[
M_X^{(n)}(t) = \frac{d^n}{dt^n} M_X(t)
= \frac{d^n}{dt^n} E\left(e^{tX}\right)
= E\left(\frac{d^n}{dt^n} e^{tX}\right)
= E\left(X^n e^{tX}\right).
\]
Thus, M_X^{(n)}(0) = E(X^n).
Clearly, the moment generating function can be used to obtain the nth moment of any random variable as E(X^n) = M_X^{(n)}(0); hence the name ‘moment generating function.’
In particular,
\[
M_X'(0) = E(X), \qquad M_X''(0) = E(X^2),
\]
and
\[
\operatorname{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2.
\]
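As a quick computational illustration of this recipe, here is a minimal Python sketch using the sympy library. It differentiates the moment generating function M(t) = q + pe^t of a Bernoulli(p) random variable (a standard MGF, not derived in these notes) and recovers the mean p and variance pq.

```python
import sympy as sp

# p is the success probability; q = 1 - p.
t, p = sp.symbols('t p', positive=True)
q = 1 - p

# MGF of a Bernoulli(p) random variable: M(t) = E(e^{tX}) = q + p*e^t.
M = q + p * sp.exp(t)

mean = sp.diff(M, t).subs(t, 0)       # M'(0) = E(X)
second = sp.diff(M, t, 2).subs(t, 0)  # M''(0) = E(X^2)
var = sp.factor(second - mean**2)     # Var(X) = M''(0) - (M'(0))^2

print(mean)  # p
print(var)   # equals p*(1 - p), i.e. pq
```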
Example 1.1.1 Suppose X has range {1, 2, \ldots, n} and P_X(j) = 1/n for 1 ≤ j ≤ n. Then,
\[
M_X(t) = E\left(e^{tX}\right) = \sum_{j=1}^{n} \frac{1}{n} e^{tj} = \frac{1}{n}\left(e^{t} + e^{2t} + \cdots + e^{nt}\right) = \frac{e^{t}\left(e^{nt} - 1\right)}{n\left(e^{t} - 1\right)}.
\]
Remark 1.1.1 We have used the formula for the sum of the first n terms of a geometric progression, which is given as
\[
S_n = a\,\frac{r^{n} - 1}{r - 1},
\]
where a is the first term and r the common ratio.
Example 1.1.2 A discrete random variable X takes on the values 0, 1, 2 with respective probabilities 1/2, 3/8, 1/8. Find M_X(t), the moment generating function of X at any point t, and verify that E(X) = M_X'(0) and E(X^2) = M_X''(0).
Solution:
\[
M_X(t) = E(e^{tX}) = \sum_{x=0}^{2} e^{tx} P(X = x) = \frac{1}{2} + \frac{3}{8} e^{t} + \frac{1}{8} e^{2t}.
\]
Differentiating,
\[
M_X'(t) = \frac{3}{8} e^{t} + \frac{2}{8} e^{2t}, \qquad M_X''(t) = \frac{3}{8} e^{t} + \frac{4}{8} e^{2t},
\]
so that
\[
M_X'(0) = \frac{3}{8} + \frac{2}{8} = \frac{5}{8}, \qquad M_X''(0) = \frac{3}{8} + \frac{4}{8} = \frac{7}{8}.
\]
Computing the moments directly,
\[
E(X) = \sum_{x=0}^{2} x P(X = x) = 0 \times \frac{1}{2} + 1 \times \frac{3}{8} + 2 \times \frac{1}{8} = 0 + \frac{3}{8} + \frac{2}{8} = M_X'(0),
\]
\[
E(X^2) = \sum_{x=0}^{2} x^{2} P(X = x) = 0^{2} \times \frac{1}{2} + 1^{2} \times \frac{3}{8} + 2^{2} \times \frac{1}{8} = 0 + \frac{3}{8} + \frac{4}{8} = M_X''(0).
\]
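The arithmetic in this example is easy to check by machine; here is a minimal sympy sketch that rebuilds M_X(t) from the given probabilities and confirms the two derivative values.

```python
import sympy as sp

t = sp.symbols('t')

# P(X = 0) = 1/2, P(X = 1) = 3/8, P(X = 2) = 1/8, as in Example 1.1.2.
pmf = {0: sp.Rational(1, 2), 1: sp.Rational(3, 8), 2: sp.Rational(1, 8)}

# M_X(t) = sum over x of e^{tx} P(X = x).
M = sum(sp.exp(t * x) * prob for x, prob in pmf.items())

print(sp.diff(M, t).subs(t, 0))     # 5/8 = E(X)
print(sp.diff(M, t, 2).subs(t, 0))  # 7/8 = E(X^2)
```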
Example 1.1.3 Let X be an exponential random variable with parameter λ = 1, so that M_X(t) = 1/(1 − t) for t < 1 (this moment generating function is derived in Section 1.2). Since M_X(t) is finite in an interval containing 0, all moments of X exist. Verify that the first moment is 1 and the second moment is 2.
Example 1.1.4 Let X be uniformly distributed on the interval [a, b]. Then,
\[
M_X(t) = E\left(e^{tX}\right) = \int_{a}^{b} \frac{e^{tx}}{b - a}\,dx = \frac{1}{t(b - a)} \left[ e^{tx} \right]_{a}^{b} = \frac{e^{tb} - e^{ta}}{(b - a)t}.
\]
Can you determine the mean and variance of X from the moment generating function of X? You will have to use L'Hôpital's rule; one way to organize the computation is sketched below.
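The following sketch lets sympy carry out exactly those t → 0 limits (an illustrative check, assuming sympy's limit routine handles the required series expansions, which current versions do). It returns the familiar answers E(X) = (a + b)/2 and Var(X) = (b − a)^2/12.

```python
import sympy as sp

t, a, b = sp.symbols('t a b', real=True)

# MGF of the Uniform(a, b) distribution derived above; it is undefined at
# t = 0, so the moments must be obtained as limits for t -> 0.
M = (sp.exp(b * t) - sp.exp(a * t)) / ((b - a) * t)

mean = sp.limit(sp.diff(M, t), t, 0)
second = sp.limit(sp.diff(M, t, 2), t, 0)
var = sp.factor(second - mean**2)

print(mean)  # (a + b)/2
print(var)   # (a - b)**2/12, i.e. (b - a)^2/12
```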
Theorem 1.1.1 Let X be a random variable with moment generating function M_X(t). If Y = aX + b, where a and b are constants, then M_Y(t) = e^{bt} M_X(at).
Proof.
\[
M_Y(t) = E\left(e^{tY}\right) = E\left(e^{t(aX + b)}\right) = e^{tb} E\left(e^{atX}\right) = e^{tb} M_X(at).
\]
Example 1.1.5 A discrete random variable X is said to have the moment generating function
\[
\phi(t) = \frac{1}{2} + \frac{3}{8} e^{t} + \frac{1}{8} e^{2t}.
\]
Let Y = 3X + 2. Determine the moment generating function of Y and use it to determine the probability mass function of Y.
Solution:
\[
M_Y(t) = E\left(e^{tY}\right) = E\left(e^{3tX + 2t}\right) = e^{2t} E\left(e^{3tX}\right) = e^{2t} M_X(3t)
= e^{2t}\left(\frac{1}{2} + \frac{3}{8} e^{3t} + \frac{1}{8} e^{6t}\right)
= \frac{1}{2} e^{2t} + \frac{3}{8} e^{5t} + \frac{1}{8} e^{8t}.
\]
We derive the probability function of Y from the moment generating function of Y by noting that M_Y(t) = \sum_y e^{ty} P(Y = y): each coefficient of t appearing in an exponent is a value that the random variable Y takes on, and the coefficient of the corresponding exponential term is the probability of that value. Thus, the probability function of Y is shown in the table below.

y      | 2   | 5   | 8
f_Y(y) | 1/2 | 3/8 | 1/8
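This "read off the pmf" step can be mechanized. The following sympy sketch (an illustrative check) expands e^{2t} M_X(3t); the exponents and coefficients that appear reproduce the table above.

```python
import sympy as sp

t = sp.symbols('t')

# MGF of X from Example 1.1.5.
M_X = sp.Rational(1, 2) + sp.Rational(3, 8) * sp.exp(t) + sp.Rational(1, 8) * sp.exp(2 * t)

# M_Y(t) = e^{2t} M_X(3t) for Y = 3X + 2; powsimp merges e^{2t}*e^{3t}
# into e^{5t}, and so on.
M_Y = sp.powsimp(sp.expand(sp.exp(2 * t) * M_X.subs(t, 3 * t)))

print(M_Y)  # exp(2*t)/2 + 3*exp(5*t)/8 + exp(8*t)/8
```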
Theorem 1.1.2 If two random variables have moment generating functions that exist and agree in an interval containing 0, then the two random variables have the same probability distribution.

1.2 Moment Generating Functions of Common Distributions

The Binomial distribution. Let X be a Binomial random variable with parameters n and p, and write q = 1 − p. Then,
\[
M_X(t) = E\left(e^{tX}\right) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^{x} q^{n-x} = \sum_{x=0}^{n} \binom{n}{x} \left(pe^{t}\right)^{x} q^{n-x} = \left(pe^{t} + q\right)^{n},
\]
by the binomial theorem.
The Geometric distribution. Let X be a Geometric random variable with probability function f(x) = pq^{x}, x = 0, 1, 2, \ldots, where q = 1 − p. Summing the geometric series (see Remark 1.2.1 below),
\[
M_X(t) = \sum_{x=0}^{\infty} e^{tx} p q^{x} = p \sum_{x=0}^{\infty} \left(qe^{t}\right)^{x} = \frac{p}{1 - qe^{t}}, \qquad qe^{t} < 1.
\]
Differentiating,
\[
M_X'(t) = \frac{pqe^{t}}{\left(1 - qe^{t}\right)^{2}}, \qquad M_X'(0) = \frac{pq}{(1 - q)^{2}} = \frac{pq}{p^{2}} = \frac{q}{p}.
\]
Verify that the variance of a Geometric random variable is equal to q/p².
Remark 1.2.1
1. We have used the formula for the sum to infinity of a geometric series, which is valid provided the absolute value of the common ratio is less than 1, i.e. |r| < 1. The sum is given as
\[
\sum_{i=0}^{\infty} ar^{i} = \frac{a}{1 - r},
\]
where a is the first term and r is the common ratio.
2. The Geometric distribution can be given in another way as
\[
f(x, p) = pq^{x-1}, \qquad x = 1, 2, 3, \ldots,
\]
where x is the number of trials needed to obtain the first success. Work out the moment generating function of X and use it to find its mean and variance in this case; a computational check is sketched below.
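For this variant the series sums, for qe^t < 1, to the standard closed form M_X(t) = pe^t/(1 − qe^t). The sketch below (an illustrative check, with q written as 1 − p) differentiates this closed form and confirms that the mean is 1/p and the variance is q/p².

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p

# MGF of f(x) = p*q^(x-1), x = 1, 2, ...; valid for q*e^t < 1.
M = p * sp.exp(t) / (1 - q * sp.exp(t))

mean = sp.simplify(sp.diff(M, t).subs(t, 0))
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)

print(mean)  # 1/p
print(var)   # (1 - p)/p**2, i.e. q/p^2
```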
The Poisson distribution. Let X be Poisson distributed with parameter λ. Then,
\[
M_X(t) = E\left(e^{tX}\right) = \sum_{x=0}^{\infty} e^{-\lambda} \frac{\left(e^{t}\lambda\right)^{x}}{x!} = e^{-\lambda} e^{\lambda e^{t}} = e^{\lambda\left(e^{t} - 1\right)}.
\]
Verify that E(X) = M_X'(0) = λ and Var(X) = M_X''(0) − (M_X'(0))² = λ.
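A short check with sympy's stats module (an illustrative sketch; it assumes sympy can evaluate the defining sum in closed form, which current versions do for the Poisson case):

```python
import sympy as sp
from sympy.stats import Poisson, E

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

X = Poisson('X', lam)

# E(e^{tX}) evaluated directly from the definition of expectation.
M = sp.simplify(E(sp.exp(t * X)))
print(M)  # exp(lambda*(exp(t) - 1))

mean = sp.diff(M, t).subs(t, 0)
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)
print(mean, var)  # lambda lambda
```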
The Normal distribution. Let X ∼ N(µ, σ²). Then,
\[
M_X(t) = E\left(e^{tX}\right) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx} e^{-\frac{(x - \mu)^2}{2\sigma^2}}\,dx
= \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[x^2 - 2x\mu + \mu^2 - 2\sigma^2 tx\right]}\,dx
= \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[x^2 - 2x(\mu + \sigma^2 t) + \mu^2\right]}\,dx.
\]
Completing the square in the exponent,
\[
x^2 - 2x(\mu + \sigma^2 t) + \mu^2 = \left(x - (\mu + \sigma^2 t)\right)^2 - 2\mu\sigma^2 t - \sigma^4 t^2,
\]
so that
\[
M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2} \cdot \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{\left(x - (\mu + \sigma^2 t)\right)^2}{2\sigma^2}}\,dx = e^{\mu t + \frac{1}{2}\sigma^2 t^2},
\]
since the remaining integral is that of a N(µ + σ²t, σ²) density over R and hence equals 1.
Verify that:
1. M_X(0) = 1.
2. If Z = (X − µ)/σ, then M_Z(t) = e^{t²/2}, where Z ∼ N(0, 1).
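The Gaussian integral above can also be handed to a computer algebra system. A minimal sympy.stats sketch (an illustrative check, assuming sympy evaluates this expectation in closed form, which current versions do) reproduces the result:

```python
import sympy as sp
from sympy.stats import Normal, E

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

X = Normal('X', mu, sigma)

M = sp.simplify(E(sp.exp(t * X)))  # the integral computed above
print(M)             # exp(mu*t + sigma**2*t**2/2)
print(M.subs(t, 0))  # 1, confirming M_X(0) = 1
```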
The Exponential distribution. Let X have density f(x) = λe^{−λx} for x > 0. Then,
\[
M_X(t) = E\left(e^{tX}\right) = \int_{0}^{\infty} \lambda e^{tx} e^{-\lambda x}\,dx = \lambda \int_{0}^{\infty} e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t},
\]
provided t < λ. Verify that the mean of X is 1/λ and the variance of X is 1/λ².
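A quick Monte Carlo sanity check (an illustrative sketch, with arbitrary test values) compares a sample estimate of E(e^{tX}) against λ/(λ − t) for one choice of t < λ:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 0.5  # any t < lam works

# numpy parametrizes the exponential by its scale 1/lambda.
x = rng.exponential(scale=1.0 / lam, size=1_000_000)

print(np.exp(t * x).mean())  # sample estimate of E(e^{tX})
print(lam / (lam - t))       # theoretical value 4/3
```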
More generally, the Gamma distribution has density
\[
f(x; \alpha, \beta) = \frac{\beta^{\alpha} x^{\alpha - 1} e^{-\beta x}}{\Gamma(\alpha)} \qquad \text{for } x, \alpha, \beta > 0.
\]
The mean and variance of this Gamma distribution are α/β and α/β², respectively.
1.3 Further Examples

The Chi-square distribution with n degrees of freedom has density
\[
f(x, n) = \frac{1}{\Gamma\left(\frac{n}{2}\right) 2^{\frac{n}{2}}}\, x^{\frac{n}{2} - 1} e^{-\frac{x}{2}}, \qquad x > 0,
\]
and moment generating function
\[
M_X(t) = \frac{1}{(1 - 2t)^{\frac{n}{2}}} = (1 - 2t)^{-\frac{n}{2}}. \tag{1.2}
\]
Now let Z ∼ N(0, 1) and consider Y = Z². Then,
\[
M_Y(t) = E\left(e^{tY}\right) = E\left(e^{tZ^2}\right) = \int_{-\infty}^{\infty} e^{tz^2} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}z^2}\,dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}(1 - 2t)z^2}\,dz.
\]
By letting s = \sqrt{1 - 2t}\,z, we get
\[
M_Y(t) = \frac{1}{\sqrt{1 - 2t}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}s^2}\,ds = \frac{1}{\sqrt{1 - 2t}}, \qquad t < \frac{1}{2}, \tag{1.3}
\]
since \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}s^2}\,ds = 1, being the integral of the p.d.f. of a standard normal over R. If we compare (1.3) with (1.2), we deduce, using Theorem 1.1.2, that the square of a standard normal yields a chi-square with 1 degree of freedom.
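This identification is easy to see empirically as well. The simulation below (an illustrative sketch) squares a large sample of standard normal draws and compares its quantiles with those of the chi-square distribution with 1 degree of freedom:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.standard_normal(1_000_000) ** 2  # Y = Z^2

qs = [0.25, 0.5, 0.75, 0.9]
print(np.quantile(y, qs))        # empirical quantiles of Z^2
print(stats.chi2.ppf(qs, df=1))  # chi-square(1) quantiles; the rows agree closely
```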
Example 1.3.1 Find the moment generating function of a random variable X with probability function
\[
P(X = x) = \frac{\lambda^{x} e^{-\lambda}}{x!\,\left(1 - e^{-\lambda}\right)}, \qquad x = 1, 2, \ldots,
\]
and use it to determine the mean and variance of X.

Solution:
\[
M_X(t) = E\left(e^{tX}\right) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \sum_{x=1}^{\infty} \frac{\lambda^{x} e^{tx}}{x!}
= \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left( \sum_{x=0}^{\infty} \frac{\left(\lambda e^{t}\right)^{x}}{x!} - 1 \right)
= \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left( e^{\lambda e^{t}} - 1 \right).
\]
Differentiating,
\[
M_X'(t) = \frac{e^{-\lambda}}{1 - e^{-\lambda}}\, \lambda e^{t} e^{\lambda e^{t}}, \qquad
M_X''(t) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left( \lambda e^{t} e^{\lambda e^{t}} + \lambda^{2} e^{2t} e^{\lambda e^{t}} \right).
\]
Then,
\[
E(X) = M_X'(0) = \frac{\lambda}{1 - e^{-\lambda}}, \qquad
E(X^2) = M_X''(0) = \frac{\lambda + \lambda^{2}}{1 - e^{-\lambda}},
\]
so that
\[
\operatorname{Var}(X) = E(X^2) - \left(E(X)\right)^{2} = \frac{\lambda}{1 - e^{-\lambda}} - \frac{\lambda^{2} e^{-\lambda}}{\left(1 - e^{-\lambda}\right)^{2}}.
\]
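A direct numerical check of E(X) (an illustrative sketch, using an arbitrary value of λ) sums the series term by term and compares it with λ/(1 − e^{−λ}):

```python
import math

lam = 1.7  # an arbitrary test value
norm = 1.0 - math.exp(-lam)

# E(X) by summing x * P(X = x) over (a truncation of) the support.
mean_by_sum = sum(
    x * math.exp(-lam) * lam**x / (math.factorial(x) * norm)
    for x in range(1, 200)
)

mean_by_mgf = lam / norm  # E(X) = M'(0) from the derivation above
print(mean_by_sum, mean_by_mgf)  # the two values agree
```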
Example 1.3.2 Find the moment generating function for a continuous random variable X with values in [0, ∞) and density
\[
f_X(x) = \lambda(\lambda x)^{n-1} \frac{e^{-\lambda x}}{(n-1)!}.
\]
Solution:
\[
M_X(t) = E\left(e^{tX}\right) = \int_{0}^{\infty} \frac{\lambda^{n} x^{n-1}}{(n-1)!} e^{-\lambda x} e^{tx}\,dx = \frac{\lambda^{n}}{(n-1)!} \int_{0}^{\infty} x^{n-1} e^{-(\lambda - t)x}\,dx.
\]
Let y = (λ − t)x ⇒ x = y/(λ − t) and dx = dy/(λ − t), which requires t < λ. Therefore,
\[
M_X(t) = \frac{\lambda^{n}}{(n-1)!} \int_{0}^{\infty} \frac{y^{n-1}}{(\lambda - t)^{n-1}} e^{-y} \frac{dy}{\lambda - t}
= \frac{\lambda^{n}}{(n-1)!} \frac{1}{(\lambda - t)^{n}} \int_{0}^{\infty} y^{n-1} e^{-y}\,dy
= \frac{1}{(n-1)!} \left( \frac{\lambda}{\lambda - t} \right)^{n} \Gamma(n)
= \left( \frac{\lambda}{\lambda - t} \right)^{n} \frac{(n-1)!}{(n-1)!}
= \left( \frac{\lambda}{\lambda - t} \right)^{n}.
\]
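The same integral can be evaluated symbolically. In the sketch below (an illustrative check) we write s for λ − t and declare it positive, which encodes the requirement t < λ:

```python
import sympy as sp

x, lam, s = sp.symbols('x lambda s', positive=True)
n = sp.symbols('n', positive=True, integer=True)

# Integrand lambda^n x^(n-1) e^{-(lambda - t)x} / Gamma(n), with s = lambda - t.
M = sp.simplify(
    sp.integrate(lam**n * x**(n - 1) / sp.gamma(n) * sp.exp(-s * x), (x, 0, sp.oo))
)
print(M)  # lambda**n/s**n, i.e. (lambda/(lambda - t))**n
```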
Example 1.3.3 Find the moment generating function for a continuous random variable X with values in [0, ∞) and density f_X(x) = 4xe^{−2x}, and use it to determine the standard deviation of X.
Solution:
\[
M_X(t) = E\left(e^{tX}\right) = 4 \int_{0}^{\infty} x e^{tx} e^{-2x}\,dx = 4 \int_{0}^{\infty} x e^{-(2 - t)x}\,dx.
\]
Let y = (2 − t)x ⇒ x = y/(2 − t) and dx = dy/(2 − t), which requires t < 2. Therefore,
\[
M_X(t) = 4 \int_{0}^{\infty} \frac{y}{2 - t} e^{-y} \frac{dy}{2 - t}
= \left( \frac{2}{2 - t} \right)^{2} \int_{0}^{\infty} y e^{-y}\,dy
= \left( \frac{2}{2 - t} \right)^{2} \Gamma(2)
= \left( \frac{2}{2 - t} \right)^{2}.
\]
Then
\[
M_X'(t) = \frac{8}{(2 - t)^{3}} \qquad \text{and} \qquad M_X''(t) = \frac{24}{(2 - t)^{4}}.
\]
Therefore, E(X) = M_X'(0) = 8/2^3 = 1 and E(X^2) = M_X''(0) = 24/2^4 = 3/2, so that Var(X) = 3/2 − 1 = 1/2 and the standard deviation of X is 1/\sqrt{2} = \sqrt{2}/2 ≈ 0.707.
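The density f_X(x) = 4xe^{−2x} is in fact a Gamma density with shape 2 and rate 2, so the answer is easy to confirm by simulation (an illustrative sketch):

```python
import numpy as np

rng = np.random.default_rng(0)

# f(x) = 4x e^{-2x} is Gamma with shape 2 and scale 1/2 (rate 2).
x = rng.gamma(shape=2.0, scale=0.5, size=1_000_000)

print(x.mean())  # close to 1 = M'(0)
print(x.std())   # close to 1/sqrt(2) ~ 0.7071
```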
Example 1.3.4 A random variable X is said to have the moment generating function
\[
M_X(t) = e^{\mu t + \frac{1}{2}\sigma^{2} t^{2}}, \qquad t \in \mathbb{R}.
\]
Determine the variance of Y = e^X.

Solution:
\[
E(Y) = E\left(e^{X}\right) = M_X(1) = e^{\mu + \frac{1}{2}\sigma^{2}},
\]
\[
E\left(Y^{2}\right) = E\left(e^{2X}\right) = M_X(2) = e^{2\mu + 2\sigma^{2}},
\]
\[
\operatorname{Var}(Y) = E\left(Y^{2}\right) - \left(E(Y)\right)^{2} = e^{2\mu + 2\sigma^{2}} - e^{2\mu + \sigma^{2}} = e^{2\mu + \sigma^{2}}\left(e^{\sigma^{2}} - 1\right).
\]
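Since Y = e^X with X normal is lognormal, the variance formula can be checked by simulation (an illustrative sketch, with arbitrary test values of µ and σ):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 0.3, 0.8  # arbitrary test values

y = np.exp(rng.normal(mu, sigma, size=2_000_000))  # Y = e^X, X ~ N(mu, sigma^2)

theory = np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1)
print(y.var(), theory)  # the sample variance is close to the formula
```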
1.4 Probability Generating Functions

Let X be a nonnegative, integer-valued random variable. The function
\[
G_X(t) = E\left(t^{X}\right) = \sum_{n=0}^{\infty} t^{n} P(X = n)
\]
is called the probability generating function of X. It is said to exist if the series \sum_{n=0}^{\infty} |t|^{n} P(X = n) is convergent. Differentiating term by term,
\[
G_X'(t) = \sum_{x=1}^{\infty} x t^{x-1} P(X = x), \tag{1.4}
\]
and differentiating again,
\[
G_X''(t) = \sum_{x=2}^{\infty} x(x-1) t^{x-2} P(X = x). \tag{1.5}
\]
In general, for k = 1, 2, \ldots,
\[
G_X^{(k)}(t) = \sum_{x=k}^{\infty} x(x-1)\cdots(x-k+1)\, t^{x-k} P(X = x). \tag{1.6}
\]
Putting t = 0 in (1.4)–(1.6) gives G_X^{(x)}(0) = x!\,P(X = x), from which we get
\[
P(X = x) = \frac{G_X^{(x)}(0)}{x!}.
\]
Putting t = 1 in (1.4) gives G_X'(1) = E(X), while (1.5) gives G_X''(1) = E(X(X − 1)) = E(X^2) − E(X). The variance of X is therefore
\[
\operatorname{Var}(X) = G_X''(1) + G_X'(1) - \left(G_X'(1)\right)^{2}.
\]
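To see these formulas in action, take the standard PGF of a Binomial(n, p) random variable, G(t) = (q + pt)^n, and let sympy recover both the pmf and the moments (an illustrative sketch):

```python
import sympy as sp

t = sp.symbols('t')
n, p = 4, sp.Rational(1, 3)  # a small Binomial(4, 1/3) example
q = 1 - p

G = (q + p * t)**n  # PGF of Binomial(n, p)

# P(X = x) = G^{(x)}(0) / x!
pmf = [sp.diff(G, t, x).subs(t, 0) / sp.factorial(x) for x in range(n + 1)]
print(pmf)  # [16/81, 32/81, 8/27, 8/81, 1/81]

# Mean and variance from derivatives at t = 1.
mean = sp.diff(G, t).subs(t, 1)
var = sp.diff(G, t, 2).subs(t, 1) + mean - mean**2
print(mean, var)  # 4/3 and 8/9, i.e. np and npq
```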
Theorem 1.4.1 If two nonnegative, integer-valued random variables have the same generating function, then they must follow the same probability law. In other words, if X and Y are nonnegative, integer-valued random variables with G_X = G_Y, then P_X = P_Y.
For a Poisson random variable X with parameter λ,
\[
G_X(t) = E\left(t^{X}\right) = \sum_{n=0}^{\infty} \frac{t^{n} e^{-\lambda} \lambda^{n}}{n!} = e^{-\lambda} \sum_{n=0}^{\infty} \frac{(\lambda t)^{n}}{n!} = e^{-\lambda} e^{\lambda t}.
\]
Therefore,
\[
G_X(t) = e^{-\lambda(1 - t)} \qquad \text{for all } t.
\]
In particular, G_X(1) = 1. Also,
\[
G_X'(t) = \lambda e^{-\lambda(1 - t)}, \qquad G_X'(1) = \lambda, \qquad G_X'(0) = \lambda e^{-\lambda}.
\]
More generally,
\[
G_X^{(k)}(t) = \lambda^{k} e^{-\lambda(1 - t)} \qquad \text{and} \qquad G_X^{(k)}(0) = \lambda^{k} e^{-\lambda}.
\]
For a Geometric random variable X with probability function f(x) = pq^{x}, x = 0, 1, 2, \ldots,
\[
G_X(t) = E\left(t^{X}\right) = \sum_{x=0}^{\infty} t^{x} p q^{x} = p \sum_{x=0}^{\infty} (tq)^{x} = \frac{p}{1 - tq},
\]
this being a geometric series whose sum to infinity is p/(1 − tq), provided that |t| < 1/q. Verify that
\[
G_X'(t) = \frac{pq}{(1 - qt)^{2}} \qquad \text{and} \qquad G_X''(t) = \frac{2pq^{2}}{(1 - qt)^{3}},
\]
from which the mean and variance of X can be determined; a computational check follows.
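Here is that check as a short sympy sketch (an illustration; sympy writes q as 1 − p in its output):

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p

G = p / (1 - q * t)  # PGF of f(x) = p*q^x, x = 0, 1, 2, ...

print(sp.simplify(sp.diff(G, t)))     # pq/(1 - qt)^2, with q = 1 - p
print(sp.simplify(sp.diff(G, t, 2)))  # 2pq^2/(1 - qt)^3, with q = 1 - p

mean = sp.simplify(sp.diff(G, t).subs(t, 1))
var = sp.simplify(sp.diff(G, t, 2).subs(t, 1) + mean - mean**2)
print(mean, var)  # (1 - p)/p and (1 - p)/p**2, i.e. q/p and q/p^2
```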
Exercise 1.1
2. Use the moment generating function to show that if X follows an exponential distribution, then cX (for c > 0) does also.
(b) P(X = x) = \frac{e^{-\lambda} \lambda^{x}}{x!}, \quad x = 0, 1, \ldots
(c) P(X = x) = \frac{\lambda^{x} e^{-\lambda}}{x!\,\left(1 - e^{-\lambda}\right)}, \quad x = 1, 2, \ldots
(d) P(X = x) = pq^{x}\left(1 - q^{N+1}\right)^{-1}, \quad x = 0, 1, \ldots, N.
(e) f(x) = \binom{x-1}{k-1} p^{k} q^{x-k}, \quad x = k, k+1, \ldots
For each case find E(X) and Var(X).
9. Let X be a random variable with probability generating function G_X(t) at a point t. Find the probability generating function of Y = aX + b, where a and b are non-negative integers.
10. Suppose that X is a random variable for which the moment generating function is as follows:
\[
\phi(t) = \frac{1}{4}\left(3e^{t} + e^{-t}\right) \qquad \text{for } -\infty < t < \infty.
\]
Determine the mean and variance of X.
11. Suppose that X is a random variable for which the moment generating function is as follows:
\[
\phi(t) = e^{t^{2} + 3t} \qquad \text{for } -\infty < t < \infty.
\]
Determine the mean and variance of X.
12. Let X be a random variable with mean µ and variance σ², and let φ_1(t) denote the moment generating function of X for −∞ < t < ∞. Let c be a given positive constant, and let Y be a random variable for which the moment generating function is
\[
\phi_2(t) = e^{c\left(\phi_1(t) - 1\right)} \qquad \text{for } -\infty < t < \infty.
\]
Find expressions for the mean and variance of Y in terms of the mean and variance of X.
Show that the moment generating function of X exists for t ≤ 0 but not for t > 0.
14. Suppose that X is a random variable for which the moment generating function is defined
as
\[
M(t) = \frac{1}{5}e^{t} + \frac{2}{5}e^{4t} + \frac{2}{5}e^{8t}.
\]
(a) Find the probability function of X.
(b) Let Y = 3X −5. Determine the moment generating function of Y and use it to determine
the probability function of Y .
15. Suppose that X is a random variable for which the moment generating function is defined
as
\[
M(t) = \frac{1}{6}\left(4 + e^{t} + e^{-t}\right).
\]
(a) Find the probability function of X.
(b) Let Y = aX + b, a, b ∈ R. Determine the moment generating function of Y and use it
to determine the probability function of Y.
16. Suppose that X is a random variable for which the moment generating function is defined
as
\[
M(t) = e^{3t + 8t^{2}}.
\]
Let Y = \frac{1}{4}(X - 3). Determine the moment generating function of Y and use it to determine the mean and variance of Y.
17. Explain why the function defined by
\[
M(t) = \frac{t}{1 - t}
\]
cannot be a moment generating function of some random variable.
18. Let C_X(t) = \ln(M_X(t)), where M_X(t) is the moment generating function of a given random variable X at a point t. Show that C_X'(0) = E(X) and C_X''(0) = Var(X). Use these results to determine the mean and variance of a random variable X having the moment generating function defined by
\[
M_X(t) = e^{4\left(e^{t} - 1\right)}.
\]