
Chapter 1

Generating functions

In Elements of Probability and Statistics, a first year course unit, we found it very hard, or almost impossible, to determine the $k$th moment of a normal distribution directly. For that reason, we begin this course unit, Probability Theory, with methods that can be employed to generate moments of random variables and probability distributions. In particular, we use these methods to determine the means and variances of random variables and probability distributions.
Chapter Objectives
At the end of this chapter, you should be able to:

• determine the moment generating function of a given random variable or probability distribution.

• use the moment generating function to determine the mean and variance of a given random variable or probability distribution.

• determine the probability generating function of an integer-valued random variable or probability distribution and use it to determine its mean and variance.

1.1 Moment generating functions


Definition 1.1.1 The moment generating function $M_X(t)$ of a random variable $X$ is defined as $M_X(t) = E\left(e^{tX}\right)$. If $X$ is a continuous random variable, then $M_X(t)$ is said to exist if $\int_{-\infty}^{\infty} e^{tx} f(x)\,dx < \infty$. If $X$ is discrete, then $M_X(t)$ is said to exist if $\sum_{\text{all } x} e^{tx} P(X = x) < \infty$.

1.1.1 Properties of the moment generating function


1. The moment generating function always exists at the point $t = 0$. Its value at this point is $M_X(0) = E(1) = 1$.

2. If the random variable is bounded, then the moment generating function exists for all values of $t \in \mathbb{R}$. If the random variable is not bounded, then the moment generating function may or may not exist for some values of $t$.

3. Suppose that the moment generating function of $X$ exists in some interval containing $0$. Then the derivative $M_X'(t)$ exists at the point $t = 0$.


Under suitable conditions, it is possible to interchange the order of expectation and differentiation. Thus, we have
$$\frac{d}{dt} M_X(t)\Big|_{t=0} = \frac{d}{dt} E\left(e^{tX}\right)\Big|_{t=0} = E\left(\frac{d}{dt} e^{tX}\right)\Big|_{t=0} = E\left(X e^{tX}\right)\Big|_{t=0} = E(X).$$
More generally, if the moment generating function of $X$ exists for all values of $t$ in an interval containing $0$, then it is possible to differentiate $M_X(t)$ an arbitrary number of times at the point $t = 0$. It can be shown that
$$M_X^{(n)}(t) = \frac{d^n}{dt^n} M_X(t) = \frac{d^n}{dt^n} E\left(e^{tX}\right) = E\left(\frac{d^n}{dt^n} e^{tX}\right) = E\left(X^n e^{tX}\right).$$
Thus, $M_X^{(n)}(0) = E(X^n)$.

Clearly, the moment generating function can be used to obtain the $n$th moment of any random variable as $E(X^n) = M_X^{(n)}(0)$; hence the name 'moment generating function.' In particular,
$$M_X'(0) = E(X), \qquad M_X''(0) = E(X^2)$$
and
$$\mathrm{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2.$$
Example 1.1.1 Suppose $X$ has range $\{1, 2, \ldots, n\}$ and $P_X(j) = \frac{1}{n}$ for $1 \le j \le n$. Then,
$$M_X(t) = E\left(e^{tX}\right) = \sum_{j=1}^{n} \frac{1}{n} e^{tj} = \frac{1}{n}\left(e^t + e^{2t} + \cdots + e^{nt}\right) = \frac{e^t\left(e^{nt} - 1\right)}{n\left(e^t - 1\right)}.$$
Remark 1.1.1 We have used the formula for the sum of the first $n$ terms of a geometric progression,
$$S_n = a\,\frac{r^n - 1}{r - 1},$$
where $a$ is the first term and $r$ is the common ratio.
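As a quick symbolic check, here is a minimal sketch (assuming Python with the sympy library) that builds $M_X(t)$ for the illustrative choice $n = 6$, confirms the closed form above, and recovers the mean $E(X) = \frac{n+1}{2}$.

```python
import sympy as sp

t = sp.symbols('t')
n = 6  # illustrative choice; any positive integer works

# M_X(t) = (1/n) * (e^t + e^{2t} + ... + e^{nt})
M = sum(sp.exp(j * t) for j in range(1, n + 1)) / n

# Closed form from the geometric-progression formula
closed = sp.exp(t) * (sp.exp(n * t) - 1) / (n * (sp.exp(t) - 1))

# The two expressions agree away from the removable singularity at t = 0
assert abs(float((M - closed).subs(t, 0.3))) < 1e-12

# E(X) = M'(0) = (n + 1)/2
print(sp.diff(M, t).subs(t, 0))  # 7/2 for n = 6
```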

Example 1.1.2 A discrete random variable $X$ takes on the values $0, 1, 2$ with respective probabilities $\frac{1}{2}, \frac{3}{8}, \frac{1}{8}$. Find $M_X(t)$, the moment generating function of $X$ at any point $t$, and verify that $E(X) = M_X'(0)$ and $E(X^2) = M_X''(0)$.

Solution:
$$M_X(t) = E\left(e^{tX}\right) = \sum_{x=0}^{2} e^{tx} P(X = x) = \frac{1}{2} + \frac{3}{8} e^t + \frac{1}{8} e^{2t}.$$
$$M_X'(t) = \frac{3}{8} e^t + \frac{2}{8} e^{2t}, \qquad M_X''(t) = \frac{3}{8} e^t + \frac{4}{8} e^{2t}.$$
$$M_X'(0) = \frac{3}{8} + \frac{2}{8} = \frac{5}{8}, \qquad M_X''(0) = \frac{3}{8} + \frac{4}{8} = \frac{7}{8}.$$
$$E(X) = \sum_{x=0}^{2} x P(X = x) = 0 \times \frac{1}{2} + 1 \times \frac{3}{8} + 2 \times \frac{1}{8} = \frac{3}{8} + \frac{2}{8} = \frac{5}{8} = M_X'(0).$$
$$E(X^2) = \sum_{x=0}^{2} x^2 P(X = x) = 0^2 \times \frac{1}{2} + 1^2 \times \frac{3}{8} + 2^2 \times \frac{1}{8} = \frac{3}{8} + \frac{4}{8} = \frac{7}{8} = M_X''(0).$$
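The same verification is easy to automate; a minimal sympy sketch (an illustration, not part of the original notes) with the distribution of Example 1.1.2 hard-coded:

```python
import sympy as sp

t = sp.symbols('t')
probs = {0: sp.Rational(1, 2), 1: sp.Rational(3, 8), 2: sp.Rational(1, 8)}

# M_X(t) = sum_x e^{tx} P(X = x)
M = sum(p * sp.exp(t * x) for x, p in probs.items())

m1 = sp.diff(M, t).subs(t, 0)      # M'(0)  = 5/8
m2 = sp.diff(M, t, 2).subs(t, 0)   # M''(0) = 7/8

# Cross-check against the direct definitions of E(X) and E(X^2)
assert m1 == sum(x * p for x, p in probs.items())
assert m2 == sum(x**2 * p for x, p in probs.items())
print(m1, m2, m2 - m1**2)  # 5/8 7/8 31/64
```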

Example 1.1.3 Let $X$ be a random variable with probability density function
$$f(x) = \begin{cases} e^{-x} & \text{for } x > 0 \\ 0 & \text{otherwise.} \end{cases}$$
$$M_X(t) = E\left(e^{tX}\right) = \int_0^{\infty} e^{tx} e^{-x}\,dx = \int_0^{\infty} e^{(t-1)x}\,dx.$$
This integral is finite if and only if $t < 1$. Therefore, $M_X(t) = \frac{1}{1-t}$ if $t < 1$.

Since $M_X(t)$ is finite in an interval containing $0$, all moments of $X$ exist. Verify that the first moment is $1$ and the second moment is $2$.

Example 1.1.4 The uniform distribution

The probability density function of a continuous random variable $X$ having a uniform distribution on the interval $(a, b)$ is written as
$$f(x) = \begin{cases} \dfrac{1}{b-a} & a < x < b \\ 0 & \text{elsewhere.} \end{cases}$$
The moment generating function of $X$ is
$$M_X(t) = E\left(e^{tX}\right) = \int_a^b \frac{e^{tx}}{b-a}\,dx = \frac{1}{t(b-a)}\left[e^{tx}\right]_a^b = \frac{e^{tb} - e^{ta}}{(b-a)t}.$$
Can you determine the mean and variance of $X$ from the moment generating function of $X$? You will have to use L'Hôpital's rule.
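Here is one way the computation might be carried out symbolically (a sketch, assuming sympy); sympy's `limit` handles the removable singularity at $t = 0$, which is exactly where L'Hôpital's rule is needed.

```python
import sympy as sp

t, a, b = sp.symbols('t a b', positive=True)

# mgf of Uniform(a, b)
M = (sp.exp(b * t) - sp.exp(a * t)) / ((b - a) * t)

# M'(t) and M''(t) are 0/0 at t = 0, so take limits
mean = sp.limit(sp.diff(M, t), t, 0)            # (a + b)/2
second = sp.limit(sp.diff(M, t, 2), t, 0)       # (a^2 + ab + b^2)/3
print(mean, sp.factor(second - mean**2))        # variance (b - a)^2/12
```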

Theorem 1.1.1 Let $X$ be a random variable with moment generating function $M_X(t)$. If $Y = aX + b$, where $a$ and $b$ are constants, then $M_Y(t) = e^{bt} M_X(at)$.

Proof.
$$M_Y(t) = E\left(e^{tY}\right) = E\left(e^{t(aX+b)}\right) = e^{tb} E\left(e^{atX}\right) = e^{tb} M_X(at).$$

Example 1.1.5 A discrete random variable $X$ has the moment generating function
$$\phi(t) = \frac{1}{2} + \frac{3}{8} e^t + \frac{1}{8} e^{2t}.$$
Let $Y = 3X + 2$. Determine the moment generating function of $Y$ and use it to determine the probability mass function of $Y$.

Solution:
$$M_Y(t) = E\left(e^{tY}\right) = E\left(e^{3tX + 2t}\right) = e^{2t} E\left(e^{3tX}\right) = e^{2t} M_X(3t) = e^{2t}\left(\frac{1}{2} + \frac{3}{8} e^{3t} + \frac{1}{8} e^{6t}\right) = \frac{1}{2} e^{2t} + \frac{3}{8} e^{5t} + \frac{1}{8} e^{8t}.$$
We derive the probability function of $Y$ from the moment generating function of $Y$ by noting that each exponent is a value that the random variable $Y$ takes on and the coefficient of the corresponding exponential term is its probability. Thus, the probability function of $Y$ is shown in the table below.

y          2      5      8
f_Y(y)    1/2    3/8    1/8
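The bookkeeping in this solution is easy to mirror symbolically; a minimal sympy sketch (an illustration only):

```python
import sympy as sp

t = sp.symbols('t')

MX = sp.Rational(1, 2) + sp.Rational(3, 8) * sp.exp(t) + sp.Rational(1, 8) * sp.exp(2 * t)

# Theorem 1.1.1 with Y = 3X + 2: M_Y(t) = e^{2t} M_X(3t)
MY = sp.powsimp(sp.expand(sp.exp(2 * t) * MX.subs(t, 3 * t)))
print(MY)
# exp(2*t)/2 + 3*exp(5*t)/8 + exp(8*t)/8,
# so P(Y=2) = 1/2, P(Y=5) = 3/8, P(Y=8) = 1/8, matching the table
```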

Theorem 1.1.2 (Uniqueness of the moment generating function)

Let $X$ and $Y$ be two random variables with identical moment generating functions for all values of $t$ in an interval around $0$. Then the distributions of $X$ and $Y$ must be identical. This theorem is useful in identifying the distributions of given functions of random variables.

1.2 Moment generating functions of common distributions
1.2.1 The Bernoulli and Binomial distributions

1. Let $X$ be a Bernoulli random variable with parameter $p$. The moment generating function of $X$ is derived as follows:
$$M_X(t) = E\left(e^{tX}\right) = e^t \cdot p + 1 \cdot q = q + p e^t.$$
Check that $E(X) = p$ and $\mathrm{Var}(X) = p(1 - p)$.

2. Binomial $B(n, p)$. We know that if $X \sim B(n, p)$, then
$$f(x) = \binom{n}{x} p^x q^{n-x} \quad \text{for } x = 0, 1, \ldots, n.$$

The moment generating function of $X$ is derived as follows:
$$M_X(t) = E\left(e^{tX}\right) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n} \binom{n}{x} \left(p e^t\right)^x q^{n-x} = \left(p e^t + q\right)^n,$$
where the last equality follows from the binomial theorem.

Check that $E(X) = np$ and $\mathrm{Var}(X) = npq$.
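A short sympy sketch (assumed tooling, not part of the notes) that differentiates $(pe^t + q)^n$ and confirms the stated mean and variance:

```python
import sympy as sp

t = sp.symbols('t')
p, n = sp.symbols('p n', positive=True)
q = 1 - p

M = (p * sp.exp(t) + q) ** n   # binomial mgf

mean = sp.simplify(sp.diff(M, t).subs(t, 0))                  # n*p
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)      # n*p*(1 - p)
print(mean, var)
```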

1.2.2 The Geometric distribution

If $X \sim \mathrm{Ge}(p)$, then $f(x, p) = p q^x$ for $x = 0, 1, 2, \ldots$, where $x$ is the number of failures before a success. The moment generating function of $X$ is derived as follows:
$$M_X(t) = \sum_{x=0}^{\infty} p q^x e^{tx} = p \sum_{x=0}^{\infty} \left(q e^t\right)^x = \frac{p}{1 - q e^t}, \quad \text{provided that } t < \ln\left(\frac{1}{q}\right).$$
$$M_X'(t) = \frac{p q e^t}{\left(1 - q e^t\right)^2}, \qquad M_X'(0) = \frac{pq}{(1 - q)^2} = \frac{pq}{p^2} = \frac{q}{p}.$$
Verify that the variance of a Geometric random variable is equal to $\frac{q}{p^2}$.
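The variance claim can be checked in a few lines of sympy (a sketch under the assumption that sympy is available):

```python
import sympy as sp

t = sp.symbols('t')
p = sp.symbols('p', positive=True)
q = 1 - p

M = p / (1 - q * sp.exp(t))   # valid for t < ln(1/q)

mean = sp.simplify(sp.diff(M, t).subs(t, 0))              # q/p
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # q/p**2
print(mean, var)   # (1 - p)/p and (1 - p)/p**2
```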

Remark 1.2.1
1. We have used the formula for the sum to infinity of a geometric series, valid provided the absolute value of the common ratio is less than $1$, i.e. $|r| < 1$. The sum is given as $\sum_{i=0}^{\infty} a r^i = \frac{a}{1-r}$, where $a$ is the first term and $r$ is the common ratio.
2. The Geometric distribution can be given in another way as $f(x, p) = p q^{x-1}$ for $x = 1, 2, 3, \ldots$, where $x$ is the number of trials up to and including the first success. Work out the moment generating function of $X$ and use it to find its mean and variance in this case.

1.2.3 The Poisson distribution

The probability function of a random variable $X$ having a Poisson distribution with parameter $\lambda > 0$ is
$$f(x, \lambda) = \frac{e^{-\lambda} \lambda^x}{x!} \quad \text{for } x = 0, 1, 2, \ldots
$$

The moment generating function $M_X(t)$ of $X$ is determined as follows:
$$M_X(t) = E\left(e^{tX}\right) = e^{-\lambda} \sum_{x=0}^{\infty} \frac{\left(e^t \lambda\right)^x}{x!} = e^{-\lambda} e^{\lambda e^t}.$$
Verify that $E(X) = M_X'(0) = \lambda$ and $\mathrm{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2 = \lambda$.
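The suggested verification, as a minimal sympy sketch (assumed tooling):

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

M = sp.exp(-lam) * sp.exp(lam * sp.exp(t))   # Poisson mgf

mean = sp.simplify(sp.diff(M, t).subs(t, 0))              # lambda
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # lambda
print(mean, var)
```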

1.2.4 The Normal distribution

The moment generating function of a random variable $X$ having a normal distribution with mean $\mu$ and variance $\sigma^2$ is determined as follows.
$$M_X(t) = E\left(e^{tX}\right) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{tx}\, e^{-\frac{1}{2}\frac{(x-\mu)^2}{\sigma^2}}\,dx = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[x^2 - 2x\mu + \mu^2 - 2\sigma^2 tx\right]}\,dx = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[x^2 - 2x(\mu + \sigma^2 t) + \mu^2\right]}\,dx.$$

By completing the square, we get
$$M_X(t) = \frac{1}{\sigma\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-\frac{1}{2\sigma^2}\left[x - (\sigma^2 t + \mu)\right]^2}\, e^{\frac{(\sigma^2 t + \mu)^2 - \mu^2}{2\sigma^2}}\,dx = e^{\frac{1}{2}\sigma^2 t^2 + \mu t} \int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2\sigma^2}\left[x - (\mu + t\sigma^2)\right]^2}\,dx.$$

Since $\int_{-\infty}^{\infty} \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2\sigma^2}\left[x - (\mu + t\sigma^2)\right]^2}\,dx = 1$, being the integral of a normal density with mean $\mu + t\sigma^2$ and variance $\sigma^2$, we get
$$M_X(t) = e^{\mu t + \frac{1}{2} t^2 \sigma^2}. \tag{1.1}$$

Verify that

1. $M_X(0) = 1$.

2. $M_X'(0) = \mu$, $M_X''(0) = \mu^2 + \sigma^2$ and $\mathrm{Var}(X) = \sigma^2$.

If $\mu = 0$ and $\sigma = 1$, we deduce from (1.1) that the moment generating function of a standard normal random variable $Z \sim N(0, 1)$ is
$$M_Z(t) = e^{\frac{1}{2} t^2}.$$
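A sympy sketch (an illustration, assuming sympy) of the verification for (1.1):

```python
import sympy as sp

t, mu = sp.symbols('t mu')
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(mu * t + sigma**2 * t**2 / 2)   # normal mgf, eq. (1.1)

m1 = sp.diff(M, t).subs(t, 0)              # mu
m2 = sp.diff(M, t, 2).subs(t, 0)           # mu**2 + sigma**2
print(m1, sp.expand(m2), sp.simplify(m2 - m1**2))  # variance sigma**2
```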

1.2.5 The Exponential distribution

If $X$ is exponentially distributed with parameter $\lambda$, then
$$M_X(t) = E\left(e^{tX}\right) = \int_0^{\infty} \lambda e^{tx} e^{-\lambda x}\,dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\,dx = \frac{\lambda}{\lambda - t},$$
provided $t < \lambda$. Verify that the mean of $X$ is $\frac{1}{\lambda}$ and the variance of $X$ is $\frac{1}{\lambda^2}$.
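As an aside (and a preview of question 2 in Exercise 1.1), the mgf makes scaling properties easy to check. A small sympy sketch, assuming sympy is available:

```python
import sympy as sp

t = sp.symbols('t')
c, lam = sp.symbols('c lambda', positive=True)

M = lam / (lam - t)   # exponential(lambda) mgf, t < lambda

# M_{cX}(t) = M_X(ct) = lambda/(lambda - c*t) = (lambda/c)/((lambda/c) - t),
# the exponential mgf with parameter lambda/c, so cX is again exponential
McX = M.subs(t, c * t)
assert sp.simplify(McX - (lam / c) / (lam / c - t)) == 0
print(McX)
```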

1.2.6 The Gamma distribution

The moment generating function of a random variable $X$ having a gamma distribution with parameters $\alpha > 0$ and $\beta > 0$ and density
$$f(x; \alpha, \beta) = \frac{x^{\alpha - 1} e^{-x/\beta}}{\beta^{\alpha} \Gamma(\alpha)} \quad \text{for } x > 0$$
is determined as follows:
$$M_X(t) = \int_0^{\infty} \frac{e^{tx}\, x^{\alpha - 1} e^{-x/\beta}}{\beta^{\alpha} \Gamma(\alpha)}\,dx = \frac{1}{\beta^{\alpha} \Gamma(\alpha)} \int_0^{\infty} x^{\alpha - 1} e^{-\left(\frac{1}{\beta} - t\right)x}\,dx = \frac{1}{(1 - \beta t)^{\alpha}},$$
where the last equality is derived by letting $z = \left(\frac{1}{\beta} - t\right)x$. Thus,
$$M_X'(t) = \frac{\alpha\beta}{(1 - \beta t)^{\alpha + 1}} \quad \text{and} \quad M_X''(t) = \frac{\alpha(\alpha + 1)\beta^2}{(1 - \beta t)^{\alpha + 2}}.$$
Verify that $E(X) = \alpha\beta$ and $\mathrm{Var}(X) = \alpha\beta^2$.
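A minimal sympy sketch (assumed tooling) of this verification:

```python
import sympy as sp

t = sp.symbols('t')
alpha, beta = sp.symbols('alpha beta', positive=True)

M = (1 - beta * t) ** (-alpha)   # gamma mgf, valid for t < 1/beta

mean = sp.diff(M, t).subs(t, 0)                           # alpha*beta
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # alpha*beta**2
print(mean, var)
```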
Remark 1.2.2
1. If $X \sim \mathrm{Gamma}\left(1, \frac{1}{\lambda}\right)$, then
$$f(x) = \begin{cases} \lambda e^{-\lambda x} & x > 0 \\ 0 & \text{otherwise,} \end{cases}$$
which is the probability density function of the exponential distribution with parameter $\lambda$. From the moment generating function of the gamma distribution, we can easily deduce that the moment generating function of the exponential distribution is $M_X(t) = \frac{\lambda}{\lambda - t} = \left(1 - \frac{1}{\lambda} t\right)^{-1}$.
2. The Gamma distribution can be defined alternatively as:
$$f(x; \alpha, \beta) = \frac{x^{\alpha - 1} e^{-\beta x} \beta^{\alpha}}{\Gamma(\alpha)} \quad \text{for } x > 0, \ \alpha, \beta > 0.$$
The mean and variance of this Gamma distribution are $\frac{\alpha}{\beta}$ and $\frac{\alpha}{\beta^2}$, respectively.

1.2.7 The Chi-square distribution

The Chi-square distribution is very useful in statistical inference. It is a special case of the Gamma distribution with parameters $\alpha = \frac{n}{2}$ and $\beta = 2$. The probability density function of a Chi-square random variable can be deduced from that of a Gamma distribution and is written as
$$f(x, n) = \frac{1}{\Gamma\left(\frac{n}{2}\right) 2^{n/2}}\, x^{\frac{n}{2} - 1} e^{-\frac{x}{2}}, \quad x > 0.$$

Therefore, if $X \sim \chi^2_{(n)}$, then
$$M_X(t) = \frac{1}{(1 - 2t)^{n/2}} = (1 - 2t)^{-n/2}. \tag{1.2}$$

Verify that $E(X) = n$ and $\mathrm{Var}(X) = 2n$.

1.2.8 The square of a standard normal

Let $Y = Z^2$, where $Z \sim N(0, 1)$. Then,
$$M_Y(t) = E\left(e^{tY}\right) = E\left(e^{tZ^2}\right) = \int_{-\infty}^{\infty} e^{tz^2}\, \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}z^2}\,dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}(1 - 2t)z^2}\,dz.$$

By letting $s = \sqrt{1 - 2t}\, z$, we get
$$M_Y(t) = \frac{1}{\sqrt{1 - 2t}} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}s^2}\,ds = \frac{1}{\sqrt{1 - 2t}} \quad \left(t < \frac{1}{2}\right), \tag{1.3}$$
since $\int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}} e^{-\frac{1}{2}s^2}\,ds = 1$, being the integral of the p.d.f. of a standard normal over $\mathbb{R}$. If we compare (1.3) with (1.2), we deduce, using Theorem 1.1.2, that the square of a standard normal yields a chi-square with 1 degree of freedom.
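The identification can also be checked numerically. The following sketch (assuming numpy) compares a Monte Carlo estimate of $E\left(e^{tZ^2}\right)$ with $(1 - 2t)^{-1/2}$ at a few points $t < \frac{1}{2}$; the two should agree to within sampling error.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(1_000_000)

for t in (0.1, 0.25, 0.4):
    mc = np.exp(t * z**2).mean()     # Monte Carlo estimate of E(e^{t Z^2})
    exact = (1 - 2 * t) ** -0.5      # chi-square(1) mgf, eq. (1.3)
    print(t, round(mc, 3), round(exact, 3))
```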

1.3 Further examples

Example 1.3.1 Find the moment generating function of a random variable $X$ with probability function
$$P(X = x) = \frac{\lambda^x e^{-\lambda}}{x!\left(1 - e^{-\lambda}\right)}, \quad x = 1, 2, \ldots$$
Hence determine $E(X)$ and $\mathrm{Var}(X)$.

Solution:
$$M_X(t) = E\left(e^{tX}\right) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \sum_{x=1}^{\infty} \frac{\lambda^x e^{tx}}{x!} = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left(\sum_{x=0}^{\infty} \frac{\lambda^x e^{tx}}{x!} - 1\right) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left(e^{\lambda e^t} - 1\right).$$
$$M_X'(t) = \frac{e^{-\lambda}}{1 - e^{-\lambda}}\, \lambda e^t e^{\lambda e^t}, \qquad M_X''(t) = \frac{e^{-\lambda}}{1 - e^{-\lambda}} \left(\lambda e^t e^{\lambda e^t} + \lambda^2 e^{2t} e^{\lambda e^t}\right).$$

Then,
$$E(X) = M_X'(0) = \frac{\lambda}{1 - e^{-\lambda}}, \qquad E(X^2) = M_X''(0) = \frac{\lambda + \lambda^2}{1 - e^{-\lambda}},$$
$$\mathrm{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2 = \frac{\lambda}{1 - e^{-\lambda}} - \frac{\lambda^2 e^{-\lambda}}{\left(1 - e^{-\lambda}\right)^2}.$$
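A sympy sketch (assumed tooling) of the same differentiation for this zero-truncated Poisson distribution; sympy may display the results in an algebraically equivalent form:

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

M = sp.exp(-lam) * (sp.exp(lam * sp.exp(t)) - 1) / (1 - sp.exp(-lam))

mean = sp.simplify(sp.diff(M, t).subs(t, 0))   # lambda/(1 - exp(-lambda))
m2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # (lambda + lambda**2)/(1 - exp(-lambda))
print(mean, sp.simplify(m2 - mean**2))
```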

Example 1.3.2 Find the moment generating function of a continuous random variable $X$ with values in $[0, \infty)$ and density $f_X(x) = \lambda(\lambda x)^{n-1} \dfrac{e^{-\lambda x}}{(n-1)!}$.

Solution:
$$M_X(t) = E\left(e^{tX}\right) = \int_0^{\infty} \frac{\lambda^n x^{n-1}}{(n-1)!}\, e^{-\lambda x} e^{tx}\,dx = \frac{\lambda^n}{(n-1)!} \int_0^{\infty} x^{n-1} e^{-(\lambda - t)x}\,dx.$$

Let $y = (\lambda - t)x$, so that $x = \dfrac{y}{\lambda - t}$ and $dx = \dfrac{dy}{\lambda - t}$. Therefore,
$$M_X(t) = \frac{\lambda^n}{(n-1)!} \int_0^{\infty} \frac{y^{n-1}}{(\lambda - t)^{n-1}}\, e^{-y}\, \frac{dy}{\lambda - t} = \frac{\lambda^n}{(n-1)!} \frac{1}{(\lambda - t)^n} \int_0^{\infty} y^{n-1} e^{-y}\,dy = \frac{1}{(n-1)!} \left(\frac{\lambda}{\lambda - t}\right)^n \Gamma(n) = \left(\frac{\lambda}{\lambda - t}\right)^n \frac{(n-1)!}{(n-1)!} = \left(\frac{\lambda}{\lambda - t}\right)^n.$$

Example 1.3.3 Find the moment generating function of a continuous random variable $X$ with values in $[0, \infty)$ and density $f_X(x) = 4x e^{-2x}$, and use it to determine the standard deviation of $X$.

Solution:
$$M_X(t) = E\left(e^{tX}\right) = 4\int_0^{\infty} x e^{tx} e^{-2x}\,dx = 4\int_0^{\infty} x e^{-(2-t)x}\,dx.$$

Let $y = (2 - t)x$, so that $x = \dfrac{y}{2 - t}$ and $dx = \dfrac{dy}{2 - t}$. Therefore,
$$M_X(t) = 4\int_0^{\infty} \frac{y}{2 - t}\, e^{-y}\, \frac{dy}{2 - t} = \left(\frac{2}{2 - t}\right)^2 \int_0^{\infty} y e^{-y}\,dy = \left(\frac{2}{2 - t}\right)^2 \Gamma(2) = \left(\frac{2}{2 - t}\right)^2.$$
$$M_X'(t) = \frac{8}{(2 - t)^3} \quad \text{and} \quad M_X''(t) = \frac{24}{(2 - t)^4}.$$

Therefore,
$$E(X) = M_X'(0) = 1,$$
$$\mathrm{Var}(X) = M_X''(0) - \left(M_X'(0)\right)^2 = \frac{24}{16} - 1 = \frac{1}{2},$$
$$\sigma(X) = \sqrt{\mathrm{Var}(X)} = \frac{1}{\sqrt{2}}.$$
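The whole computation, from the integral to the standard deviation, in a few lines of sympy (a sketch, assuming sympy):

```python
import sympy as sp

t, x = sp.symbols('t x')

# M_X(t) = 4 * integral_0^oo x e^{-(2 - t)x} dx, finite for t < 2
M = sp.simplify(sp.integrate(4 * x * sp.exp((t - 2) * x), (x, 0, sp.oo),
                             conds='none'))

mean = sp.diff(M, t).subs(t, 0)                 # 1
var = sp.diff(M, t, 2).subs(t, 0) - mean**2     # 1/2
print(M, mean, var, sp.sqrt(var))               # sigma(X) = 1/sqrt(2)
```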

Example 1.3.4 A random variable $X$ has the moment generating function $M_X(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}$, $t \in \mathbb{R}$. Determine the variance of $Y = e^X$.

Solution:
$$E(Y) = E\left(e^X\right) = M_X(1) = e^{\mu + \frac{1}{2}\sigma^2}.$$
$$E\left(Y^2\right) = E\left(e^{2X}\right) = M_X(2) = e^{2\mu + 2\sigma^2}.$$
$$\mathrm{Var}(Y) = E(Y^2) - (E(Y))^2 = e^{2\mu + \sigma^2}\left(e^{\sigma^2} - 1\right).$$

Remark 1.3.1 If $X \sim N(\mu, \sigma^2)$, then $Y = e^X$ is lognormally distributed. So, we have derived the mean and variance of a lognormal distribution.
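The trick of evaluating the mgf at $t = 1$ and $t = 2$ translates directly into sympy (a sketch, assuming sympy):

```python
import sympy as sp

t, mu = sp.symbols('t mu')
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(mu * t + sigma**2 * t**2 / 2)

EY = M.subs(t, 1)                 # E(e^X)    = exp(mu + sigma**2/2)
EY2 = M.subs(t, 2)                # E(e^{2X}) = exp(2*mu + 2*sigma**2)
varY = sp.simplify(EY2 - EY**2)
print(varY)
# exp(2*mu + 2*sigma**2) - exp(2*mu + sigma**2),
# i.e. exp(2*mu + sigma**2) * (exp(sigma**2) - 1) after factoring
```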

1.4 Probability generating functions

Definition 1.4.1 Let $X$ be a non-negative integer valued random variable with values $x = 0, 1, 2, \ldots$ The probability generating function $G_X(t)$ of $X$ is defined as
$$G_X(t) = E\left(t^X\right) = \sum_{x=0}^{\infty} t^x P(X = x).$$
It is said to exist if the series $\sum_{n=0}^{\infty} |t^n|\, P(X = n)$ is convergent.

1.4.1 Properties of the probability generating function

1. $G_X(0) = P(X = 0)$.

2. $G_X(1) = \sum_{x=0}^{\infty} P(X = x) = 1$.

3. Term-wise differentiation of the probability generating function yields
$$G_X'(t) = \sum_{x=1}^{\infty} x t^{x-1} P(X = x). \tag{1.4}$$
$$G_X''(t) = \sum_{x=2}^{\infty} x(x-1) t^{x-2} P(X = x). \tag{1.5}$$
In general, for $k = 1, 2, \ldots$,
$$G_X^{(k)}(t) = \sum_{x=k}^{\infty} x(x-1)\cdots(x-k+1)\, t^{x-k} P(X = x). \tag{1.6}$$
Putting $t = 0$ in (1.4)-(1.6) gives $G_X^{(x)}(0) = x!\, P(X = x)$, from which we get
$$P(X = x) = \frac{G_X^{(x)}(0)}{x!}.$$
The probability generating function generates probabilities, hence the name.

4. Letting $t = 1$ in (1.4) gives $G_X'(1) = \sum_{x=1}^{\infty} x P(X = x) = E(X)$.

5. Letting $t = 1$ in (1.5) gives
$$G_X''(1) = \sum_{x=1}^{\infty} x(x-1) P(X = x) = E(X(X-1)) = E(X^2) - E(X).$$
The variance of $X$ is therefore
$$\mathrm{Var}(X) = G_X''(1) + G_X'(1) - \left(G_X'(1)\right)^2.$$

Theorem 1.4.1 If two nonnegative, integer-valued random variables have the same probability generating function, then they must follow the same probability law. In other words, if $X$ and $Y$ are nonnegative, integer-valued random variables and $G_X = G_Y$, then $P_X = P_Y$.

Example 1.4.1 If $X \sim \mathrm{Poisson}(\lambda)$, then the probability generating function of $X$ is
$$G_X(t) = E\left(t^X\right) = \sum_{n=0}^{\infty} \frac{t^n e^{-\lambda} \lambda^n}{n!} = e^{-\lambda} \sum_{n=0}^{\infty} \frac{(\lambda t)^n}{n!} = e^{-\lambda} e^{\lambda t}.$$
Therefore,
$$G_X(t) = e^{-\lambda(1-t)} \quad \text{for all } t.$$

You can verify that
$$G_X(1) = 1.$$
$$G_X'(t) = \lambda e^{-\lambda(1-t)}, \qquad G_X'(1) = \lambda \quad \text{and} \quad G_X'(0) = \lambda e^{-\lambda}.$$
$$G_X''(t) = \lambda^2 e^{-\lambda(1-t)}, \qquad G_X''(1) = \lambda^2 \quad \text{and} \quad G_X''(0) = \lambda^2 e^{-\lambda}.$$
$$G_X^{(k)}(t) = \lambda^k e^{-\lambda(1-t)} \quad \text{and} \quad G_X^{(k)}(0) = \lambda^k e^{-\lambda}.$$
$$\mathrm{Var}(X) = G_X''(1) + G_X'(1) - \left(G_X'(1)\right)^2 = \lambda^2 + \lambda - \lambda^2 = \lambda.$$
Since $P(X = k) = e^{-\lambda} \dfrac{\lambda^k}{k!}$, you can indeed verify that
$$P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!} = \frac{G_X^{(k)}(0)}{k!}.$$
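A short sympy sketch (assumed tooling) recovering the first few Poisson probabilities from $G_X$ via $P(X = k) = G_X^{(k)}(0)/k!$:

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

G = sp.exp(-lam * (1 - t))   # Poisson pgf

for k in range(4):
    pk = sp.simplify(sp.diff(G, t, k).subs(t, 0) / sp.factorial(k))
    print(k, pk)   # exp(-lambda) * lambda**k / k!
```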

Example 1.4.2 Let $X$ be a Geometric random variable with probability function
$$f(x, p) = p q^x, \qquad x = 0, 1, 2, \ldots$$
$$G_X(t) = E\left(t^X\right) = \sum_{x=0}^{\infty} t^x p q^x = p \sum_{x=0}^{\infty} t^x q^x = p \sum_{x=0}^{\infty} (tq)^x.$$
This is a geometric series whose sum to infinity is $\dfrac{p}{1 - tq}$, provided that $|t| < \dfrac{1}{q}$. Verify that
$$G_X'(t) = \frac{pq}{(1 - qt)^2} \quad \text{and} \quad G_X''(t) = \frac{2pq^2}{(1 - qt)^3},$$
from which the mean and variance of $X$ can be determined.

Exercise 1.1

1. Let $X_1, X_2, \ldots, X_n$ be an independent trials process with uniform density function $f_X(x) = 1$ on $[0, 1]$ and moment generating function $m(t) = \dfrac{e^t - 1}{t}$. Find, in terms of $m(t)$, the moment generating functions for

(a) $-X$. (b) $aX + b$. (c) $X + 1$.

2. Use the moment generating function to show that if $X$ follows an exponential distribution, then $cX$ ($c > 0$) does also.

3. Let $X$ be a random variable with density
$$f(x) = \frac{1}{2} e^{-|x|}, \quad -\infty < x < \infty.$$
Show that the moment generating function of $X$ is $\dfrac{1}{1 - t^2}$, $|t| < 1$. Hence determine the mean and variance of $X$.

4. Find the moment generating function for a distribution describing a fair die.

5. Find the moment generating function for a uniform distribution on the set $\{n, n+1, n+2, \ldots, n+k\}$.

6. Let $X$ and $Y$ be random variables with values in $\{1, 2, 3, 4, 5, 6\}$ and with probability functions $P_X$ and $P_Y$ given by $P_X(j) = a_j$ and $P_Y(j) = b_j$. Determine the moment generating functions $M_X(t)$ and $M_Y(t)$ for these distributions.
7. Let $X$ be a continuous random variable with values in $[0, \infty)$ and density $f_X(x) = e^{-2x} + \frac{1}{2} e^{-x}$. Find the moment generating function for $X$ and hence find $E(X)$ and $\mathrm{Var}(X)$.
8. Let $X$ be a random variable with probability function as given below. Find the corresponding probability generating function in each case.

(a) $P(X = x) = \binom{n}{x} p^x q^{n-x}$ for $x = 0, 1, 2, \ldots, n$.

(b) $P(X = x) = \dfrac{e^{-\lambda} \lambda^x}{x!}$, $x = 0, 1, \ldots$

(c) $P(X = x) = \dfrac{\lambda^x e^{-\lambda}}{x!\left(1 - e^{-\lambda}\right)}$, $x = 1, 2, \ldots$

(d) $P(X = x) = p q^x \left(1 - q^{N+1}\right)^{-1}$ for $x = 0, 1, \ldots, N$.

(e) $f(x) = \binom{x-1}{k-1} p^k q^{x-k}$, $x = k, k+1, \ldots$

For each case find $E(X)$ and $\mathrm{Var}(X)$.
9. Let $X$ be a random variable with probability generating function $G_X(t)$ at a point $t$. Find the probability generating function of $Y = aX + b$, where $a$ and $b$ are non-negative integers.
10. Suppose that $X$ is a random variable for which the moment generating function is as follows:
$$\phi(t) = \frac{1}{4}\left(3e^t + e^{-t}\right) \quad \text{for } -\infty < t < \infty.$$
Determine the mean and variance of $X$.
11. Suppose that $X$ is a random variable for which the moment generating function is as follows:
$$\phi(t) = e^{t^2 + 3t} \quad \text{for } -\infty < t < \infty.$$
Determine the mean and variance of $X$.
12. Let $X$ be a random variable with mean $\mu$ and variance $\sigma^2$, and let $\phi_1(t)$ denote its moment generating function for $-\infty < t < \infty$. Let $c$ be a given positive constant, and let $Y$ be a random variable for which the moment generating function is
$$\phi_2(t) = e^{c(\phi_1(t) - 1)} \quad \text{for } -\infty < t < \infty.$$
Find expressions for the mean and variance of $Y$ in terms of the mean and variance of $X$.

13. Let $X$ have the probability density function
$$f(x) = \begin{cases} \dfrac{1}{x^2} & \text{if } x > 1 \\ 0 & \text{otherwise.} \end{cases}$$
Show that the moment generating function of $X$ exists for $t \le 0$ but not for $t > 0$.

14. Suppose that $X$ is a random variable for which the moment generating function is defined as
$$M(t) = \frac{1}{5} e^t + \frac{2}{5} e^{4t} + \frac{2}{5} e^{8t}.$$
(a) Find the probability function of $X$.
(b) Let $Y = 3X - 5$. Determine the moment generating function of $Y$ and use it to determine the probability function of $Y$.

15. Suppose that $X$ is a random variable for which the moment generating function is defined as
$$M(t) = \frac{1}{6}\left(4 + e^t + e^{-t}\right).$$
(a) Find the probability function of $X$.
(b) Let $Y = aX + b$, $a, b \in \mathbb{R}$. Determine the moment generating function of $Y$ and use it to determine the probability function of $Y$.

16. Suppose that $X$ is a random variable for which the moment generating function is defined as
$$M(t) = e^{3t + 8t^2}.$$
Let $Y = \frac{1}{4}(X - 3)$. Determine the moment generating function of $Y$ and use it to determine the mean and variance of $Y$.
17. Explain why the function defined by $M(t) = \dfrac{t}{1-t}$ cannot be the moment generating function of any random variable.

18. Let $C_X(t) = \ln(M_X(t))$, where $M_X(t)$ is the moment generating function of a given random variable $X$ at a point $t$. Show that $C_X'(0) = E(X)$ and $C_X''(0) = \mathrm{Var}(X)$. Use these results to determine the mean and variance of a random variable $X$ having the moment generating function defined by
$$M_X(t) = e^{4(e^t - 1)}.$$

Remark 1.4.1 $C_X$ is called the cumulant generating function of $X$. It generates cumulants of the random variable $X$: the first cumulant is the mean, the second cumulant is the variance, and the third cumulant is the third central moment of $X$.
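As a sketch of the last part of question 18 (assuming sympy), the cumulant generating function of $M_X(t) = e^{4(e^t - 1)}$ gives mean and variance both equal to $4$, consistent with the Poisson moment generating function $e^{\lambda(e^t - 1)}$ derived earlier with $\lambda = 4$.

```python
import sympy as sp

t = sp.symbols('t')

M = sp.exp(4 * (sp.exp(t) - 1))      # mgf from question 18
C = sp.log(M)                        # cumulant generating function

mean = sp.simplify(sp.diff(C, t).subs(t, 0))    # C'(0)  = 4
var = sp.simplify(sp.diff(C, t, 2).subs(t, 0))  # C''(0) = 4
print(mean, var)
```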