
Stochastic Processes 2
Probability Examples c-9
Leif Mejlbro

© 2009 Leif Mejlbro & Ventus Publishing ApS
ISBN 978-87-7681-525-7

Download free ebooks at bookboon.com

Contents
Introduction
1 Theoretical background
1.1 The Poisson process
1.2 Birth and death processes
1.3 Queueing theory in general
1.4 Queueing system of infinitely many shop assistants
1.5 Queueing system of a finite number of shop assistants, and with forming of queues
1.6 Queueing systems with a finite number of shop assistants and without queues
1.7 Some general types of stochastic processes
2 The Poisson process
3 Birth and death processes
4 Queueing theory
5 Other types of stochastic processes
Index


Introduction
This is the ninth book of examples from Probability Theory. The topic Stochastic Processes is so big that I have chosen to split it into two books. The previous (eighth) book treated examples of Random Walks and Markov chains, where the latter topic is dealt with in a fairly large chapter. In this book we give examples of Poisson processes, birth and death processes, queueing theory and other types of stochastic processes.

The prerequisites for the topics can e.g. be found in the Ventus: Calculus 2 series and the Ventus: Complex Function Theory series, and in all the previous Ventus: Probability c1-c7 books.

Unfortunately errors cannot be avoided in a first edition of a work of this type. However, the author has tried to keep them to a minimum, hoping that the reader will meet with sympathy the errors which do occur in the text.

Leif Mejlbro
27th October 2009

1 Theoretical background
1.1 The Poisson process
Consider a sequence of independent events, each occurring at a random time. We assume:

1. The probability that an event occurs in a time interval I ⊆ [0, +∞[ depends only on the length of the interval and not on where the interval lies on the time axis.

2. The probability of at least one event in a time interval of length t is equal to

$$\lambda t + t\,\varepsilon(t),$$

where λ > 0 is a given positive constant.

3. The probability of more than one event in a time interval of length t is t ε(t).

It follows that

4. The probability of no event in a time interval of length t is given by

$$1 - \lambda t + t\,\varepsilon(t).$$

5. The probability of precisely one event in a time interval of length t is λt + t ε(t).

Here ε(t) denotes some unspecified function which tends towards 0 as t → 0.

Given the assumptions above, we let X(t) denote the number of events in the interval
]0, t], and we put

Pk (t) := P {X(t) = k}, for k ∈ N0 .

Then X(t) is a Poisson distributed random variable of parameter λt. The process

{X(t) | t ∈ [0, +∞[}

is called a Poisson process, and the parameter λ is called the intensity of the Poisson process.

Concerning the Poisson process we have the following results:

1) If t = 0 (i.e. X(0) = 0), then

$$P_k(0) = \begin{cases} 1, & \text{for } k = 0,\\[1mm] 0, & \text{for } k \in \mathbb{N}. \end{cases}$$

2) If t > 0, then Pk(t) is a differentiable function, and

$$P_k'(t) = \begin{cases} \lambda \left\{P_{k-1}(t) - P_k(t)\right\}, & \text{for } k \in \mathbb{N} \text{ and } t > 0,\\[1mm] -\lambda\, P_0(t), & \text{for } k = 0 \text{ and } t > 0. \end{cases}$$

When we solve these differential equations, we get

$$P_k(t) = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad \text{for } k \in \mathbb{N}_0,$$

proving that X(t) is Poisson distributed with parameter λt.
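As a quick numerical sanity check (an illustration added here, not part of the original exposition), the closed form can be tested against the differential equations above: a central finite difference of Pk(t) should agree with λ{Pk−1(t) − Pk(t)}. A minimal Python sketch:

```python
import math

def poisson_pk(k: int, lam: float, t: float) -> float:
    """P_k(t) = (lam t)^k / k! * e^{-lam t}: probability of k events in ]0, t]."""
    return (lam * t) ** k / math.factorial(k) * math.exp(-lam * t)

def ode_residual(k: int, lam: float, t: float, h: float = 1e-6) -> float:
    """Residual of P_k'(t) = lam (P_{k-1}(t) - P_k(t)), with the convention
    P_{-1} = 0 (which reproduces P_0'(t) = -lam P_0(t)), using a central
    finite difference; it should be ~0 for the Poisson probabilities."""
    dpk = (poisson_pk(k, lam, t + h) - poisson_pk(k, lam, t - h)) / (2 * h)
    prev = poisson_pk(k - 1, lam, t) if k >= 1 else 0.0
    return dpk - lam * (prev - poisson_pk(k, lam, t))
```

The parameter values in any such check are of course arbitrary; the residual is small for every k and t > 0.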

Remark 1.1 Although Poisson processes are very common, they are most often applied in the theory of teletraffic. ♦

If X(t) is a Poisson process as described above, then X(s + t) − X(s) has the same distribution as X(t), thus

$$P\{X(s+t) - X(s) = k\} = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad \text{for } k \in \mathbb{N}_0.$$

If 0 ≤ t1 < t2 ≤ t3 < t4, then the two random variables X(t4) − X(t3) and X(t2) − X(t1) are independent. We say that the Poisson process has independent and stationary increments.

The mean value function of a Poisson process is

m(t) = E{X(t)} = λt.

The auto-covariance (covariance function) is given by

C(s, t) = Cov(X(s) , X(t)) = λ min{s, t}.


The auto-correlation is given by

$$R(s, t) = E\{X(s) \cdot X(t)\} = \lambda \min(s, t) + \lambda^2 st.$$

The event function of a Poisson process is a step function with values in N0, each step of size +1. We introduce the sequence of random variables T1, T2, . . . , which indicate the distances in time between two succeeding events of the Poisson process. Thus

$$Y_n = T_1 + T_2 + \cdots + T_n$$

is the time until the n-th event of the Poisson process.

Notice that T1 is exponentially distributed of parameter λ, thus

$$P\{T_1 > t\} = P\{X(t) = 0\} = e^{-\lambda t}, \qquad \text{for } t > 0.$$

All the random variables T1, T2, . . . , Tn are mutually independent and exponentially distributed of parameter λ, hence

$$Y_n = T_1 + T_2 + \cdots + T_n$$

is Gamma distributed, $Y_n \in \Gamma\!\left(n, \dfrac{1}{\lambda}\right)$.
Connection with Erlang's B-formula. Since Yn+1 > t if and only if X(t) ≤ n, we have

$$P\{X(t) \leq n\} = P\{Y_{n+1} > t\},$$

from which we derive that

$$\sum_{k=0}^{n} \frac{(\lambda t)^k}{k!}\, e^{-\lambda t} = \frac{\lambda^{n+1}}{n!} \int_t^{+\infty} y^n e^{-\lambda y}\, dy.$$

We have in particular for λ = 1,

$$\sum_{k=0}^{n} \frac{t^k}{k!} = \frac{e^t}{n!} \int_t^{+\infty} y^n e^{-y}\, dy, \qquad n \in \mathbb{N}_0.$$

1.2 Birth and death processes


Let {X(t) | t ∈ [0, +∞[} be a stochastic process, which can be in the states E0, E1, E2, . . . . The process can only move from one state to a neighbouring state in the following sense: If the process is in state Ek and we receive a positive signal, then the process is transferred to Ek+1, and if instead we receive a negative signal (and k ∈ N), then the process is transferred to Ek−1.

We assume that there are non-negative constants λk and μk , such that for k ∈ N,
1) P {one positive signal in ] t, t + h [| X(t) = k} = λk h + h ε(h).
2) P {one negative signal in ] t, t + h [| X(t) = k} = μk h + h ε(h).


3) P {no signal in ] t, t + h [| X(t) = k} = 1 − (λk + μk ) h + h ε(h).


We call λk the birth intensity at state Ek , and μk is called the death intensity at state Ek , and the
process itself is called a birth and death process. If in particular all μk = 0, we just call it a birth
process, and analogously a death process, if all λk = 0.

A simple analysis shows for k ∈ N and h > 0 that the event {X(t + h) = k} is realized in one of the following ways:
• X(t) = k, and no signal in ] t, t + h [.
• X(t) = k − 1, and one positive signal in ] t, t + h [.
• X(t) = k + 1, and one negative signal in ] t, t + h [.
• More signals in ] t, t + h [.
We put

Pk (t) = P {X(t) = k}.

By a rearrangement and taking the limit h → 0 we easily derive the differential equations of the process,

$$\begin{cases} P_0'(t) = -\lambda_0 P_0(t) + \mu_1 P_1(t), & \text{for } k = 0,\\[1mm] P_k'(t) = -(\lambda_k + \mu_k)\, P_k(t) + \lambda_{k-1} P_{k-1}(t) + \mu_{k+1} P_{k+1}(t), & \text{for } k \in \mathbb{N}. \end{cases}$$

In the special case of a pure birth process, where all μk = 0, this system is reduced to

$$\begin{cases} P_0'(t) = -\lambda_0 P_0(t), & \text{for } k = 0,\\[1mm] P_k'(t) = -\lambda_k P_k(t) + \lambda_{k-1} P_{k-1}(t), & \text{for } k \in \mathbb{N}. \end{cases}$$

If all λk > 0, we get the following iteration formula for the complete solution,

$$\begin{cases} P_0(t) = c_0\, e^{-\lambda_0 t}, & \text{for } k = 0,\\[1mm] P_k(t) = \lambda_{k-1}\, e^{-\lambda_k t} \displaystyle\int_0^t e^{\lambda_k \tau} P_{k-1}(\tau)\, d\tau + c_k\, e^{-\lambda_k t}, & \text{for } k \in \mathbb{N}. \end{cases}$$

From P0(t) we derive P1(t), etc. Finally, if we know the initial distribution, e.g. that at time t = 0 the process is in state Em, then we can find the values of the arbitrary constants ck.
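The iteration formula lends itself directly to numerical evaluation. The following Python sketch (illustrative; the grid resolution is an arbitrary choice) computes P0, P1, . . . for a pure birth process started in E0 (so c0 = 1 and ck = 0 for k ≥ 1); with constant λk = λ it must reproduce the Poisson probabilities:

```python
import math

def pure_birth_pk(lams, t, steps=4000):
    """Evaluate [P_0(t), ..., P_K(t)] for a pure birth process started in E_0,
    using P_k(t) = lam_{k-1} e^{-lam_k t} * integral_0^t e^{lam_k tau} P_{k-1}(tau) dtau,
    with the integral computed by the trapezoidal rule on a uniform grid."""
    h = t / steps
    grid = [i * h for i in range(steps + 1)]
    P = [math.exp(-lams[0] * tau) for tau in grid]      # P_0 on the grid
    values = [P[-1]]
    for k in range(1, len(lams)):
        lam_k, lam_prev = lams[k], lams[k - 1]
        integral, Pk = 0.0, [0.0]
        for i in range(1, steps + 1):
            f0 = math.exp(lam_k * grid[i - 1]) * P[i - 1]
            f1 = math.exp(lam_k * grid[i]) * P[i]
            integral += 0.5 * (f0 + f1) * h             # running trapezoidal sum
            Pk.append(lam_prev * math.exp(-lam_k * grid[i]) * integral)
        P = Pk
        values.append(P[-1])
    return values
```

For k = 1 the recursion gives P1(t) = λt e^{−λt} exactly, in agreement with the Poisson process.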

Let {X(t) | t ∈ [0, +∞[} be a birth and death process, where all λk and μk are positive, with the
exception of μ0 = 0, and λN = 0, if there is a final state EN . The process can be in any of the states,
therefore, in analogy with the Markov chains, such a birth and death process is called irreducible.
Processes like this often occur in queueing theory.

If there exists a state Ek, in which λk = μk = 0, then Ek is an absorbing state, because it is not possible to move away from Ek.

For the most common birth and death processes (including all irreducible processes) there exist non-negative constants pk, such that

$$P_k(t) \to p_k \quad \text{and} \quad P_k'(t) \to 0 \qquad \text{for } t \to +\infty.$$


These constants fulfil the infinite system of equations,

μk+1 pk+1 = λk pk , for k ∈ N0 ,

which sometimes can be used to find the pk .

If there is a solution (pk), which satisfies

$$p_k \geq 0 \ \text{ for all } k \in \mathbb{N}_0, \qquad \text{and} \qquad \sum_{k=0}^{+\infty} p_k = 1,$$

we say that the solution (pk) is a stationary distribution, and the pk are called the stationary probabilities. In this case we have

Pk (t) → pk for t → +∞.

If {X(t) | t ∈ [0, +∞[} is an irreducible process, then

$$p_k = \frac{\lambda_{k-1} \lambda_{k-2} \cdots \lambda_1 \lambda_0}{\mu_k \mu_{k-1} \cdots \mu_2 \mu_1} \cdot p_0 =: a_k\, p_0, \qquad \text{for } k \in \mathbb{N}_0,$$

where all ak > 0 (with a0 = 1).


The condition for the existence of a stationary distribution is then reduced to the requirement that the series $\sum_k a_k$ is convergent with finite sum a > 0. In this case we have $p_0 = \dfrac{1}{a}$.
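For a concrete irreducible process the ak immediately give the stationary distribution. A small Python sketch (an added illustration; the truncation point kmax is an arbitrary choice which must make the neglected tail of the series negligible):

```python
import math

def stationary(lam, mu, kmax):
    """Stationary probabilities p_k = a_k p_0 with
    a_k = (lam(0) ... lam(k-1)) / (mu(1) ... mu(k)),
    normalized over k = 0..kmax. lam and mu are callables giving lam_k, mu_k."""
    a = [1.0]                                   # a_0 = 1
    for k in range(1, kmax + 1):
        a.append(a[-1] * lam(k - 1) / mu(k))
    total = sum(a)                              # approximates the sum a of the series
    return [ak / total for ak in a]
```

With λk = λ and μk = kμ (the queueing system of Section 1.4) this reproduces the Poisson-shaped stationary distribution p_k = (λ/μ)^k e^{−λ/μ}/k!.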

1.3 Queueing theory in general


Let {X(t) | t ∈ [0, +∞[} be a birth and death process as described in the previous section. We shall interpret it as describing the services in a service organization, where "birth" corresponds to the arrival of a new customer, and "death" corresponds to the completion of the service of a customer. We introduce the following:

1) By the arrival distribution (the arrival process) we shall understand the distribution of the arrivals of the customers at the service (the shop). This distribution is often of Poisson type.
2) If the arrivals follow a Poisson process of intensity λ, then the random variable which indicates the time difference between two succeeding arrivals is exponentially distributed of parameter λ. We say that the arrivals follow an exponential distribution, and λ is called the arrival intensity.
3) The queueing system is described by the number of shop assistants or serving places, if there is
the possibility of forming queues or not, and the way a queue is handled. The serving places are
also called channels.
4) Concerning the service times we assume that if a service starts at time t, then the probability that it is ended at some time in the interval ]t, t + h[ is equal to

$$\mu h + h\,\varepsilon(h), \qquad \text{where } \mu > 0.$$

Then the service time is exponentially distributed of parameter μ.

If at time t we are dealing with k (mutually independent) services, then the probability that one of these is ended in the interval ]t, t + h[ is equal to

$$k \mu h + h\,\varepsilon(h).$$

We shall in the following sections consider the three most common types of queueing systems. Concerning other types, cf. e.g. Villy Bæk Iversen: Teletraffic Engineering and Network Planning, Technical University of Denmark.

1.4 Queueing system of infinitely many shop assistants


The model is described in the following way: Customers arrive at the service according to a Poisson process of intensity λ, and they immediately go to a free shop assistant, where they are serviced according to an exponential distribution of parameter μ.

The process is described by the following birth and death process,

{X(t) | t ∈ [0, +∞[} with λk = λ and μk = kμ for all k.

The process is irreducible, and the differential equations of the system are given by

$$\begin{cases} P_0'(t) = -\lambda P_0(t) + \mu P_1(t), & \text{for } k = 0,\\[1mm] P_k'(t) = -(\lambda + k\mu)\, P_k(t) + \lambda P_{k-1}(t) + (k+1)\mu\, P_{k+1}(t), & \text{for } k \in \mathbb{N}. \end{cases}$$

The stationary probabilities exist and satisfy the equations

$$(k+1)\mu\, p_{k+1} = \lambda\, p_k, \qquad k \in \mathbb{N}_0,$$

with the solutions

$$p_k = \frac{1}{k!} \left(\frac{\lambda}{\mu}\right)^k \exp\left(-\frac{\lambda}{\mu}\right), \qquad k \in \mathbb{N}_0.$$

These are the probabilities that there are k customers in the system, when we have obtained equilibrium.

The system of differential equations above is usually difficult to solve. One has, however, some partial results; e.g. the expected number of customers at time t, i.e.

$$m(t) := \sum_{k=1}^{+\infty} k\, P_k(t),$$

satisfies the simpler differential equation

$$m'(t) + \mu\, m(t) = \lambda.$$

If at time t = 0 there is no customer at the service, then

$$m(t) = \frac{\lambda}{\mu}\left(1 - e^{-\mu t}\right), \qquad \text{for } t \geq 0.$$
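That this m(t) actually solves the differential equation is easy to confirm numerically; the sketch below (an added illustration, with arbitrary parameter values in the check) verifies m(0) = 0 and the residual of m′(t) + μ m(t) = λ by a central finite difference:

```python
import math

def m(t: float, lam: float, mu: float) -> float:
    """Expected number of customers at time t, starting empty: (lam/mu)(1 - e^{-mu t})."""
    return lam / mu * (1.0 - math.exp(-mu * t))

def ode_residual(t: float, lam: float, mu: float, h: float = 1e-6) -> float:
    """Residual of m'(t) + mu m(t) = lam; should be ~0 for all t > 0."""
    dm = (m(t + h, lam, mu) - m(t - h, lam, mu)) / (2 * h)
    return dm + mu * m(t, lam, mu) - lam
```

As t → +∞ the mean tends to λ/μ, the offered traffic.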

1.5 Queueing system of a finite number of shop assistants, and with forming of queues
We consider the case where

1) the customers arrive according to a Poisson process of intensity λ,

2) the service times are exponentially distributed of parameter μ,


3) there are N shop assistants,

4) it is possible to form queues.

Spelled out, we have N shop assistants and a customer who arrives at state Ek. If k < N, then the customer goes to a free shop assistant and is immediately serviced. If however k ≥ N, thus all shop assistants are busy, then he joins a queue and waits until there is a free shop assistant. We assume here that the customers respect the queue (first come, first served).

With a slight change of the notation it follows that if there are N shop assistants and k customers (and not k states as above), where k > N, then there is a common queue for all shop assistants consisting of k − N customers.

This process is described by the following birth and death process {X(t) | t ∈ [0, +∞[} with the parameters

$$\lambda_k = \lambda \qquad \text{and} \qquad \mu_k = \begin{cases} k\mu, & \text{for } k < N,\\[1mm] N\mu, & \text{for } k \geq N. \end{cases}$$


The process is irreducible. The equations of the stationary probabilities are

$$\begin{cases} (k+1)\mu\, p_{k+1} = \lambda\, p_k, & \text{for } k < N,\\[1mm] N\mu\, p_{k+1} = \lambda\, p_k, & \text{for } k \geq N. \end{cases}$$

We introduce the traffic intensity by

$$\varrho := \frac{\lambda}{N\mu}.$$

Then we get the stationary probabilities

$$p_k = \begin{cases} \left(\dfrac{\lambda}{\mu}\right)^k \dfrac{1}{k!} \cdot p_0 = \dfrac{\varrho^k N^k}{k!} \cdot p_0, & \text{for } k < N,\\[3mm] \left(\dfrac{\lambda}{\mu}\right)^k \dfrac{1}{N^{k-N} \cdot N!} \cdot p_0 = \dfrac{\varrho^k N^N}{N!} \cdot p_0, & \text{for } k \geq N. \end{cases}$$

Remark 1.2 Together with the traffic intensity one also introduces in teletraffic the offered traffic. By this we mean the average number of customers who arrive at the system in a time interval of length equal to the mean service time. In the situation above the offered traffic is λ/μ. Both the traffic intensity and the offered traffic are dimensionless. They are both measured in the unit Erlang. ♦

The condition that the (pk) become stationary probabilities is that the traffic intensity ϱ < 1, where

$$\sum_{k=N}^{+\infty} \frac{N^N}{N!}\, \varrho^k = \frac{(\varrho N)^N}{(1 - \varrho) \cdot N!}.$$

If, however, ϱ ≥ 1, it is easily seen that the queue increases towards infinity, and there does not exist a stationary distribution.

We assume in the following that ϱ < 1, so the stationary probabilities exist.

1) If N = 1, then

$$p_k = \varrho^k (1 - \varrho), \qquad \text{for } k \in \mathbb{N}_0.$$

2) If N = 2, then

$$p_k = \begin{cases} \dfrac{1-\varrho}{1+\varrho}, & \text{for } k = 0,\\[3mm] 2\varrho^k \cdot \dfrac{1-\varrho}{1+\varrho}, & \text{for } k \in \mathbb{N}. \end{cases}$$

3) If N > 2, the formulæ become somewhat complicated, so they are not given here.


The average number of customers at the service is, under the given assumptions,

$$\begin{cases} \dfrac{\varrho}{1-\varrho}, & \text{for } N = 1,\\[3mm] \displaystyle\sum_{k=1}^{+\infty} k\, p_k, & \text{in general (of course)}. \end{cases}$$

The average number of busy shop assistants is

$$\begin{cases} \varrho, & \text{for } N = 1,\\[3mm] \displaystyle\sum_{k=1}^{N-1} k\, p_k + N \sum_{k=N}^{+\infty} p_k, & \text{in general}. \end{cases}$$


The waiting time of a customer is defined as the time elapsed from his arrival until his service starts. The staying time is the time from his arrival until he leaves the system after being served. Hence we have the splitting

staying time = waiting time + service time.

The average waiting time is in general given by

$$V = \sum_{k=N}^{+\infty} \frac{k - N + 1}{N\mu}\, p_k,$$

which by a computation is

$$V = \begin{cases} \dfrac{\varrho}{\mu(1-\varrho)}, & \text{for } N = 1,\\[3mm] \dfrac{\varrho^N \cdot N^{N-1}}{\mu \cdot N! \cdot (1-\varrho)^2} \cdot p_0, & \text{in general}. \end{cases}$$

In the special case of N = 1 the average staying time is given by

$$O = \frac{\varrho}{\mu(1-\varrho)} + \frac{1}{\mu} = \frac{1}{\mu - \lambda}.$$

The average length of the queue (i.e. the mean number of customers in the queue) is

$$\lambda V = \sum_{k=N+1}^{+\infty} (k - N)\, p_k = \frac{\varrho^{N+1} \cdot N^N}{N! \cdot (1-\varrho)^2} \cdot p_0.$$
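The closed expression for the mean queue length can be cross-checked against a direct summation of (k − N) pk over the stationary distribution. A Python sketch (an added illustration; the truncation kmax is an arbitrary choice, ample for ϱ well below 1):

```python
import math

def mmn_stationary(lam, mu, N, kmax=2000):
    """Stationary probabilities of the queueing system with N shop assistants
    and queueing allowed; requires rho = lam / (N mu) < 1."""
    rho = lam / (N * mu)
    assert rho < 1, "no stationary distribution for rho >= 1"
    a = lam / mu
    p = [a ** k / math.factorial(k) for k in range(N)]                     # k < N
    p += [a ** N / math.factorial(N) * rho ** (k - N) for k in range(N, kmax + 1)]
    total = sum(p)
    return [x / total for x in p]
```

For N = 1 this reduces to the geometric distribution p_k = ϱ^k (1 − ϱ) stated above.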

1.6 Queueing systems with a finite number of shop assistants and without
queues
We consider here the case where

1) the customers arrive according to a Poisson process of intensity λ,


2) the times of service are exponentially distributed of parameter μ,
3) there are N shop assistants or channels,

4) it is not possible to form a queue.


The difference from the previous section is that if a customer arrives at a time when all shop assistants
are busy, then he immediately leaves the system. Therefore, this is also called a system of rejection.


In this case the process is described by the following birth and death process {X(t) | t ∈ [0, +∞[} with a finite number of states E0, E1, . . . , EN, where the intensities are given by

$$\lambda_k = \begin{cases} \lambda, & \text{for } k < N,\\[1mm] 0, & \text{for } k \geq N, \end{cases} \qquad \text{and} \qquad \mu_k = k\mu.$$

This process is also irreducible. The corresponding system of differential equations is

$$\begin{cases} P_0'(t) = -\lambda P_0(t) + \mu P_1(t), & \text{for } k = 0,\\[1mm] P_k'(t) = -(\lambda + k\mu)\, P_k(t) + \lambda P_{k-1}(t) + (k+1)\mu\, P_{k+1}(t), & \text{for } 1 \leq k \leq N-1,\\[1mm] P_N'(t) = -N\mu\, P_N(t) + \lambda P_{N-1}(t), & \text{for } k = N. \end{cases}$$

In general, this system is too complicated for a reasonable solution, so instead we use the stationary probabilities, which are here given by Erlang's B-formula:

$$p_k = \frac{\dfrac{1}{k!}\left(\dfrac{\lambda}{\mu}\right)^k}{\displaystyle\sum_{j=0}^{N} \frac{1}{j!}\left(\frac{\lambda}{\mu}\right)^j}, \qquad \text{for } k = 0, 1, 2, \ldots, N.$$

The average number of customers who are being served is of course equal to the average number of busy shop assistants, or channels. The common value is

$$\sum_{k=1}^{N} k\, p_k = (1 - p_N)\, \frac{\lambda}{\mu}.$$

We notice that pN can be interpreted as the probability of rejection. This probability pN is large when λ ≫ μ. We get from

$$\sum_{j=0}^{N} \frac{1}{j!}\left(\frac{\lambda}{\mu}\right)^j = \frac{\exp\left(\dfrac{\lambda}{\mu}\right)}{N!} \int_{\lambda/\mu}^{+\infty} y^N e^{-y}\, dy,$$

the probability of rejection

$$p_N = \frac{\dfrac{1}{N!}\left(\dfrac{\lambda}{\mu}\right)^N}{\displaystyle\sum_{j=0}^{N} \frac{1}{j!}\left(\frac{\lambda}{\mu}\right)^j} = \frac{\left(\dfrac{\lambda}{\mu}\right)^N \exp\left(-\dfrac{\lambda}{\mu}\right)}{\displaystyle\int_{\lambda/\mu}^{+\infty} y^N e^{-y}\, dy}.$$
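In practice Erlang's B-formula is usually evaluated with the standard recursion B(0) = 1, B(n) = aB(n−1)/(n + aB(n−1)), where a = λ/μ, which avoids large factorials. The sketch below (an added illustration) compares the recursion with the direct formula:

```python
import math

def erlang_b_direct(N: int, a: float) -> float:
    """p_N = (a^N / N!) / sum_{j=0}^{N} a^j / j!, with a = lam/mu (offered traffic)."""
    denom = sum(a ** j / math.factorial(j) for j in range(N + 1))
    return a ** N / math.factorial(N) / denom

def erlang_b_recursive(N: int, a: float) -> float:
    """Numerically stable evaluation: B(0) = 1, B(n) = a B(n-1) / (n + a B(n-1))."""
    b = 1.0
    for n in range(1, N + 1):
        b = a * b / (n + a * b)
    return b
```

For N = 1 both give B = a/(1 + a), the rejection probability of a single channel.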


1.7 Some general types of stochastic processes


Given two stochastic processes, {X(t) | t ∈ T } and {Y (s) | s ∈ T }, where we assume that all the
moments below exist. We define
1) the mean value function,

m(t) := E{X(t)}, for t ∈ T,

2) the auto-correlation,

R(s, t) := E{X(s)X(t)}, for s, t ∈ T,

3) the auto-covariance,

C(s, t) := Cov(X(s), X(t)), for s, t ∈ T,

4) the cross-correlation,

RXY (s, t) := E{X(s)Y (t)}, for s, t ∈ T,

5) the cross-covariance,

CXY (s, t) := Cov(X(s), Y (t)), for s, t ∈ T.

A stochastic process {X(t) | t ∈ R} is strictly stationary, if the translated process {X(t + h) | t ∈ R} for every h ∈ R has the same distribution as {X(t) | t ∈ R}.

In this case we have for all n ∈ N, all x1 , . . . , xn ∈ R, and all t1 , . . . , tn ∈ R that

P {X (t1 + h) ≤ x1 ∧ · · · ∧ X (tn + h) ≤ xn }

does not depend on h ∈ R.

Since P {X(t) ≤ x} does not depend on t for such a process, we have

m(t) = m,

and the auto-covariance C(s, t) becomes a function in the real variable s − t. We therefore write in
this case,

C(s, t) := C(s − t).

Analogously, the auto-correlation is also a function depending only on s − t, so we write

R(s, t) := R(s − t).

Conversely, if m(t) = m and C(s, t) = C(s − t), then we call the stochastic process {X(t) | t ∈ R}
weakly stationary.


Let us consider a stochastic process {X(t) | t ∈ R} of mean 0 and auto-correlation

R(τ ) = E{X(t + τ )X(t)}.

If R(τ) is absolutely integrable, we define the power spectrum (effect spectrum) by

$$S(\omega) = \int_{-\infty}^{+\infty} e^{i\omega\tau} R(\tau)\, d\tau,$$

i.e. as the Fourier transform of R(τ). Furthermore, if we also assume that S(ω) is absolutely integrable, then we can apply the Fourier inversion formula to reconstruct R(τ) from the power spectrum,

$$R(\tau) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} e^{-i\omega\tau} S(\omega)\, d\omega.$$

In particular,

$$E\left\{|X(t)|^2\right\} = R(0) = \frac{1}{2\pi} \int_{-\infty}^{+\infty} S(\omega)\, d\omega.$$

A stochastic process {X(t) | t ∈ T} is called a normal process, or a Gaussian process, if for every n ∈ N and every t1, . . . , tn ∈ T the distribution of (X(t1), . . . , X(tn)) is an n-dimensional normal distribution. A normal process is always completely specified by its mean value function m(t) and its auto-covariance function C(s, t).

The most important normal process is the Wiener process, or Brownian motion,

{W (t) | t ≥ 0}.

This is characterized by
1) W (0) = 0,

2) m(t) = 0,

3) V {W (t)} = α t, where α is a positive constant,


4) mutually independent increments.


2 The Poisson process


Example 2.1 Let {X(t), t ∈ [0, ∞[} be a Poisson process of intensity λ, and let the random variable
T denote the time when the first event occurs.
Find the conditional distribution of T , given that at time t0 precisely one event has occurred, thus find

P {T ≤ t | X (t0 ) = 1} .

When t ∈ [0, t0], the conditional distribution is given by

$$P\{T \leq t \mid X(t_0) = 1\} = \frac{P\{X(t) = 1 \wedge X(t_0) = 1\}}{P\{X(t_0) = 1\}} = \frac{P\{X(t) = 1 \wedge X(t_0) - X(t) = 0\}}{P\{X(t_0) = 1\}}$$
$$= \frac{P\{X(t) = 1\} \cdot P\{X(t_0) - X(t) = 0\}}{P\{X(t_0) = 1\}} = \frac{\lambda t\, e^{-\lambda t} \cdot e^{-\lambda(t_0 - t)}}{\lambda t_0\, e^{-\lambda t_0}} = \frac{t}{t_0},$$

because

$$P_k(t) = P\{X(t) = k\} = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad k \in \mathbb{N}_0,$$

and where we furthermore have applied that X(t0) − X(t) has the same distribution as X(t0 − t).

The conditional distribution is a rectangular (uniform) distribution over ]0, t0[.
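The result can also be illustrated by simulation (added here as an illustration; the parameter values and seed are arbitrary): generate Poisson processes on ]0, t0], keep the runs with exactly one event, and check that the observed event times look uniform on ]0, t0[.

```python
import random

def first_event_given_one(lam, t0, trials, seed=1):
    """Simulate a Poisson process on ]0, t0] via exponential inter-arrival times;
    keep runs with exactly one event and return that event's time in each run."""
    rng = random.Random(seed)
    times = []
    for _ in range(trials):
        t, events = 0.0, []
        while True:
            t += rng.expovariate(lam)
            if t > t0:
                break
            events.append(t)
        if len(events) == 1:
            times.append(events[0])
    return times
```

The empirical mean of the retained times is close to t0/2, and the empirical probability of {T ≤ t} is close to t/t0, as the rectangular distribution predicts.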


Example 2.2 Let {X1(t), t ≥ 0} and {X2(t), t ≥ 0} denote two independent Poisson processes of intensities λ1 and λ2, respectively, and let the process {Y(t), t ≥ 0} be defined by
Y (t) = X1 (t) + X2 (t).
Prove that {Y (t), t ≥ 0} is a Poisson process.

We first identify

$$P_n(t) = P\{X_1(t) = n\} = \frac{(\lambda_1 t)^n}{n!}\, e^{-\lambda_1 t},$$

and

$$Q_n(t) = P\{X_2(t) = n\} = \frac{(\lambda_2 t)^n}{n!}\, e^{-\lambda_2 t}.$$

We get from X1(t) and X2(t) being independent that

$$P\{Y(t) = n\} = P\{X_1(t) + X_2(t) = n\} = \sum_{j=0}^{n} P\{X_1(t) = j\} \cdot P\{X_2(t) = n - j\}$$
$$= \sum_{j=0}^{n} \frac{(\lambda_1 t)^j}{j!}\, e^{-\lambda_1 t} \cdot \frac{(\lambda_2 t)^{n-j}}{(n-j)!}\, e^{-\lambda_2 t} = \frac{t^n}{n!}\, e^{-(\lambda_1 + \lambda_2)t} \sum_{j=0}^{n} \binom{n}{j} \lambda_1^j \lambda_2^{n-j}$$
$$= \frac{\left((\lambda_1 + \lambda_2)t\right)^n}{n!}\, \exp\left(-(\lambda_1 + \lambda_2)t\right).$$

It follows that {Y(t), t ≥ 0} is also a Poisson process (of intensity λ1 + λ2).
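A direct numerical convolution confirms the computation (an added illustration with arbitrary parameter values):

```python
import math

def pois(n: int, lam: float) -> float:
    """Poisson point probability with parameter lam."""
    return lam ** n / math.factorial(n) * math.exp(-lam)

def superposed(n: int, lam1: float, lam2: float, t: float) -> float:
    """P{X1(t) + X2(t) = n} by direct convolution of the two Poisson distributions."""
    return sum(pois(j, lam1 * t) * pois(n - j, lam2 * t) for j in range(n + 1))
```

The convolution agrees with the Poisson point probabilities of parameter (λ1 + λ2)t to machine precision.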

Example 2.3 A Geiger counter only records every second particle which arrives at the counter. Assume that the particles arrive according to a Poisson process of intensity λ. Denote by N(t) the number of particles recorded in ]0, t], where we assume that the first recorded particle is the second one to arrive.
1. Find P {N (t) = n}, n ∈ N0 .
2. Find E{N (t)}.
Let T denote the time difference between two succeeding recorded arrivals.
3. Find the frequency of T .
4. Find the mean E{T }.

1. It follows from

$$P_n(t) = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t}, \qquad n \in \mathbb{N}_0,$$

that

$$P\{N(t) = n\} = P_{2n}(t) + P_{2n+1}(t) = \left\{\frac{(\lambda t)^{2n}}{(2n)!} + \frac{(\lambda t)^{2n+1}}{(2n+1)!}\right\} e^{-\lambda t} = \frac{(\lambda t)^{2n}}{(2n+1)!}\,(2n + 1 + \lambda t)\, e^{-\lambda t}, \qquad n \in \mathbb{N}_0.$$


2. The mean is

$$E\{N(t)\} = \sum_{n=0}^{\infty} n\, P\{N(t) = n\} = e^{-\lambda t}\left\{\sum_{n=1}^{\infty} \frac{n(\lambda t)^{2n}}{(2n)!} + \sum_{n=1}^{\infty} \frac{n(\lambda t)^{2n+1}}{(2n+1)!}\right\}$$
$$= e^{-\lambda t}\left\{\frac{\lambda t}{2} \sum_{n=1}^{\infty} \frac{(\lambda t)^{2n-1}}{(2n-1)!} + \sum_{n=1}^{\infty} \frac{\left(n + \frac{1}{2}\right)(\lambda t)^{2n+1}}{(2n+1)!} - \frac{1}{2} \sum_{n=1}^{\infty} \frac{(\lambda t)^{2n+1}}{(2n+1)!}\right\}$$
$$= e^{-\lambda t}\left\{\frac{\lambda t}{2} \sinh \lambda t + \frac{\lambda t}{2}(\cosh \lambda t - 1) - \frac{1}{2}(\sinh \lambda t - \lambda t)\right\}$$
$$= e^{-\lambda t}\left\{\frac{\lambda t}{2}\, e^{\lambda t} - \frac{1}{4}\, e^{\lambda t} + \frac{1}{4}\, e^{-\lambda t}\right\} = \frac{\lambda t}{2} - \frac{1}{4} + \frac{1}{4}\, e^{-2\lambda t}.$$

3. & 4. It follows from T = Tj + Tj+1 that T ∈ Γ(2, 1/λ), thus the frequency is

$$f(x) = \begin{cases} \lambda^2 x\, e^{-\lambda x}, & \text{for } x > 0,\\[1mm] 0, & \text{for } x \leq 0, \end{cases}$$

and the mean is

$$E\{T\} = \frac{2}{\lambda}.$$
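Both the distribution and the mean can be checked numerically (an added illustration; the truncation nmax = 60 is an arbitrary choice, ample for moderate λt):

```python
import math

def p_recorded(n: int, lam: float, t: float) -> float:
    """P{N(t) = n} for the counter that records every second particle."""
    x = lam * t
    return x ** (2 * n) / math.factorial(2 * n + 1) * (2 * n + 1 + x) * math.exp(-x)

def mean_recorded(lam: float, t: float, nmax: int = 60) -> float:
    """Truncated evaluation of E{N(t)} = sum_n n P{N(t) = n}."""
    return sum(n * p_recorded(n, lam, t) for n in range(nmax + 1))
```

The truncated sum agrees with the closed form λt/2 − 1/4 + e^{−2λt}/4, and the probabilities sum to 1.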

Example 2.4 From a ferry port a ferry sails every quarter of an hour. Each ferry can carry N cars. The cars arrive at the ferry port according to a Poisson process of intensity λ (measured in quarter⁻¹).
Assuming that there is no car in the ferry port immediately after a ferry has sailed at 9:00, one shall

1) find the probability that there is no car waiting at 9:15 (immediately after the departure of the next ferry),

2) find the probability that no car is waiting at 9:30 (immediately after the departure of the next ferry).

3) A motorist arrives at 9:07½. What is the probability that he will not catch the ferry at 9:15, but instead the ferry at 9:30?

Measuring t in the unit quarter of an hour we have

$$P\{X(t) = n\} = \frac{(\lambda t)^n}{n!}\, e^{-\lambda t}, \qquad n \in \mathbb{N}_0.$$

1) With t = 1 it follows that the wanted probability is

$$P\{X(1) \leq N\} = \sum_{n=0}^{N} \frac{\lambda^n}{n!}\, e^{-\lambda}.$$

2) We have two possibilities:


a) Either at most N cars arrived during the first quarter of an hour, so that all of them are carried over, in which case up to N cars may arrive during the second quarter,

b) or N + j cars arrived during the first quarter, 1 ≤ j ≤ N, and at most N − j cars during the second quarter.

We therefore get the probability

$$P\{X(1) \leq N\} \cdot P\{X(1) \leq N\} + \sum_{j=1}^{N} P\{X(1) = N+j\} \cdot P\{X(1) \leq N-j\}$$
$$= \left(\sum_{n=0}^{N} \frac{\lambda^n}{n!}\, e^{-\lambda}\right)^2 + \sum_{j=1}^{N} \frac{\lambda^{N+j}}{(N+j)!}\, e^{-\lambda} \sum_{n=0}^{N-j} \frac{\lambda^n}{n!}\, e^{-\lambda}$$
$$= e^{-2\lambda}\left\{\left(\sum_{n=0}^{N} \frac{\lambda^n}{n!}\right)^2 + \sum_{n=0}^{N-1} \sum_{j=1}^{N-n} \frac{\lambda^{N+j+n}}{n!\,(N+j)!}\right\}.$$

3) The time 9:07½ corresponds to t = 1/2. The motorist misses the 9:15 ferry if at least N cars have arrived before him, and he still catches the 9:30 ferry if at most 2N − 1 cars have arrived before him (the 9:15 ferry takes N of them, and cars arriving after him queue behind him). The probability is therefore

$$P\left\{N \leq X\left(\tfrac{1}{2}\right) \leq 2N - 1\right\} = \sum_{j=0}^{N-1} P\left\{X\left(\tfrac{1}{2}\right) = N + j\right\} = \exp\left(-\frac{\lambda}{2}\right) \sum_{n=N}^{2N-1} \frac{1}{n!}\left(\frac{\lambda}{2}\right)^n.$$
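The double sum obtained in question 2 can be cross-checked by conditioning directly on the number of arrivals in each quarter (an added illustration; cars left behind by the 9:15 ferry must be cleared, together with the second quarter's arrivals, by the 9:30 ferry):

```python
import math

def pois(n: int, lam: float) -> float:
    return lam ** n / math.factorial(n) * math.exp(-lam)

def no_queue_direct(lam: float, N: int) -> float:
    """P{no car waiting immediately after the 9:30 departure}, by enumeration."""
    total = 0.0
    for x1 in range(2 * N + 1):      # more than 2N first-quarter arrivals cannot be cleared
        left = max(x1 - N, 0)        # cars the 9:15 ferry could not take
        total += pois(x1, lam) * sum(pois(x2, lam) for x2 in range(N - left + 1))
    return total

def no_queue_formula(lam: float, N: int) -> float:
    """The closed double-sum expression for question 2."""
    s1 = sum(lam ** n / math.factorial(n) for n in range(N + 1)) ** 2
    s2 = sum(lam ** (N + j + n) / (math.factorial(n) * math.factorial(N + j))
             for n in range(N) for j in range(1, N - n + 1))
    return math.exp(-2 * lam) * (s1 + s2)
```

The two evaluations agree to machine precision for any λ > 0 and N ∈ N.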

Example 2.5 Paradox of waiting time.

Each morning Mr. Smith in X-borough takes the bus to his place of work. The busses of X-borough should according to the timetables run with an interval of 20 minutes. It is, however, well-known in X-borough that the busses mostly arrive at random times at the bus stops (meaning mathematically that the arrivals of the busses follow a Poisson process of intensity λ = 1/20 min⁻¹, because the average time difference between two succeeding busses is 20 minutes).
One day, when Mr. Smith has been waiting an extraordinarily long time for his bus, he starts reasoning about how long he on average must wait for the bus, and he develops two ways of reasoning:

1) The time distance between two succeeding busses is exponentially distributed of mean 20 minutes, and since the exponential distribution is "forgetful", the average waiting time must be 20 minutes.

2) He arrives at a random time between two succeeding busses, so by "symmetry" the average waiting time is instead ½ · 20 minutes = 10 minutes.

At this moment Mr. Smith's bus arrives, and he forgets to think about this contradiction.
Can you decide which of the two arguments is correct and explain the mistake in the wrong argument?

The argument of (1) is correct. The mistake of (2) is that the length of the time interval in which Mr. Smith arrives is not exponentially distributed. In fact, there is a tendency for Mr. Smith to arrive in one of the longer intervals.

This is more precisely described in the following way. Let t denote Mr. Smith’s arrival time. Then


1)

$$P\{\text{wait more than } x \text{ minutes}\} = P\{N(t+x) - N(t) = 0\} = P\{N(x) = 0\} = e^{-\lambda x}.$$

This shows that the waiting time is exponentially distributed of mean 1/λ = 20 minutes.
2) Let X1, X2, . . . denote the lengths of the succeeding intervals between the arrivals of the busses. By the assumptions, the Xk are mutually independent and exponentially distributed of parameter λ.
Put

$$S_n = \sum_{k=1}^{n} X_k.$$

The surprise is that the length X_{k+1} of the interval containing the time t, i.e. the X_{k+1} for which

$$S_k = \sum_{j=1}^{k} X_j < t < \sum_{j=1}^{k+1} X_j = S_{k+1},$$

has the frequency

(1) $$f_t(x) = \begin{cases} \lambda^2 x\, e^{-\lambda x}, & 0 < x \leq t,\\[1mm] \lambda(1 + \lambda t)\, e^{-\lambda x}, & t < x. \end{cases}$$


We shall now prove (1). First notice that the frequencies of the Sn are given by

λn
gn (x) = xn−1 e−λx , x > 0.
(n − 1)!

(a) First assume that x < t. Then the even occurs that the interval has the length ≤ x, if

Sn = y and t − y < Xn+1 ≤ x,

for some combination of n and y, where t − x < y ≤ t.


Then
$$F_t(x) = \sum_{n=1}^{\infty} \int_{t-x}^{t} g_n(y)\left(e^{-\lambda(t-y)} - e^{-\lambda x}\right) dy
= \int_{t-x}^{t} \sum_{n=1}^{\infty} g_n(y)\cdot \left(e^{-\lambda(t-y)} - e^{-\lambda x}\right) dy$$
$$= \lambda \int_{t-x}^{t} \left(e^{-\lambda t} e^{\lambda y} - e^{-\lambda x}\right) dy
= \lambda e^{-\lambda t} \int_{t-x}^{t} e^{\lambda y}\, dy - \lambda x\, e^{-\lambda x}
= 1 - e^{-\lambda x} - \lambda x\, e^{-\lambda x},$$

where we have used that
$$\sum_{n=1}^{\infty} g_n(y) = \lambda \sum_{n=1}^{\infty} \frac{(\lambda y)^{n-1}}{(n-1)!}\, e^{-\lambda y} = \lambda.$$

By a differentiation,
$$f_t(x) = \lambda^2 x\, e^{-\lambda x} \qquad \text{for } x \le t.$$

(b) Then let x > t. The event occurs that the interval has length ≤ x, if either

Sn = y and t − y < Xn+1 ≤ x

for some combination of n and y, or if S1 ∈ [t, x].


Then
$$F_t(x) = \sum_{n=1}^{\infty} \int_{0}^{t} g_n(y)\left(e^{-\lambda(t-y)} - e^{-\lambda x}\right) dy + \left(e^{-\lambda t} - e^{-\lambda x}\right)$$
$$= \lambda \int_{0}^{t} \left(e^{-\lambda(t-y)} - e^{-\lambda x}\right) dy + e^{-\lambda t} - e^{-\lambda x}
= 1 - e^{-\lambda t} - \lambda t\, e^{-\lambda x} + e^{-\lambda t} - e^{-\lambda x} = 1 - (1+\lambda t)\, e^{-\lambda x}.$$
By differentiation,
$$f_t(x) = \lambda(1+\lambda t)\, e^{-\lambda x} \qquad \text{for } x > t.$$


We have now found the distribution, so we can compute the mean
$$\mu(t) = \int_0^{\infty} x f_t(x)\, dx = \int_0^t \lambda^2 x^2 e^{-\lambda x}\, dx + \int_t^{\infty} \lambda x (1+\lambda t)\, e^{-\lambda x}\, dx$$
$$= \left[-\lambda x^2 e^{-\lambda x} - 2x\, e^{-\lambda x}\right]_0^t + 2\int_0^t e^{-\lambda x}\, dx + (1+\lambda t)\left[-x\, e^{-\lambda x} - \frac{1}{\lambda}\, e^{-\lambda x}\right]_t^{\infty}$$
$$= -\lambda t^2 e^{-\lambda t} - 2t\, e^{-\lambda t} + \frac{2}{\lambda}\left(1 - e^{-\lambda t}\right) + (1+\lambda t)\left(t\, e^{-\lambda t} + \frac{1}{\lambda}\, e^{-\lambda t}\right)$$
$$= -\lambda t^2 e^{-\lambda t} - 2t\, e^{-\lambda t} + \frac{2}{\lambda} - \frac{2}{\lambda}\, e^{-\lambda t} + t\, e^{-\lambda t} + \frac{1}{\lambda}\, e^{-\lambda t} + \lambda t^2 e^{-\lambda t} + t\, e^{-\lambda t}
= \frac{2}{\lambda} - \frac{1}{\lambda}\, e^{-\lambda t}.$$
An interpretation of this result is that for large values of $t$, i.e. when the Poisson process has been working for such a long time that some busses have arrived, the mean is almost equal to $\frac{2}{\lambda}$, and definitely not $\frac{1}{\lambda}$, which Mr. Smith tacitly has used in his second argument.



    
  

Example 2.6 Denote by {X(t), t ≥ 0} a Poisson process of intensity a, and let ξ be a fixed positive
number. We define a random variable V by

V = inf{v ≥ ξ | there is no event from the Poisson process in the interval ]v − ξ, v]}.

[Figure: a time axis with the event times $\tau_1$, $\tau_2$, $\tau_3$, $\tau_4$ marked, and the quantities $\xi$ and $V$ indicated.]

(On the figure the τi indicate the times of the i-th event of the Poisson process, V the first time when
we have had an interval of length ξ without any event).
1) Prove that the distribution function $F(v)$ of $V$ fulfils
$$(2)\qquad F(v) = \begin{cases} e^{-a\xi} + \displaystyle\int_0^{\xi} F(v-x)\, a e^{-ax}\, dx, & v \ge \xi, \\[2mm] 0, & v < \xi. \end{cases}$$

2) Prove that the Laplace transform of $V$ is given by
$$L(\lambda) = \frac{(a+\lambda)\, e^{-(a+\lambda)\xi}}{\lambda + a\, e^{-(a+\lambda)\xi}}.$$
Hint: Use that
$$\int_0^{\infty} F(v)\, e^{-\lambda v}\, dv = \frac{1}{\lambda}\, L(\lambda) \qquad \text{for } \lambda > 0.$$

3) Find the mean E{V }.


(In a one-way single-track street, cars are driving according to a Poisson process of intensity $a$; a pedestrian needs the time $\xi$ to cross the street; then $V$ indicates the time when he has come safely across the street.)

The assumptions are
$$P\{X(t) = n\} = \frac{(a t)^n}{n!}\, e^{-at}, \qquad n \in \mathbb{N}_0,$$
and
$$P\{T_1 > t\} = P\{X(t) = 0\} = e^{-at}.$$


1) Clearly, $F(v) = 0$ if $v < \xi$. If $v = \xi$, then
$$F(v) = F(\xi) = P\{T_1 > \xi\} = P\{X(\xi) = 0\} = e^{-a\xi}.$$
If $v > \xi$, we condition on the time $T_1$ of the first event. If $T_1 > \xi$, then $V = \xi$; if $T_1 = x \in\, ]0,\xi]$, the process starts afresh at time $x$, so the residual waiting time is distributed as $V$ itself, and $V \le v$ if and only if this residual waiting time is $\le v - x$. We are led to the following computation,
$$F(v) = P\{V \le v\} = P\{V = \xi\} + P\{\xi < V \le v\}$$
$$(3)\qquad = e^{-a\xi} + \int_0^{\xi} P\{V \le v - x\}\, a e^{-ax}\, dx = e^{-a\xi} + \int_0^{\xi} F(v-x)\, a e^{-ax}\, dx.$$
Here (3) is a generalized sum (i.e. an integral) over the possible values $x \in\, ]0,\xi]$ of $T_1$, each of which contributes to $F(v)$.

2) If $L(\lambda) = \int_0^{\infty} f(v)\, e^{-\lambda v}\, dv$ denotes the Laplace transform of $V$, then
$$\int_0^{\infty} F(v)\, e^{-\lambda v}\, dv = \frac{1}{\lambda} \int_0^{\infty} f(v)\, e^{-\lambda v}\, dv = \frac{1}{\lambda}\, L(\lambda) \qquad \text{for } \lambda > 0.$$

When we Laplace transform the result of (2), then
$$\frac{1}{\lambda}\, L(\lambda) = \int_{\xi}^{\infty} e^{-a\xi}\, e^{-\lambda v}\, dv + \int_0^{\infty} \left\{\int_0^{\xi} F(v-x)\, a e^{-ax}\, dx\right\} e^{-\lambda v}\, dv$$
$$= \frac{1}{\lambda}\, e^{-(a+\lambda)\xi} + \int_0^{\xi} \left\{\int_x^{\infty} F(v-x)\, e^{-\lambda v}\, dv\right\} a e^{-ax}\, dx$$
$$= \frac{1}{\lambda}\, e^{-(a+\lambda)\xi} + \int_0^{\xi} \left\{\int_0^{\infty} F(v)\, e^{-\lambda v}\, dv\right\} e^{-\lambda x}\cdot a e^{-ax}\, dx$$
$$= \frac{1}{\lambda}\, e^{-(a+\lambda)\xi} + \frac{1}{\lambda}\, L(\lambda) \int_0^{\xi} a\, e^{-(\lambda+a)x}\, dx
= \frac{1}{\lambda}\, e^{-(a+\lambda)\xi} + \frac{1}{\lambda}\, L(\lambda)\cdot \frac{a}{a+\lambda}\left(1 - e^{-(a+\lambda)\xi}\right),$$
thus
$$e^{-(a+\lambda)\xi} = L(\lambda)\left(1 - \frac{a}{a+\lambda} + \frac{a}{a+\lambda}\, e^{-(a+\lambda)\xi}\right) = L(\lambda)\cdot \frac{\lambda + a\, e^{-(a+\lambda)\xi}}{a+\lambda},$$
and hence
$$L(\lambda) = \frac{(a+\lambda)\, e^{-(a+\lambda)\xi}}{\lambda + a\, e^{-(a+\lambda)\xi}}.$$


3) The mean is $E\{V\} = -L'(0)$. Differentiation of $L(\lambda)$ and taking the limit $\lambda \to 0+$ gives
$$E\{V\} = \lim_{\lambda\to 0+}\left(-\frac{e^{-(a+\lambda)\xi} - \xi(a+\lambda)\, e^{-(a+\lambda)\xi}}{\lambda + a\, e^{-(a+\lambda)\xi}} + \frac{(a+\lambda)\, e^{-(a+\lambda)\xi}\left(1 - a\xi\, e^{-(a+\lambda)\xi}\right)}{\left(\lambda + a\, e^{-(a+\lambda)\xi}\right)^2}\right)$$
$$= -\frac{e^{-a\xi} - \xi a\, e^{-a\xi}}{a\, e^{-a\xi}} + \frac{a\, e^{-a\xi}\left(1 - a\xi\, e^{-a\xi}\right)}{\left(a\, e^{-a\xi}\right)^2}
= \frac{-e^{-a\xi} + \xi a\, e^{-a\xi} + 1 - a\xi\, e^{-a\xi}}{a\, e^{-a\xi}}
= \frac{1 - e^{-a\xi}}{a\, e^{-a\xi}} = \frac{1}{a}\left(e^{a\xi} - 1\right).$$
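As a sanity check of $E\{V\} = \frac{1}{a}(e^{a\xi} - 1)$, one can simulate the pedestrian directly: keep drawing $\mathrm{Exp}(a)$ gaps between cars until a gap of length at least $\xi$ appears. The values $a = 1$ and $\xi = 1$ below are illustrative assumptions.

```python
import math
import random

def crossing_time(a, xi, rng):
    """First time v >= xi with no event in ]v - xi, v]: the pedestrian
    crosses as soon as a gap >= xi occurs in the Exp(a) gap sequence."""
    v = 0.0
    while True:
        gap = rng.expovariate(a)
        if gap >= xi:
            return v + xi
        v += gap

rng = random.Random(2)
a, xi, n = 1.0, 1.0, 200_000
mean_sim = sum(crossing_time(a, xi, rng) for _ in range(n)) / n
mean_theory = (math.exp(a * xi) - 1.0) / a
print(mean_sim, mean_theory)  # both close to e - 1 = 1.718...
```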

Example 2.7 To a taxi rank taxis arrive from the south according to a Poisson process of intensity
a, and independently there also arrive taxis from the north according to a Poisson process of intensity
b.
We denote by X the random variable which indicates the number of taxies, which arrive from the
south in the time interval between two succeeding taxi arrivals from the north.
Find P {X = k}, k ∈ N0 , as well as the mean and variance of X.

The length of the time interval between two succeeding arrivals from the north has the frequency

f (t) = b e−bt , t > 0.

When this length is a (fixed) $t$, then the number of arriving taxis from the south is Poisson distributed of parameter $a t$. By the law of total probability,
$$P\{X = k\} = \int_0^{\infty} \frac{(a t)^k}{k!}\, e^{-at} \cdot b\, e^{-bt}\, dt = \frac{b\, a^k}{k!} \int_0^{\infty} t^k e^{-(a+b)t}\, dt
= \frac{b\, a^k}{k!} \cdot \frac{k!}{(a+b)^{k+1}} = \frac{b}{a+b}\left(\frac{a}{a+b}\right)^k, \qquad k \in \mathbb{N}_0,$$
so $X \in NB\left(1, \frac{b}{a+b}\right)$ is negative binomially distributed.
It follows by some formula in any textbook that
$$E\{X\} = 1\cdot \frac{a}{b} = \frac{a}{b} \qquad \text{and} \qquad V\{X\} = \frac{a(a+b)}{b^2} = \frac{a}{b}\left(1 + \frac{a}{b}\right).$$


Example 2.8 The number of car accidents in a given region is assumed to follow a Poisson process {X(t), t ∈ [0, ∞[} of intensity λ, and the number of persons involved in the i-th accident is a random variable $Y_i$, which is geometrically distributed,
$$P\{Y_i = k\} = p\, q^{k-1}, \qquad k \in \mathbb{N},$$
where p > 0, q > 0 and p + q = 1. We assume that the $Y_i$ are mutually independent, and independent of {X(t), t ≥ 0}.
1. Find the generating function of X(t).
2. Find the generating function of $Y_i$.
Denote by Z(t) the total number of persons involved in accidents in the time interval ]0, t].
3. Describe the generating function of Z(t) expressed by the generating function of $Y_i$ and the generating function of X(t).
Hint: Use that
$$P\{Z(t) = k\} = \sum_{i=0}^{\infty} P\{X(t) = i \wedge Y_1 + Y_2 + \cdots + Y_i = k\}.$$

4. Compute E{Z(t)} and V {Z(t)}.

1) Since X(t) is a Poisson process, we have
$$P\{X(t) = k\} = \frac{(\lambda t)^k}{k!}\, e^{-\lambda t}, \qquad k \in \mathbb{N}_0.$$
We find its generating function by using a table,
$$P_{X(t)}(s) = \exp(\lambda t (s-1)).$$

2) Also, by using a table, the generating function of $Y_i$ is
$$P_{Y_i}(s) = \frac{p s}{1 - q s}.$$
The $Y_i$ are mutually independent, so the generating function of $Y_1 + \cdots + Y_i$ is given by
$$\left(\frac{p s}{1 - q s}\right)^i, \qquad i \in \mathbb{N}.$$

3) The generating function of Z(t) is
$$P_{Z(t)}(s) = \sum_{k=0}^{\infty} P\{Z(t) = k\}\, s^k = \sum_{k=0}^{\infty} \sum_{i=0}^{\infty} P\{X(t) = i \wedge Y_1 + \cdots + Y_i = k\}\, s^k$$
$$= \sum_{i=0}^{\infty} P\{X(t) = i\} \sum_{k=0}^{\infty} P\{Y_1 + \cdots + Y_i = k\}\, s^k
= \sum_{i=0}^{\infty} P\{X(t) = i\} \left(\frac{p s}{1 - q s}\right)^i$$
$$= P_{X(t)}\left(\frac{p s}{1 - q s}\right) = \exp\left(\lambda t \left(\frac{p s}{1 - q s} - 1\right)\right) = \exp\left(\lambda t\cdot \frac{s - 1}{1 - q s}\right).$$


4) It follows from
$$P'_{Z(t)}(s) = \lambda t\, \frac{p}{(1-qs)^2}\, P_{Z(t)}(s) \qquad \text{with } P'_{Z(t)}(1) = \frac{\lambda t}{p},$$
and
$$P''_{Z(t)}(s) = \left(\lambda t\, \frac{p}{(1-qs)^2}\right)^2 P_{Z(t)}(s) + \lambda t\, \frac{2pq}{(1-qs)^3}\, P_{Z(t)}(s),$$
where
$$P''_{Z(t)}(1) = \left(\frac{\lambda t}{p}\right)^2 + \lambda t\cdot \frac{2q}{p^2},$$
that
$$E\{Z(t)\} = P'_{Z(t)}(1) = \frac{\lambda t}{p}$$
and
$$V\{Z(t)\} = P''_{Z(t)}(1) + P'_{Z(t)}(1) - \left(P'_{Z(t)}(1)\right)^2 = \left(\frac{\lambda t}{p}\right)^2 + \lambda t\cdot \frac{2q}{p^2} + \frac{\lambda t}{p} - \left(\frac{\lambda t}{p}\right)^2
= \lambda t\cdot \frac{2q+p}{p^2} = \lambda t\cdot \frac{1+q}{p^2}.$$
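The moment formulas can be verified by simulating the compound process $Z(t)$ directly. The parameters $\lambda = 2$, $t = 3$ and $p = 0.5$ below are illustrative assumptions; the Poisson and geometric variables are sampled with elementary methods so only the standard library is needed.

```python
import random

def poisson(mean, rng):
    """Poisson sample: count how many Exp(1) gaps fit in [0, mean]."""
    k, s = 0, rng.expovariate(1.0)
    while s < mean:
        k += 1
        s += rng.expovariate(1.0)
    return k

def geometric(p, rng):
    """P{Y = k} = p q^(k-1), k = 1, 2, ...: trials until first success."""
    y = 1
    while rng.random() >= p:
        y += 1
    return y

rng = random.Random(5)
lam, t, p, n = 2.0, 3.0, 0.5, 100_000
zs = [sum(geometric(p, rng) for _ in range(poisson(lam * t, rng))) for _ in range(n)]
mean_sim = sum(zs) / n
var_sim = sum((z - mean_sim) ** 2 for z in zs) / n
q = 1 - p
print(mean_sim, var_sim)  # close to lam*t/p = 12 and lam*t*(1+q)/p**2 = 36
```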


Example 2.9 (Continuation of Example 2.8).
Assume that the number of car accidents in a city follows a Poisson process {X(t), t ∈ [0, ∞[} of intensity 2 per day. The number of persons involved in one accident is assumed to be geometrically distributed with $p = \frac{1}{2}$.
Find the mean and variance of the number of persons involved in car accidents in the city per week.

It follows from Example 2.8 that
$$E\{Z(t)\} = \frac{\lambda t}{p} \qquad \text{and} \qquad V\{Z(t)\} = \lambda t\cdot \frac{1+q}{p^2}.$$

In the specific case the intensity is $\lambda = 2$, and the time span is $t = 7$ days. Furthermore, $p = q = \frac{1}{2}$, thus
$$E\{Z(7)\} = \frac{2\cdot 7}{\frac{1}{2}} = 28$$
and
$$V\{Z(7)\} = 2\cdot 7\cdot \frac{1+\frac{1}{2}}{\left(\frac{1}{2}\right)^2} = 2\cdot 7\cdot 6 = 84.$$

Example 2.10 Given a service to which customers arrive according to a Poisson process of intensity λ (measured in the unit minute$^{-1}$).
Denote by I1 , I2 and I3 three succeeding time intervals, each of the length of 1 minute.
1. Find the probability that there is no customer in any of the three intervals.
2. Find the probability that there is precisely one arrival of a customer in one of these intervals and
none in the other two.
3. Find the probability that there are in total three arrivals in the time intervals $I_1$, $I_2$ and $I_3$, where precisely two of them occur in one of these intervals.
4. Find the value of λ, for which the probability found in 3. is largest.
Then consider 12 succeeding time intervals, each of length 1 minute. Let the random variable Z denote
the number of intervals, in which we have no arrival.
5. Find the distribution of Z.
6. For λ = 1 find the probability P {Z = 4} (2 dec.).

1) Let
$$I_1 = \,]0,1], \qquad I_2 = \,]1,2], \qquad I_3 = \,]2,3].$$
Then
$$P\{\text{no event in } I_1 \cup I_2 \cup I_3 = \,]0,3]\} = \left(e^{-\lambda}\right)^3 = e^{-3\lambda}.$$


2) By a rearrangement,
$$P\{\text{one event in one interval, none in the other two}\} = P\{\text{one event in } ]0,3]\} = 3\lambda\, e^{-3\lambda}.$$
3) We have
P{two events in one interval, one in another one, and none in the remaining one}
= P{two events in one interval, one in the remaining two intervals}
$$= 3\cdot \frac{\lambda^2}{2}\, e^{-\lambda}\cdot 2\lambda\, e^{-2\lambda} = 3\lambda^3 e^{-3\lambda}.$$

4) We conclude from 3. that $g(\lambda) = 3\lambda^3 e^{-3\lambda} > 0$ for $\lambda > 0$ with $g(\lambda) \to 0$ for $\lambda \to 0+$ and for $\lambda \to \infty$. By a differentiation,
$$g'(\lambda) = \left(9\lambda^2 - 9\lambda^3\right) e^{-3\lambda} = 9\lambda^2 (1-\lambda)\, e^{-3\lambda} = 0 \qquad \text{for } \lambda = 1 > 0,$$
thus the probability is largest for $\lambda = 1$ with $g(1) = 3\, e^{-3}$.


Please click the advert

The financial industry needs a strong software platform


That’s why we need you
SimCorp is a leading provider of software solutions for the financial industry. We work together to reach a common goal: to help our clients
succeed by providing a strong, scalable IT platform that enables growth, while mitigating risk and reducing cost. At SimCorp, we value
commitment and enable you to make the most of your ambitions and potential.
Find your next challenge at
Are you among the best qualified in finance, economics, IT or mathematics? www.simcorp.com/careers

www.simcorp.com

MITIGATE RISK REDUCE COST ENABLE GROWTH

Download free ebooks at bookboon.com

32
Stochastic Processes 2 2. The Poisson process

5) Assume now that we have 12 intervals. From
$$P\{\text{no arrival in an interval}\} = e^{-\lambda},$$
we get
$$P\{Z = k\} = \binom{12}{k} \left(e^{-\lambda}\right)^k \left(1 - e^{-\lambda}\right)^{12-k}, \qquad k = 0, 1, 2, \ldots, 12,$$
thus $Z \in B\left(12,\, e^{-\lambda}\right)$.
6) By insertion of $\lambda = 1$ and $k = 4$ into the result of 5. we get
$$P\{Z = 4\} = \binom{12}{4} \left(e^{-1}\right)^4 \left(1 - e^{-1}\right)^{8} = 495\cdot e^{-4}\cdot \left(1 - e^{-1}\right)^8 \approx 0.2311 \approx 0.23.$$
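The binomial probability in 6. is a one-liner to evaluate:

```python
import math

p0 = math.exp(-1.0)  # P{no arrival in a 1-minute interval} for lambda = 1
prob = math.comb(12, 4) * p0**4 * (1 - p0)**8
print(round(prob, 4))  # 0.2311
```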

Example 2.11 A random variable X is Poisson distributed with parameter a.
1. Compute the characteristic function of X.
2. Prove for large values of a that X is approximately normally distributed of mean a and variance a (more precisely,
$$\lim_{a\to\infty} P\left\{\frac{X-a}{\sqrt{a}} \le x\right\} = \Phi(x) \qquad \text{for all } x \in \mathbb{R}).$$

To a service customers arrive according to a Poisson process of intensity λ = 1 minute$^{-1}$. Denote by X the number of customers who arrive in a time interval of length 100 minutes.
3. Apply Chebyshev's inequality to find a lower bound of
$$(4)\qquad P\{80 < X < 120\}.$$
4. Find an approximate expression of (4) by using the result of 2..

1) We get from
$$P\{X = k\} = \frac{a^k}{k!}\, e^{-a}, \qquad k \in \mathbb{N}_0,$$
the characteristic function
$$k_X(\omega) = \sum_{k=0}^{\infty} e^{i\omega k}\cdot \frac{a^k}{k!}\, e^{-a} = e^{-a} \sum_{k=0}^{\infty} \frac{1}{k!}\left(a\, e^{i\omega}\right)^k = e^{-a}\cdot \exp\left(a\, e^{i\omega}\right) = \exp\left(a\left(e^{i\omega} - 1\right)\right).$$
k=0 k=0

2) Put
$$X_a = \frac{X - a}{\sqrt{a}}.$$
Then the characteristic function of $X_a$ is given by
$$k_{X_a}(\omega) = \sum_{k=0}^{\infty} \exp\left(i\omega\cdot \frac{k-a}{\sqrt{a}}\right) \frac{a^k}{k!}\, e^{-a}
= e^{-i\omega\sqrt{a}}\cdot e^{-a} \sum_{k=0}^{\infty} \frac{1}{k!}\left(a \exp\left(\frac{i\omega}{\sqrt{a}}\right)\right)^k$$
$$= e^{-i\omega\sqrt{a}}\cdot e^{-a} \exp\left(a \exp\left(\frac{i\omega}{\sqrt{a}}\right)\right)
= \exp\left(a\left(\exp\left(\frac{i\omega}{\sqrt{a}}\right) - 1\right) - i\omega\sqrt{a}\right).$$
It follows from
$$a\left(\exp\left(\frac{i\omega}{\sqrt{a}}\right) - 1\right) - i\omega\sqrt{a}
= a\left(1 + \frac{i\omega}{\sqrt{a}} - \frac{1}{2!}\cdot\frac{\omega^2}{a} + \frac{1}{a}\,\varepsilon\!\left(\frac{1}{a}\right) - 1\right) - i\omega\sqrt{a}
= -\frac{1}{2}\,\omega^2 + \varepsilon\!\left(\frac{1}{a}\right) \to -\frac{1}{2}\,\omega^2 \qquad \text{for } a \to \infty,$$
that
$$k(\omega) = \lim_{a\to\infty} k_{X_a}(\omega) = \exp\left(-\frac{1}{2}\,\omega^2\right),$$
hence $k(\omega)$ is the characteristic function of a normally distributed random variable from N(0, 1). It follows that $\{X_a\}$ for $a \to \infty$ converges in distribution towards the normal distribution N(0, 1), thus
$$\lim_{a\to\infty} P\left\{\frac{X-a}{\sqrt{a}} \le x\right\} = \Phi(x) \qquad \text{for every } x \in \mathbb{R}.$$

3) If $t = 100$ and $\lambda = 1$ minute$^{-1}$, then
$$P\{X = n\} = \frac{100^n}{n!}\, e^{-100}, \qquad n \in \mathbb{N}_0,$$
hence $a = 100$ and $\sigma^2 = 100$. Then by Chebyshev's inequality,
$$P\{|X - 100| \ge 20\} \le \frac{100}{20^2} = \frac{1}{4},$$
so
$$P\{80 < X < 120\} = 1 - P\{|X - 100| \ge 20\} \ge 1 - \frac{1}{4} = \frac{3}{4}.$$
4) An approximate expression of
$$P\{80 < X < 120\} = P\{|X - 100| < 20\} = P\left\{\left|\frac{X-100}{10}\right| < 2\right\}$$
is then by 2. given by
$$\Phi(2) - \Phi(-2) = 2\Phi(2) - 1 \approx 2\cdot 0.9772 - 1 = 0.9544.$$
However, since X is an integer, we must here use the correction of continuity. Then the interval should be $80.5 < x < 119.5$. We get the improved approximate expression,
$$P\{80.5 < X < 119.5\} = P\{|X - 100| < 19.5\} = P\left\{\left|\frac{X-100}{10}\right| < 1.95\right\}$$
$$\approx \Phi(1.95) - \Phi(-1.95) = 2\Phi(1.95) - 1 \approx 2\cdot 0.9744 - 1 = 0.9488.$$


Remark 2.1 For comparison a long and tedious computation on a pocket calculator gives
P {80 < X < 120} ≈ 0.9491. ♦
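The "long and tedious computation" of the exact probability is immediate on a computer; the sketch below evaluates the Poisson pmf through logarithms to avoid overflow in $100^k/k!$.

```python
import math

def poisson_pmf(a, k):
    # exp(k ln a - a - ln k!) evaluated stably via lgamma
    return math.exp(k * math.log(a) - a - math.lgamma(k + 1))

exact = sum(poisson_pmf(100.0, k) for k in range(81, 120))
print(round(exact, 4))  # compare with the normal approximation 0.9488
```

The exact value lies above the Chebyshev lower bound 3/4 and very close to the continuity-corrected normal approximation.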

Example 2.12 In a shop there are two shop assistants A and B. Customers may freely choose if they
will queue up at A or at B, but they cannot change their decision afterwards. For all customers at A
their serving times are mutually independent random variables of the frequency

$$f(x) = \begin{cases} \lambda\, e^{-\lambda x}, & x > 0, \\ 0, & x \le 0, \end{cases} \qquad (\lambda \text{ is a positive constant}),$$
and for the customers at B the serving times are mutually independent random variables of frequency
$$g(y) = \begin{cases} 2\lambda\, e^{-2\lambda y}, & y > 0, \\ 0, & y \le 0. \end{cases}$$
At a given time Andrew arrives and is queueing up at A, where there in front of him is only one
customer, and where the service of this customer has just begun. We call the serving time of this
customer X1 , while Andrew’s serving time is called X2 .
At the same time Basil arrives and joins the queue at B, where there in front of him are two waiting
customers, and where the service of the first customer has just begun. The service times of these two
customers are denoted Y1 and Y2 , resp..
1. Find the frequencies of the random variables X1 + X2 and Y1 + Y2 .
2. Express by means of the random variables Y1 , Y2 and X1 the event that the service of Basil starts
after the time when the service of Andrew has started, and find the probability of this event.
3. Find the probability that the service of Basil starts after the end of the service of Andrew.
Assume that the customers arrive to the shop according to a Poisson process of intensity α.
4. Find the expected number of customers, who arrive to the shop in a time interval of length t.
5. Let N denote the random variable, which indicates the number of customers who arrive to the shop
during the time when Andrew is in the shop (thus X1 + X2 ). Find the mean of N .

   
1) Since $X_i \in \Gamma\left(1, \frac{1}{\lambda}\right)$ is exponentially distributed, we have $X_1 + X_2 \in \Gamma\left(2, \frac{1}{\lambda}\right)$, thus
$$f_{X_1+X_2}(x) = \begin{cases} \lambda^2 x\, e^{-\lambda x}, & x \ge 0, \\ 0, & x < 0. \end{cases}$$
Since $Y_i \in \Gamma\left(1, \frac{1}{2\lambda}\right)$, we have $Y_1 + Y_2 \in \Gamma\left(2, \frac{1}{2\lambda}\right)$ with the frequency
$$g_{Y_1+Y_2}(y) = \begin{cases} 4\lambda^2 y\, e^{-2\lambda y}, & y \ge 0, \\ 0, & y < 0. \end{cases}$$


2) The event is expressed by $X_1 < Y_1 + Y_2$. The probability of this event is
$$P\{X_1 < Y_1 + Y_2\} = \iint_{\{0<x<y\}} \lambda\, e^{-\lambda x}\cdot 4\lambda^2 y\, e^{-2\lambda y}\, dx\, dy
= \int_0^{\infty} 4\lambda^2 y\, e^{-2\lambda y}\left\{\int_0^y \lambda\, e^{-\lambda x}\, dx\right\} dy$$
$$= \int_0^{\infty} 4\lambda^2 y\, e^{-2\lambda y}\left(1 - e^{-\lambda y}\right) dy
= \int_0^{\infty} 4\lambda^2 y\, e^{-2\lambda y}\, dy - \int_0^{\infty} 4\lambda^2 y\, e^{-3\lambda y}\, dy
= \int_0^{\infty} t\, e^{-t}\, dt - \frac{4}{9}\int_0^{\infty} t\, e^{-t}\, dt = \frac{5}{9}.$$

3) We must have in this case that $X_1 + X_2 < Y_1 + Y_2$. Hence the probability is
$$P\{X_1 + X_2 < Y_1 + Y_2\} = \iint_{\{0<x<y\}} \lambda^2 x\, e^{-\lambda x}\cdot 4\lambda^2 y\, e^{-2\lambda y}\, dx\, dy
= \int_0^{\infty} 4\lambda^2 y\, e^{-2\lambda y}\left\{\int_0^y \lambda^2 x\, e^{-\lambda x}\, dx\right\} dy.$$
Here
$$\int_0^y \lambda^2 x\, e^{-\lambda x}\, dx = \left[-\lambda x\, e^{-\lambda x}\right]_0^y + \int_0^y \lambda\, e^{-\lambda x}\, dx = 1 - e^{-\lambda y} - \lambda y\, e^{-\lambda y},$$
hence
$$P\{X_1 + X_2 < Y_1 + Y_2\} = \int_0^{\infty} 4\lambda^2 y\, e^{-2\lambda y}\left(1 - e^{-\lambda y}\right) dy - \int_0^{\infty} 4\lambda^3 y^2 e^{-3\lambda y}\, dy$$
$$= P\{X_1 < Y_1 + Y_2\} - \frac{4}{27}\int_0^{\infty} (3\lambda)^3 y^2 e^{-3\lambda y}\, dy = \frac{5}{9} - \frac{4}{27}\cdot 2 = \frac{15-8}{27} = \frac{7}{27}.$$

4) If X(t) indicates the number of arrived customers in ]0, t], then
$$P\{X(t) = n\} = \frac{(\alpha t)^n}{n!}\, e^{-\alpha t}, \qquad n \in \mathbb{N}_0,$$
and
$$m(t) = E\{X(t)\} = \sum_{n=0}^{\infty} n\, \frac{(\alpha t)^n}{n!}\, e^{-\alpha t} = \alpha t.$$
5) Finally, (cf. 4.),
$$E\{N\} = \alpha\, E\{X_1 + X_2\} = \alpha\left(\frac{1}{\lambda} + \frac{1}{\lambda}\right) = \frac{2\alpha}{\lambda}.$$
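The two probabilities $\frac{5}{9}$ and $\frac{7}{27}$ from 2. and 3. are easy to confirm by simulation ($\lambda = 1$ below is an arbitrary illustrative choice; the probabilities do not depend on $\lambda$):

```python
import random

rng = random.Random(6)
lam, n = 1.0, 200_000
c1 = c2 = 0
for _ in range(n):
    x1, x2 = rng.expovariate(lam), rng.expovariate(lam)          # serving times at A
    y1, y2 = rng.expovariate(2 * lam), rng.expovariate(2 * lam)  # serving times at B
    c1 += x1 < y1 + y2
    c2 += x1 + x2 < y1 + y2
print(c1 / n, c2 / n)  # close to 5/9 = 0.556 and 7/27 = 0.259
```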


3 Birth and death processes


Example 3.1 Consider a birth process {X(t), t ∈ [0, ∞[} of states E₀, E₁, E₂, ... and positive birth intensities $\lambda_k$. The differential equations of the process are
$$P_0'(t) = -\lambda_0 P_0(t), \qquad P_k'(t) = -\lambda_k P_k(t) + \lambda_{k-1} P_{k-1}(t), \quad k \in \mathbb{N},$$
and we assume that the process at t = 0 is in state E₀. It can be proved that the differential equations have a uniquely determined solution $(P_k(t))$ satisfying
$$P_k(t) \ge 0, \qquad \sum_{k=0}^{\infty} P_k(t) \le 1.$$



One can also prove that either $\sum_{k=0}^{\infty} P_k(t) = 1$ for all t > 0, or $\sum_{k=0}^{\infty} P_k(t) < 1$ for all t > 0.
Prove that
$$\sum_{k=0}^{\infty} P_k(t) = 1 \text{ for all } t > 0, \qquad \text{if and only if} \qquad \sum_{k=0}^{\infty} \frac{1}{\lambda_k} \text{ is divergent.}$$
Hint: First prove that
$$\frac{1}{\lambda_k}\, a(t) \le \int_0^t P_k(s)\, ds \le \frac{1}{\lambda_k}, \qquad k \in \mathbb{N}_0, \quad t > 0,$$
where $a(t) = 1 - \sum_{k=0}^{\infty} P_k(t)$.

We get by a rearrangement and recursion,
$$\lambda_k P_k(t) = -P_k'(t) + \lambda_{k-1} P_{k-1}(t) = -P_k'(t) - P_{k-1}'(t) + \lambda_{k-2} P_{k-2}(t) = \cdots = -\sum_{j=0}^{k} P_j'(t),$$
hence by integration,
$$\lambda_k \int_0^t P_k(s)\, ds = \left[-\sum_{j=0}^{k} P_j(s)\right]_0^t = \sum_{j=0}^{k} P_j(0) - \sum_{j=0}^{k} P_j(t) = 1 - \sum_{j=0}^{k} P_j(t),$$
because at time t = 0 we are in state E₀, so $P_0(0) = 1$ and $P_j(0) = 0$, $j \in \mathbb{N}$.


Thus we have the estimates
$$a(t) = 1 - \sum_{j=0}^{\infty} P_j(t) \le 1 - \sum_{j=0}^{k} P_j(t) = \lambda_k \int_0^t P_k(s)\, ds \le 1,$$
from which
$$\frac{1}{\lambda_k}\, a(t) \le \int_0^t P_k(s)\, ds \le \frac{1}{\lambda_k}.$$




Assume that $\sum_{k=0}^{\infty} P_k(t) = 1$. Applying the theorem of monotonous convergence (NB: the Lebesgue integral!) it follows from the right hand inequality that
$$\sum_{k=0}^{\infty} \frac{1}{\lambda_k} \ge \sum_{k=0}^{\infty} \int_0^t P_k(s)\, ds = \int_0^t \sum_{k=0}^{\infty} P_k(s)\, ds = \int_0^t 1\, ds = t \qquad \text{for all } t \in \mathbb{R}_+,$$
proving that the series $\sum_{k=0}^{\infty} \frac{1}{\lambda_k}$ is divergent.


Then assume that $\sum_{k=0}^{\infty} P_k(t) < 1$, thus
$$a(t) = 1 - \sum_{k=0}^{\infty} P_k(t) > 0.$$
Using the theorem of monotonous convergence and the left hand inequality we get
$$\left(\sum_{k=0}^{\infty} \frac{1}{\lambda_k}\right) a(t) \le \sum_{k=0}^{\infty} \int_0^t P_k(s)\, ds \le t \qquad \text{for all } t \in \mathbb{R}_+.$$
Now a(t) > 0, so this implies that
$$\sum_{k=0}^{\infty} \frac{1}{\lambda_k} \le \frac{t}{a(t)} < \infty,$$
and the series $\sum_{k=0}^{\infty} \frac{1}{\lambda_k}$ is convergent.

Example 3.2 To a carpark, cars arrive from 9:00 (t = 0) following a Poisson process of intensity λ. There are in total N parking bays, and we assume that no car leaves the carpark. Let $E_n$, n = 0, 1, ..., N, denote the state that n of the parking bays are occupied.
1) Find the differential equations of the system.
2) Find $P_n(t)$, n = 0, 1, ..., N.
3) Find the stationary probabilities $p_n$, n = 0, 1, ..., N.
Put λ = 1 minute$^{-1}$ and N = 5. Find the probability that a car driver who arrives at 9:03 cannot find a vacant parking bay.

1) This is a pure birth process with
$$\lambda_n = \begin{cases} \lambda & \text{for } n = 0, 1, \ldots, N-1, \\ 0 & \text{for } n = N, \end{cases}$$
and the system of differential equations
$$P_0'(t) = -\lambda P_0(t), \qquad P_n'(t) = -\lambda P_n(t) + \lambda P_{n-1}(t), \quad n = 1, 2, \ldots, N-1, \qquad P_N'(t) = \lambda P_{N-1}(t),$$
and initial conditions
$$P_n(0) = \begin{cases} 1 & \text{for } n = 0, \\ 0 & \text{for } n > 0. \end{cases}$$

2) The system of 1. can either be solved successively or by consulting a textbook,
$$P_n(t) = \begin{cases} \dfrac{(\lambda t)^n}{n!}\, e^{-\lambda t}, & n = 0, 1, 2, \ldots, N-1, \\[2mm] 1 - \displaystyle\sum_{j=0}^{N-1} \frac{(\lambda t)^j}{j!}\, e^{-\lambda t}, & n = N. \end{cases}$$
3) It follows immediately that
$$P_n(t) \to \begin{cases} 0, & n < N, \\ 1, & n = N, \end{cases} \qquad \text{for } t \to \infty,$$
thus
$$p_n = \begin{cases} 0, & n < N, \\ 1, & n = N. \end{cases}$$


4) First identify
$$\lambda = 1 \text{ minute}^{-1}, \qquad t = 3, \qquad N = 5.$$
Then by insertion,
$$P\{\text{no parking bay at } 9\!:\!03\} = P_5(3) = 1 - \sum_{n=0}^{4} P_n(3) = 1 - \sum_{n=0}^{4} \frac{3^n}{n!}\, e^{-3} = 0.1847 \approx 0.185.$$
n!

Example 3.3 Given a stochastic birth and death process X(t), t ∈ [0, ∞[}, which can be in the states
E4 , E5 , E6 and E7 .
Assume that the birth intensity $\lambda_k$ in state $E_k$ is given by
$$\lambda_k = \alpha k (7-k),$$
and that the death intensity $\mu_k$ in state $E_k$ is equal to
$$\mu_k = \beta k (k-4),$$
where α and β are positive constants.


Find the stationary probabilities in each of the two cases below
1) β = α,
2) β = 2α.

The equations of equilibrium are here
$$\mu_{k+1}\, p_{k+1} = \lambda_k\, p_k \qquad \text{for } k = 4, 5, 6.$$
Thus
$$p_5 = \frac{\lambda_4}{\mu_5}\, p_4 = \frac{12\alpha}{5\beta}\, p_4 = \frac{12}{5}\left(\frac{\alpha}{\beta}\right) p_4,$$
$$p_6 = \frac{\lambda_5}{\mu_6}\, p_5 = \frac{10\alpha}{12\beta}\cdot \frac{12}{5}\left(\frac{\alpha}{\beta}\right) p_4 = 2\left(\frac{\alpha}{\beta}\right)^2 p_4,$$
$$p_7 = \frac{\lambda_6}{\mu_7}\, p_6 = \frac{6\alpha}{21\beta}\cdot 2\left(\frac{\alpha}{\beta}\right)^2 p_4 = \frac{4}{7}\left(\frac{\alpha}{\beta}\right)^3 p_4.$$
Furthermore,
$$p_4 + p_5 + p_6 + p_7 = 1.$$

However, the exact values can first be found when we know the relationship between α and β.
1) If β = α, then
$$1 = p_4\left(1 + \frac{12}{5} + 2 + \frac{4}{7}\right) = p_4\cdot \frac{35 + 84 + 70 + 20}{35} = \frac{209}{35}\, p_4,$$


hence
$$p_4 = \frac{35}{209}, \qquad p_5 = \frac{12}{5}\cdot \frac{35}{209} = \frac{84}{209}, \qquad p_6 = \frac{70}{209}, \qquad p_7 = \frac{4}{7}\cdot \frac{35}{209} = \frac{20}{209},$$
so
$$\mathbf{p} = (p_4, p_5, p_6, p_7) = \frac{1}{209}\,(35, 84, 70, 20).$$

2) If β = 2α, then $\frac{\alpha}{\beta} = \frac{1}{2}$, hence
$$p_5 = \frac{6}{5}\, p_4, \qquad p_6 = \frac{1}{2}\, p_4, \qquad p_7 = \frac{1}{14}\, p_4,$$
and
$$1 = p_4 + p_5 + p_6 + p_7 = p_4\left(1 + \frac{6}{5} + \frac{1}{2} + \frac{1}{14}\right) = p_4\cdot \frac{70 + 84 + 35 + 5}{70} = \frac{97}{35}\, p_4,$$
from which
$$p_4 = \frac{35}{97}, \qquad p_5 = \frac{42}{97}, \qquad p_6 = \frac{35}{194}, \qquad p_7 = \frac{5}{194},$$
i.e.
$$\mathbf{p} = (p_4, p_5, p_6, p_7) = \frac{1}{194}\,(70, 84, 35, 5).$$
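Both cases follow from the same detailed-balance recursion, so they can be verified with exact rational arithmetic (a sketch using Python's fractions module):

```python
from fractions import Fraction

def stationary(alpha, beta):
    """Stationary distribution on E4..E7 from mu_{k+1} p_{k+1} = lambda_k p_k."""
    lam = {k: alpha * k * (7 - k) for k in (4, 5, 6)}  # birth intensities
    mu = {k: beta * k * (k - 4) for k in (5, 6, 7)}    # death intensities
    p = {4: Fraction(1)}
    for k in (4, 5, 6):
        p[k + 1] = p[k] * lam[k] / mu[k + 1]
    total = sum(p.values())
    return {k: v / total for k, v in p.items()}

print(stationary(Fraction(1), Fraction(1)))  # (35, 84, 70, 20)/209
print(stationary(Fraction(1), Fraction(2)))  # (70, 84, 35, 5)/194
```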


Example 3.4 Given a birth and death process of the states E₀, E₁, E₂, ..., birth intensities $\lambda_k$ and death intensities $\mu_k$. Assume furthermore that
a. $\lambda_k = \mu_k = k\,\alpha$, $k \in \mathbb{N}_0$ (where α is a positive constant).
b. $P_1(0) = 1$.
1. Find the differential equations of the process.
One may now without proof use that under the assumptions above,
$$P_1(t) = \frac{1}{(1+\alpha t)^2}.$$
2. Find $P_0(t)$, $P_2(t)$ and $P_3(t)$.
3. Sketch the graph of $P_0(t) + P_1(t)$.
4. Sketch the graph of $P_2(t)$.
5. Find $\lim_{t\to\infty} P_n(t)$ for every $n \in \mathbb{N}_0$.

1) We have
$$P_0'(t) = -\lambda_0 P_0(t) + \mu_1 P_1(t) = \alpha\, P_1(t),$$
and
$$P_k'(t) = -(\lambda_k + \mu_k)\, P_k(t) + \lambda_{k-1} P_{k-1}(t) + \mu_{k+1} P_{k+1}(t)
= (k-1)\alpha P_{k-1}(t) - 2k\alpha P_k(t) + (k+1)\alpha P_{k+1}(t) \qquad \text{for } k \in \mathbb{N}.$$

2) If $P_1(0) = 1$, then $P_k(0) = 0$ for $k \in \mathbb{N}_0 \setminus \{1\}$. It follows from
$$P_0'(t) = \alpha\, P_1(t) = \frac{\alpha}{(1+\alpha t)^2},$$
by an integration that
$$P_0(t) = \int_0^t \frac{\alpha\, d\tau}{(1+\alpha\tau)^2} = \left[-\frac{1}{1+\alpha\tau}\right]_0^t = 1 - \frac{1}{1+\alpha t} = \frac{\alpha t}{1+\alpha t}.$$
If k = 1, we get by a rearrangement,
$$P_2(t) = \frac{1}{2\alpha}\left\{P_1'(t) - 0\cdot P_0(t) + 2\alpha\, P_1(t)\right\} = \frac{1}{2\alpha}\left\{-\frac{2\alpha}{(1+\alpha t)^3} + \frac{2\alpha}{(1+\alpha t)^2}\right\}
= \frac{1}{(1+\alpha t)^2} - \frac{1}{(1+\alpha t)^3} = \frac{\alpha t}{(1+\alpha t)^3}.$$


If k = 2, we get by a rearrangement,
$$P_3(t) = \frac{1}{3\alpha}\left\{P_2'(t) - \alpha P_1(t) + 4\alpha P_2(t)\right\}
= \frac{1}{3\alpha}\left\{\frac{\alpha}{(1+\alpha t)^3} - \frac{3\alpha^2 t}{(1+\alpha t)^4} - \frac{\alpha}{(1+\alpha t)^2} + \frac{4\alpha^2 t}{(1+\alpha t)^3}\right\}$$
$$= \frac{1}{3\alpha}\left\{\frac{3\alpha}{(1+\alpha t)^4} - \frac{6\alpha}{(1+\alpha t)^3} + \frac{3\alpha}{(1+\alpha t)^2}\right\}
= \frac{(1+\alpha t)^2 - 2(1+\alpha t) + 1}{(1+\alpha t)^4} = \frac{\alpha^2 t^2}{(1+\alpha t)^4}.$$
Summing up,
$$P_0(t) = \frac{\alpha t}{1+\alpha t}, \qquad P_1(t) = \frac{1}{(1+\alpha t)^2}, \qquad P_2(t) = \frac{\alpha t}{(1+\alpha t)^3}, \qquad P_3(t) = \frac{\alpha^2 t^2}{(1+\alpha t)^4}.$$

[Figure 1: The graph of $1 - \dfrac{x}{(1+x)^2}$ with $x = \alpha t$.]

3) It follows that
$$P_0(t) + P_1(t) = \frac{\alpha t}{1+\alpha t} + \frac{1}{(1+\alpha t)^2} = \frac{1 + \alpha t + \alpha^2 t^2}{(1+\alpha t)^2} = 1 - \frac{\alpha t}{(1+\alpha t)^2}.$$
If we put $x = \alpha t$, we see that we shall only sketch
$$1 - \frac{x}{(1+x)^2} = 1 - \frac{1}{1+x} + \frac{1}{(1+x)^2},$$
which has a minimum for x = 1, and has y = 1 as an asymptote.


[Figure 2: The graph of $\dfrac{x}{(1+x)^3}$ with $x = \alpha t$.]

4) If we put $x = \alpha t$, it follows that we shall only sketch
$$\varphi(x) = \frac{x}{(1+x)^3}.$$
From
$$\varphi'(x) = \frac{1}{(1+x)^3} - \frac{3x}{(1+x)^4} = \frac{1-2x}{(1+x)^4},$$
follows that we have a maximum for $x = \frac{1}{2}$, corresponding to
$$\varphi\!\left(\frac{1}{2}\right) = \frac{\frac{1}{2}}{\left(\frac{3}{2}\right)^3} = \frac{4}{27}.$$

5) Clearly,
$$\lim_{t\to\infty} P_0(t) = \lim_{t\to\infty} \frac{\alpha t}{1+\alpha t} = 1.$$
We conclude from
$$\sum_{n=0}^{\infty} P_n(t) = 1 \qquad \text{and} \qquad P_n(t) \ge 0,$$
that
$$\lim_{t\to\infty} \sum_{n=1}^{\infty} P_n(t) = 0,$$
hence
$$\lim_{t\to\infty} P_n(t) = 0 \qquad \text{for all } n \in \mathbb{N}.$$
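The closed forms can be checked by integrating the differential equations numerically. The sketch below uses a plain Euler scheme and truncates the infinite system at K = 60 states; the values α = 1, the time horizon T and the step size dt are all illustrative assumptions.

```python
alpha, K, dt, T = 1.0, 60, 2e-4, 2.0
P = [0.0] * K
P[1] = 1.0  # initial condition P_1(0) = 1

for _ in range(int(T / dt)):
    dP = [alpha * P[1]]  # P_0'(t) = alpha * P_1(t)
    for k in range(1, K):
        up = (k + 1) * alpha * P[k + 1] if k + 1 < K else 0.0
        dP.append((k - 1) * alpha * P[k - 1] - 2 * k * alpha * P[k] + up)
    P = [p + dt * d for p, d in zip(P, dP)]

u = 1.0 + alpha * T
closed = [alpha * T / u, 1.0 / u**2, alpha * T / u**3, (alpha * T) ** 2 / u**4]
print([round(p, 4) for p in P[:4]])
print([round(c, 4) for c in closed])
```

The numerically integrated $P_0, \ldots, P_3$ agree with $\frac{\alpha t}{1+\alpha t}$, $\frac{1}{(1+\alpha t)^2}$, $\frac{\alpha t}{(1+\alpha t)^3}$, $\frac{\alpha^2 t^2}{(1+\alpha t)^4}$ up to the discretisation error.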


Example 3.5 A power station delivers electricity to N customers. If a customer at time t uses
electricity there is the probability μh + h ε(h) that he does not use electricity at time t + h, and
probability 1 − μh + h ε(h) that he is still using electricity at time t + h.
However, if he to time t does not use electricity, then there is the probability λh + h ε(h) that he uses
electricity at time t + h, and probability 1 − λh + h ε(h) that he does not do it.
The customers are using electricity mutually independently.
Denote by Ek the state that k consumers use electricity, k = 0, 1, . . . , N .
Find the differential equations of the system.
Find the stationary probabilities.

We put $X_k(t) = 1$, if the k-th customer uses electricity at time t, and $X_k(t) = 0$, if he does not do it. Let n and j ∈ {0, 1, ..., N}, and assume that the system is in state $E_j$, i.e.
$$\sum_{k=1}^{N} X_k(t) = j \qquad \text{at time } t.$$

How can we realize that we are in state En at time t + h?

There must be an m ∈ {0, 1, ..., j}, such that j − m of the customers who were using electricity at time t are still using electricity at time t + h. Furthermore, n − j + m of the customers, who did not use electricity at time t, must use electricity at time t + h, if we are in state $E_n$. Thus we get the condition m ≥ j − n, so
$$m \in \{\max\{0, j-n\}, \ldots, \min\{j, N-n\}\}, \qquad \text{and} \qquad j \in \{0, 1, \ldots, N\}.$$


Summing up, if the conditions above are fulfilled, then


1) m of the customers, who used electricity at time t, do not do it at time t + h.
2) j − m use electricity both at time t and at time t + h.
3) n − j + m did not use electricity at time t, but they do it at time t + h.
4) N − n − m neither use electricity at time t nor at time t + h.
For fixed j this can be done with the probability
$$\sum_{m=\max\{0,j-n\}}^{\min\{j,N-n\}} \binom{j}{m}\{\mu h + h\varepsilon(h)\}^m \{1-\mu h + h\varepsilon(h)\}^{j-m} \binom{N-j}{n-j+m}\{\lambda h + h\varepsilon(h)\}^{n-j+m}\{1-\lambda h + h\varepsilon(h)\}^{N-n-m}.$$
When we multiply this equation by $P_j(t)$ and then sum with respect to j, we get
$$(5)\qquad P_n(t+h) = \sum_{j=0}^{N} P_j(t) \sum_{m=\max\{0,j-n\}}^{\min\{j,N-n\}} \binom{j}{m}\binom{N-j}{n-j+m} \{\mu h + h\varepsilon(h)\}^m \{1-\mu h + h\varepsilon(h)\}^{j-m} \{\lambda h + h\varepsilon(h)\}^{n-j+m}\{1-\lambda h + h\varepsilon(h)\}^{N-n-m}.$$
If m = 0 in the inner sum, then j ≤ n, and we isolate the term
$$\binom{j}{0}\binom{N-j}{n-j}\{\mu h + h\varepsilon(h)\}^0 \{1-\mu h + h\varepsilon(h)\}^{j}\{\lambda h + h\varepsilon(h)\}^{n-j}\{1-\lambda h + h\varepsilon(h)\}^{N-n}$$
$$= \binom{N-j}{n-j}\{1-\mu h + h\varepsilon(h)\}^{j}\{1-\lambda h + h\varepsilon(h)\}^{N-n}\, h^{n-j}\{\lambda + \varepsilon(h)\}^{n-j}.$$
It follows clearly that if $j \ne n$ and $j \ne n-1$, then we only get terms of the type $h\varepsilon(h)$.
If $j = n$, then we get the term
$$\binom{N-n}{0}\{1-\mu h + h\varepsilon(h)\}^{n}\{1-\lambda h + h\varepsilon(h)\}^{N-n}\cdot 1
= (1-\mu h)^n (1-\lambda h)^{N-n} + h\varepsilon(h) = 1 - n\mu h - (N-n)\lambda h + h\varepsilon(h).$$

If instead $j = n-1$, then we get the term
$$\binom{N-n+1}{1}\{1-\mu h + h\varepsilon(h)\}^{n-1}\{1-\lambda h + h\varepsilon(h)\}^{N-n}\cdot h\,(\lambda + \varepsilon(h))
= (N-n+1)\lambda h + h\varepsilon(h).$$
If m = 1 in the inner sum of (5), then
$$\max\{0, j-n\} \le 1 \le \min\{j, N-n\},$$
thus $1 \le j \le n+1$. For such j we get the contribution
$$\binom{j}{1}\binom{N-j}{n-j+1}\mu h (1-\mu h)^{j-1}(\lambda h)^{n-j+1}(1-\lambda h)^{N-n-1} + h\varepsilon(h).$$


It follows immediately that if $j \ne n+1$, then all these terms are of the type $h\varepsilon(h)$.
For $j = n+1$ we get the contribution
$$\binom{n+1}{1}\binom{N-n-1}{0}\mu h (1-\mu h)^{n}(1-\lambda h)^{N-n-1} + h\varepsilon(h) = (n+1)\mu h + h\varepsilon(h).$$

If m ≥ 2, we only get terms of the type hε(h).

We now include the ε-functions. Then (5) is reduced by this analysis for n = 1, ..., N − 1, to
$$P_n(t+h) = P_n(t)\{1 - n\mu h - (N-n)\lambda h + h\varepsilon(h)\} + P_{n-1}(t)\,(N-n+1)\lambda h + P_{n+1}(t)\,(n+1)\mu h + h\varepsilon(h),$$
thus by a rearrangement,
$$\frac{P_n(t+h) - P_n(t)}{h} = -\{n\mu + (N-n)\lambda\}\, P_n(t) + (N-n+1)\lambda\, P_{n-1}(t) + (n+1)\mu\, P_{n+1}(t) + \varepsilon(h),$$
and hence, taking the limit h → 0,
$$P_n'(t) = -\{n\mu + (N-n)\lambda\}\, P_n(t) + (N-n+1)\lambda\, P_{n-1}(t) + (n+1)\mu\, P_{n+1}(t).$$

There are some modifications for n = 0 and n = N, in which cases we get instead
$$P_0'(t) = -N\lambda\, P_0(t) + \mu\, P_1(t) \qquad \text{and} \qquad P_N'(t) = -N\mu\, P_N(t) + \lambda\, P_{N-1}(t).$$

Then we have for the stationary probabilities,
$$0 = -N\lambda\, p_0 + \mu\, p_1,$$
$$0 = -\{n\mu + (N-n)\lambda\}\, p_n + (N-n+1)\lambda\, p_{n-1} + (n+1)\mu\, p_{n+1}, \qquad n = 1, \ldots, N-1,$$
$$0 = -N\mu\, p_N + \lambda\, p_{N-1},$$
hence
$$p_1 = N\cdot\frac{\lambda}{\mu}\, p_0,$$
$$p_{n+1} = \left\{\frac{n}{n+1} + \frac{N-n}{n+1}\cdot\frac{\lambda}{\mu}\right\} p_n - \frac{N-n+1}{n+1}\cdot\frac{\lambda}{\mu}\, p_{n-1}, \qquad n = 1, \ldots, N-1,$$
$$p_N = \frac{1}{N}\cdot\frac{\lambda}{\mu}\, p_{N-1}.$$
In order to find the pattern we compute p₂, i.e. we put n = 1 into the general formula,
$$p_2 = \left\{\frac{1}{2} + \frac{N-1}{2}\cdot\frac{\lambda}{\mu}\right\} p_1 - \frac{N}{2}\cdot\frac{\lambda}{\mu}\, p_0
= \frac{N}{2}\cdot\frac{\lambda}{\mu}\, p_0 + \frac{N(N-1)}{2}\left(\frac{\lambda}{\mu}\right)^2 p_0 - \frac{N}{2}\cdot\frac{\lambda}{\mu}\, p_0
= \binom{N}{2}\left(\frac{\lambda}{\mu}\right)^2 p_0.$$


Now
$$p_1 = N\cdot\frac{\lambda}{\mu}\, p_0 = \binom{N}{1}\left(\frac{\lambda}{\mu}\right)^1 p_0,$$
so we guess that we in general have
$$p_n = \binom{N}{n}\left(\frac{\lambda}{\mu}\right)^n p_0.$$
This is true for n = 0, 1, 2.
Assume that the claim holds for all indices up to n. If n ≤ N − 1, then
$$p_{n+1} = \left\{\frac{n}{n+1} + \frac{N-n}{n+1}\cdot\frac{\lambda}{\mu}\right\} p_n - \frac{N-n+1}{n+1}\cdot\frac{\lambda}{\mu}\, p_{n-1}$$
$$= \frac{n}{n+1}\cdot\frac{N!}{n!(N-n)!}\left(\frac{\lambda}{\mu}\right)^n p_0 + \frac{N-n}{n+1}\cdot\frac{N!}{n!(N-n)!}\left(\frac{\lambda}{\mu}\right)^{n+1} p_0 - \frac{N-n+1}{n+1}\cdot\frac{N!}{(n-1)!(N-n+1)!}\left(\frac{\lambda}{\mu}\right)^{n} p_0$$
$$= \frac{N!}{(n+1)(n-1)!(N-n)!}\left(\frac{\lambda}{\mu}\right)^n p_0 - \frac{N!}{(n+1)(n-1)!(N-n)!}\left(\frac{\lambda}{\mu}\right)^n p_0 + \frac{N!}{(n+1)!(N-n-1)!}\left(\frac{\lambda}{\mu}\right)^{n+1} p_0$$
$$= \binom{N}{n+1}\left(\frac{\lambda}{\mu}\right)^{n+1} p_0,$$
and the claim follows by induction. Then
$$1 = \sum_{n=0}^{N} p_n = p_0 \sum_{n=0}^{N} \binom{N}{n}\left(\frac{\lambda}{\mu}\right)^n = p_0\left(1 + \frac{\lambda}{\mu}\right)^N = p_0\left(\frac{\lambda+\mu}{\mu}\right)^N,$$
hence
$$p_n = \binom{N}{n}\left(\frac{\lambda}{\mu}\right)^n \left(\frac{\mu}{\lambda+\mu}\right)^N = \binom{N}{n}\left(\frac{\lambda}{\lambda+\mu}\right)^n \left(\frac{\mu}{\lambda+\mu}\right)^{N-n}.$$
The solution above is somewhat clumsy, though it follows the ordinary way one would solve problems
of this type without too much training.

Alternatively we see that we have a birth and death process of states E₀, E₁, ..., E_N, and intensities
$$\lambda_k = (N-k)\lambda, \qquad \mu_k = k\mu, \qquad k \in \{0, 1, \ldots, N\}.$$
The corresponding system of differential equations becomes
$$P_0'(t) = -N\lambda\, P_0(t) + \mu\, P_1(t),$$
$$P_k'(t) = -\{(N-k)\lambda + k\mu\}\, P_k(t) + (N-k+1)\lambda\, P_{k-1}(t) + (k+1)\mu\, P_{k+1}(t), \qquad 1 \le k \le N-1,$$
$$P_N'(t) = -N\mu\, P_N(t) + \lambda\, P_{N-1}(t).$$


The stationary probabilities pk are found from

μk pk = λk−1 pk−1 , k = 1, 2, . . . , N,

thus

pk = ((N − k + 1)/k) · (λ/μ) · pk−1 .
Then by recursion,

pk = ((N−k+1)(N−k+2) · · · N / (k(k−1) · · · 1)) (λ/μ)^k p0 = (N!/(k!(N−k)!)) (λ/μ)^k p0 = C(N, k) (λ/μ)^k p0 .

Finally, it follows from

1 = Σ_{k=0}^{N} pk = p0 Σ_{k=0}^{N} C(N, k) (λ/μ)^k = p0 (λ/μ + 1)^N = p0 ((λ+μ)/μ)^N

that

pk = C(N, k) (λ/μ)^k (μ/(λ+μ))^N = C(N, k) (λ/(λ+μ))^k (μ/(λ+μ))^{N−k} ,

for k = 0, 1, 2, . . . , N , so we get a binomial distribution B(N, λ/(λ+μ)) of mean N · λ/(λ+μ).
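The closed formula can be checked numerically: iterating the balance recursion μk pk = λk−1 pk−1 and normalising must reproduce the binomial weights. A small Python sketch (the values N = 5, λ = 2, μ = 3 are arbitrary test choices, not taken from the example):

```python
from math import comb

# Arbitrary test parameters (not from the example).
N, lam, mu = 5, 2.0, 3.0

# Detailed balance: mu_k p_k = lambda_{k-1} p_{k-1},
# with lambda_k = (N - k) lam and mu_k = k mu.
p = [1.0]
for k in range(1, N + 1):
    p.append(p[-1] * (N - k + 1) * lam / (k * mu))
total = sum(p)
p = [x / total for x in p]

# The normalised probabilities are the binomial weights B(N, lam/(lam+mu)).
a = lam / (lam + mu)
binom = [comb(N, n) * a**n * (1 - a) ** (N - n) for n in range(N + 1)]
assert all(abs(x - y) < 1e-12 for x, y in zip(p, binom))
```

The same check works for any positive λ, μ and any N.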


Example 3.6 Given a stochastic process {X(t), t ∈ [0, ∞[} by the following: At time t = 0 there are
N cars in a carpark. No car arrives, and the cars leave the carpark mutually independently. If a car
is staying at its parking bay at time t, then there is the probability μh + hε(h) [where μ is a positive
constant] that it leaves the carpark in the time interval ]t, t + h].
Put X(t) = k, k = 0, 1, . . . , N , if there are k cars in the carpark at time t, and put

Pk (t) = P {X(t) = k}.

1. Prove that we have a death process with μk = kμ, k = 0, 1, . . . , N .


2. Find the differential equations of the system.
3. Find the stationary probabilities.
4. Prove that the mean value function
m(t) = Σ_{k=1}^{N} k Pk(t)

is a solution of the differential equation


dx/dt + μx = 0,
and then find m(t).
5. Given that X(t) is binomially distributed, find the probabilities P k (t), k = 0, 1, . . . , N .
We introduce a random variable T by putting T = t, if the last car leaves the carpark at time t.
6. Find the distribution function and the frequency of T .

1) This follows e.g. from the fact that the probability that one of the k cars leaves the carpark in the
time interval ]t, t + h] is

k {μh + hε(h)} · {1 − μh + hε(h)}^{k−1} = kμh + hε(h),

from which we conclude that μk = kμ.


2) The differential equations are immediately found to be

Pk'(t) = −kμ Pk(t) + (k+1)μ Pk+1(t),   0 ≤ k ≤ N − 1,
PN'(t) = −N μ PN(t).

3) The stationary probabilities become

k pk = 0, k = 0, 1, . . . , N.

Since Σ_{k=0}^{N} pk = 1, we get

pk = 0 for k = 1, 2, . . . , N and p0 = 1.

This result is of course obvious, because the carpark at last is empty.


4) If we multiply the k-th equation of 2. by k, and then sum from 1 to N , we get (substituting j = k + 1 in the second sum)

Σ_{k=1}^{N} k Pk'(t) = −μ Σ_{k=1}^{N} k^2 Pk(t) + μ Σ_{k=1}^{N−1} k(k+1) Pk+1(t)
 = −μ Σ_{k=1}^{N} k^2 Pk(t) + μ Σ_{j=2}^{N} (j−1)j Pj(t) = −μ Σ_{k=1}^{N} k Pk(t),

which is also written

m'(t) + μ m(t) = 0,   where m(t) = Σ_{k=1}^{N} k Pk(t).

From m(0) = N it follows that m(t) = N e^{−μt}.


5) Since X(t) is binomially distributed with number parameter N , and since we also know the mean, we can find the probability parameter, thus

X(t) ∈ B(N, e^{−μt}),

and

Pk(t) = C(N, k) e^{−kμt} (1 − e^{−μt})^{N−k} ,   k = 0, 1, . . . , N.

6) Now, T ≤ t, if and only if X(t) = 0. Hence

F(t) = P0(t) = (1 − e^{−μt})^N for t ≥ 0,   and F(t) = 0 for t < 0,

and finally by differentiation

f(t) = N (1 − e^{−μt})^{N−1} μ e^{−μt} for t ≥ 0,   and f(t) = 0 for t < 0.
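The conclusions of 4.–6. fit together and can be verified numerically; the sketch below (with arbitrary test values N = 6, μ = 0.7, t = 1.3, not from the example) checks that the binomial probabilities Pk(t) have mean N e^{−μt} and that P0(t) equals the claimed distribution function of T:

```python
from math import comb, exp

# Arbitrary test values (not from the example).
N, mu, t = 6, 0.7, 1.3

s = exp(-mu * t)   # probability that a given car is still parked at time t
P = [comb(N, k) * s**k * (1 - s) ** (N - k) for k in range(N + 1)]

m = sum(k * P[k] for k in range(N + 1))   # mean value function m(t)
assert abs(m - N * exp(-mu * t)) < 1e-12

F = P[0]                                  # P{T <= t} = P{X(t) = 0}
assert abs(F - (1 - exp(-mu * t)) ** N) < 1e-12
```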


4 Queueing theory
Example 4.1 Customers arrive at a shop according to a Poisson process of intensity λ. There are 2 shop assistants and the possibility of forming a queue. We assume that the service times are exponentially distributed of parameter μ.
It is given that there are no customers in the shop on average 10 % of the time, and that 1/λ = 11.
Find 1/μ.
Then find the probability that both shop assistants are busy.

Here, N = 2, p0 = 1/10 and 1/λ = 11. In fact, it was given that P0(t) → p0 = 10 % for t → ∞.
The traffic intensity ρ is for N = 2 given by

p0 = (1 − ρ)/(1 + ρ) = 1/10,   hence ρ = 9/11.

On the other hand, the traffic intensity is defined by

ρ = λ/(N μ) = (1/11)/(2μ) = 9/11,   i.e. 1/μ = 18.

Hence

p1 = 2ρ (1 − ρ)/(1 + ρ) = 2 · (9/11) · (1/10) = 18/110,

and therefore,

P{both shop assistants busy} = 1 − p0 − p1 = 1 − 1/10 − 18/110 = 81/110.
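These numbers can be confirmed with exact rational arithmetic directly from the M/M/2 stationary distribution (weight 1 in state 0, then a geometric tail of ratio ρ); a sketch:

```python
from fractions import Fraction

lam, mu, N = Fraction(1, 11), Fraction(1, 18), 2

rho = lam / (N * mu)                  # traffic intensity, 9/11
# Unnormalised weights: w0 = 1 and wk = (lam/mu) rho^(k-1) for k >= 1,
# so the normalising sum is 1 + (lam/mu)/(1 - rho).
p0 = 1 / (1 + (lam / mu) / (1 - rho))
p1 = (lam / mu) * p0

assert p0 == Fraction(1, 10)
assert 1 - p0 - p1 == Fraction(81, 110)
```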
10 110 110

Example 4.2 Customers arrive at a shop following a Poisson process of intensity λ. There is 1 shop assistant, and it is possible to form a queue. We assume that the service times are exponentially distributed of parameter μ. It is assumed that the traffic intensity is ρ = 6/5, where it is well-known that this implies that the system does not work properly (the queue increases indefinitely). Compare the advantages of the following two possibilities:

1) Another shop assistant is hired (of the same service time distribution as the first one).

2) Improvement of the service, such that the average service time is halved.

We have a queueing system with the possibility of forming a queue. The parameters are

N = 1,   ρ = 6/5,   and λ, μ.

Since ρ = 6/5 > 1, this system does not work properly.
1) If another shop assistant is hired, then the parameters are changed to

N = 2,   ρ = 3/5,   and λ, μ unchanged.

Then

p0 = (1 − ρ)/(1 + ρ) = 1/4.

The average waiting time is

V1 = p0 N^{N−1} ρ^N / (μ N!(1 − ρ)^2) = ((1/4) · 2 · (3/5)^2) / (μ · 2 · (2/5)^2) = (9/16) · (1/μ),

and the average staying time is

O1 = (9/16) · (1/μ) + 1/μ = (25/16) · (1/μ).


Remark 4.1 It should here be added that one can also find

the average number of customers = 15/8,
the average number of busy shop assistants = 6/5,
the average length of the queue = 27/40. ♦

2) If instead the service is improved as indicated, then the parameters become

N = 1,   ρ = 3/5,   λ unchanged, μ doubled.

The average waiting time is then

V2 = ρ / (2μ(1 − ρ)) = (12/16) · (1/μ),

and the average staying time is

O2 = (12/16) · (1/μ) + 1/(2μ) = (20/16) · (1/μ).

Remark 4.2 Again we add for completeness,

the average number of customers = 3/2,
the average number of busy shop assistants = 3/5,
the average length of the queue = 9/10. ♦

By comparing the two cases we get

V1 < V2,   and on the contrary O1 > O2,

and the question does not have a unique answer.

The customer will prefer that the sum of waiting time and service time is as small as possible. Since

V1 + O1 = (34/16) · (1/μ)   and   V2 + O2 = (32/16) · (1/μ),
it follows that the customer will prefer the latter system, while it is far more uncertain what the shop
would prefer, because we do not know the costs of each of the two possible improvements.
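All four quantities can be recomputed with exact fractions; the sketch below fixes μ = 1 (every time in the example scales with 1/μ) and uses the standard M/M/2 and M/M/1 mean waiting times:

```python
from fractions import Fraction as F

mu = F(1)            # any service rate; all times scale with 1/mu
lam = F(6, 5) * mu   # traffic intensity 6/5 in the original one-server system

# Possibility 1: two servers; rho = lam/(2 mu) = 3/5.
rho = lam / (2 * mu)
p0 = (1 - rho) / (1 + rho)
V1 = p0 * 2 * rho**2 / (mu * 2 * (1 - rho) ** 2)  # p0 N^{N-1} rho^N/(mu N!(1-rho)^2), N = 2
O1 = V1 + 1 / mu

# Possibility 2: one server with doubled service rate; same rho = 3/5.
V2 = rho / (2 * mu * (1 - rho))
O2 = V2 + 1 / (2 * mu)

assert (V1, O1) == (F(9, 16), F(25, 16))
assert (V2, O2) == (F(12, 16), F(20, 16))
assert V1 < V2 and O1 > O2
```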


Example 4.3 We consider an intersection which is not controlled by traffic lights. One has noticed that cars making a left-hand turn are stopped and therefore delay the cars which are going straight on. Therefore, one plans to build a left-turn lane. Assuming that arrivals and departures of the cars making the left-hand turn are exponentially distributed with the parameters λ and μ, where λ/μ = 1/2, one shall compute the smallest capacity (in number of cars) of the planned left-turn lane, such that the probability is less than 5 % of the event that there are more cars than the new lane can contain.

Here N = 1, so the traffic intensity of the system is

ρ = λ/(N μ) = 1/2.

The stationary probabilities are

pk = ρ^k (1 − ρ) = (1/2)^{k+1},   k ∈ N0.

Let n denote the maximum number of cars in the left-turn lane. Then we get the condition

Σ_{k=n+1}^{∞} pk = Σ_{k=n+1}^{∞} (1/2)^{k+1} = (1/2)^{n+1} < 5 % = 1/20,

thus 1/2^n < 1/10, which is fulfilled for n ≥ 4.
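The bound can also be found by a direct search over the geometric tail probabilities; a one-loop sketch:

```python
# Overflow probability with an n-car lane: P{more than n cars in the system}
# = sum_{k > n} rho^k (1 - rho) = rho^{n+1}, here with rho = 1/2.
rho = 0.5
n = 0
while rho ** (n + 1) >= 0.05:
    n += 1
assert n == 4
```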
2n 10

Example 4.4 Given a queueing system with exponentially distributed arrivals and exponentially distributed service times (the means are called 1/λ and 1/μ, resp.). The number of service places is 2. We furthermore assume that it is possible to form a queue. Assuming that 1/λ = 1 (minute) and 1/μ = 1 (minute),

1. find the average waiting time,

2. find the average staying time.

For economic reasons the number of service places is cut down from 2 to 1, while the service at the same time is simplified (so the service time is decreased), such that the customer's average staying time is not prolonged. Assuming that the constant λ is unchanged,

3. find the average service time 1/μ1, such that the average staying time in the new system is equal to the average staying time in the previously mentioned system,

4. find in which of the two systems the probability is largest for a customer to wait.

Here N = 2, 1/λ = 1 and 1/μ = 1. This gives the traffic intensity

ρ = λ/(N μ) = 1/(2 · 1) = 1/2,   and   p0 = (1 − ρ)/(1 + ρ) = 1/3.


1) The average waiting time is

V = p0 N^{N−1} ρ^N / (μ N!(1 − ρ)^2) = ((1/3) · 2 · (1/2)^2) / (1 · 2! · (1/2)^2) = 1/3 minute.

2) The staying time is the waiting time plus the serving time, so the average staying time is

O = V + 1/μ = 1/3 + 1 = 4/3 minute.

3) In the new system the traffic intensity is

ρ1 = λ/(N1 μ1) = 1/μ1,   since N1 = 1.

The average waiting time is for N1 = 1 given by

V1 = ρ1 / (μ1 (1 − ρ1)) = 1/(μ1 (μ1 − 1)),

and the average staying time is for N1 = 1 given by

O1 = V1 + 1/μ1 = 1/(μ1 − 1).


We want that O1 = O = 4/3. Hence, μ1 − 1 = 3/4, i.e. μ1 = 7/4, and

1/μ1 = 4/7.

4) The probability of waiting in the old system is

1 − p0 − p1 = 1 − (1 − ρ)/(1 + ρ) − 2ρ (1 − ρ)/(1 + ρ) = 1 − 1/3 − 2 · (1/2) · (1/3) = 1/3.

The probability of waiting in the new system is

1 − p̃0 = 1 − (1 − ρ1) = ρ1 = 1/μ1 = 4/7.

We see by comparison that the probability of waiting is largest in the new system.
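The computation can be double-checked with exact fractions; the sketch below uses the standard fact that the mean staying time in an M/M/1 queue is 1/(μ1 − λ), which for λ = 1 is the formula O1 = 1/(μ1 − 1) above:

```python
from fractions import Fraction as F

lam = mu = F(1)                 # 1/lambda = 1/mu = 1 minute
rho = lam / (2 * mu)            # 1/2
p0 = (1 - rho) / (1 + rho)      # 1/3
V = p0 * 2 * rho**2 / (mu * 2 * (1 - rho) ** 2)   # M/M/2 waiting time
O = V + 1 / mu
assert (V, O) == (F(1, 3), F(4, 3))

# New system: M/M/1, where the mean staying time is 1/(mu1 - lam).
mu1 = lam + 1 / O               # solve 1/(mu1 - lam) = O
assert mu1 == F(7, 4) and 1 / mu1 == F(4, 7)

# Probability of having to wait: old system 1 - p0 - p1, new system rho1.
p1 = 2 * rho * p0
assert 1 - p0 - p1 == F(1, 3)
assert lam / mu1 == F(4, 7)
```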


Example 4.5 Given a service (a shop) of which we assume:

a. There is only one shop assistant.

b. It is not possible to form a queue.

c. The customers arrive according to a Poisson process of intensity λ.

d. The service time is exponentially distributed of parameter μ.

1. Find the differential equations of this system.

2. Solve these under the assumption that at time t = 0 there is no customer.


Assume from now on that λ/μ = 6.
3. Find the stationary probabilities and the probability of rejection.

Assuming that the probability of rejection is too large, we change the system, such that there are two
shop assistants A and B, and the service is changed, such that a customer at his arrival goes to A
and is served by him, if A is vacant at the arrival of the customer. If on the other hand A is busy,
then the customer will turn to B in order to be serviced. If also B is busy, the customer is rejected.
The assumptions of the arrivals and service times are the same as before. We want to compute in this
system:
4. The stationary probabilities and the probability of rejection.

5. The probability that A and B, resp., are busy.


6. Finally, find the smallest number of shop assistants for which the probability of rejection is smaller than 1/2.

1) Since N = 1, the differential equations of the system are

P0'(t) = −λ P0(t) + μ P1(t),
P1'(t) = λ P0(t) − μ P1(t),

thus written in the form of a matrix equation,

d/dt (P0(t), P1(t))^T = A (P0(t), P1(t))^T,   where A = [[−λ, μ], [λ, −μ]].

2) The characteristic polynomial (in R) is

det(A − R I) = det [[−λ−R, μ], [λ, −μ−R]] = (R + λ)(R + μ) − λμ = R^2 + (λ + μ)R.

The roots are R = 0 and R = −λ − μ.

For R = 0 we get the eigenvector (μ, λ).


For R = −λ − μ we get the eigenvector (1, −1).


The complete solution is

(P0(t), P1(t)) = c1 (μ, λ) + c2 e^{−(λ+μ)t} (1, −1).

The initial conditions are P0(0) = 1 and P1(0) = 0, thus

1 = μ c1 + c2,   0 = λ c1 − c2,

and hence

c1 = 1/(λ + μ),   c2 = λ/(λ + μ),
and the solution becomes

P0(t) = μ/(λ+μ) + (λ/(λ+μ)) e^{−(λ+μ)t},
P1(t) = λ/(λ+μ) − (λ/(λ+μ)) e^{−(λ+μ)t}.

3) If λ/μ = 6, then

λ/(λ+μ) = 6/7   and   μ/(λ+μ) = 1/7,

and λ + μ = 7μ, thus

P0(t) = 1/7 + (6/7) exp(−7μt),
P1(t) = 6/7 − (6/7) exp(−7μt),   t ≥ 0.

The stationary probabilities are obtained by letting t → ∞, thus

p0 = 1/7   and   p1 = 6/7.

In particular, the probability of rejection is p1 = 6/7.
4) We have the following states:
E0 : No customer in the system.
E1 : A serves a customer, while B does not.
E2 : A is vacant, while B serves a customer.
E3 : Both A and B serve customers.


There is no change for A, so by 3.,

P0(t) + P2(t) = 1/7 + (6/7) exp(−7μt),
P1(t) + P3(t) = 6/7 − (6/7) exp(−7μt),   t ≥ 0.

By taking the limit t → ∞ we get

p0 + p2 = 1/7   and   p1 + p3 = 6/7.
We can realize P0 (t + h) in the following ways, if the system at time t is in state

(i) E0 , and no customer arrives,

P0 (t) · {1 − λh + hε(h)}.

(ii) E0 , some customers arrive, and they are all served until they are finished,

hε(h).

(iii) E1 , and there is no customer coming, and A’s customer is serviced to the end,

P1 (t) · {μh + hε(h)}.


(iv) E1 , and there arrive customers, who are served,

hε(h).

(v) E2 , and no new customer is coming, and B’s customer is served to the end,

P2 (t) · {μh + hε(h)}.

(vi) E2 in all other cases,

hε(h).

(vii) E3 in general,

hε(h).

By adding these we get

P0 (t + h) = P0 (t) · {1 − λh + hε(h)} + {P1 (t) + P2 (t)} · {μh + hε(h)} + hε(h).

Then compute the derivative in the usual way by taking the limit. This gives

P0'(t) = lim_{h→0} (1/h) {P0(t + h) − P0(t)} = −λ P0(t) + μ {P1(t) + P2(t)}.

Then by taking the limit t → ∞,

0 = −λp0 + μ {p1 + p2 } = −6μp0 + μ {p1 + p2 } ,

hence

6p0 = p1 + p2 .

We are still missing one equation, when we want to find the stationary probabilities. We choose
to realize P3 (t + h). This can be done, if the system at time t is in state
(i) E0 , and at least two customers arrive,

hε(h).

(ii) E1 , and at least one customer arrives, and neither A nor B finish their customers,

P1(t) · {λh + hε(h)} · {1 − μh + hε(h)}^2.

(iii) E2 , and at least one customer arrives, and neither A nor B finish their customers,

P2(t) · {λh + hε(h)} · {1 − μh + hε(h)}^2.

(iv) E3 , and neither A nor B finish their customers,

P3(t) · {1 − μh + hε(h)}^2.

(v) Other, all of probability

hε(h).

Download free ebooks at bookboon.com

61
Stochastic Processes 2 4. Queueing theory

When we add these probabilities we get

P3(t + h) = {P1(t) + P2(t)} · {λh + hε(h)} · {1 − μh + hε(h)}^2 + P3(t) · {1 − μh + hε(h)}^2 + hε(h).

A rearrangement followed by a reduction gives

P3(t + h) − P3(t) = λh {P1(t) + P2(t)} − 2μh P3(t) + hε(h).

Then divide by h and let h → 0. This will give us the differential equation

P3'(t) = λ {P1(t) + P2(t)} − 2μ P3(t),

hence by taking the limit t → ∞,

0 = λ (p1 + p2) − 2μ p3 = 6μ (p1 + p2) − 2μ p3,

so

p3 = 3 (p1 + p2) = 18 p0.

Summing up we have obtained the four equations


p0 + p2 = 1/7,
p1 + p3 = 6/7,
6 p0 = p1 + p2,
p3 = 18 p0,

which, inserting p3 = 18 p0 into the second equation, may also be written

p0 + p2 = 1/7,
18 p0 + p1 = 6/7,
6 p0 − p1 − p2 = 0,
p3 = 18 p0.

By adding the former three equations of the latter system, we get 25 p0 = 1, thus p0 = 1/25. Then

p1 = 6/7 − 18/25 = 24/175,

and

p2 = 1/7 − 1/25 = 18/175,   and   p3 = 18/25,

so

(p0, p1, p2, p3) = (1/25, 24/175, 18/175, 18/25),

and the probability of rejection is

p3 = 18/25.
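Since slips easily creep into hand computations like these, it is worth substituting the solution back into all four equations with exact fractions; a sketch:

```python
from fractions import Fraction as F

p0, p1, p2, p3 = F(1, 25), F(24, 175), F(18, 175), F(18, 25)

# The four equations derived above, plus the normalisation.
assert p0 + p2 == F(1, 7)
assert p1 + p3 == F(6, 7)
assert 6 * p0 == p1 + p2
assert p3 == 18 * p0
assert p0 + p1 + p2 + p3 == 1
```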


5) The probability that A is busy is

p1 + p3 = 24/175 + 126/175 = 6/7.

The probability that B is busy is

p2 + p3 = 18/175 + 126/175 = 144/175 < 6/7.

6) We have in the general case of N shop assistants, where Ej denotes that j customers are served, the system of differential equations

P0'(t) = −λ P0(t) + μ P1(t),
Pk'(t) = −(λ + kμ) Pk(t) + λ Pk−1(t) + (k+1)μ Pk+1(t),   1 ≤ k ≤ N − 1,
PN'(t) = −N μ PN(t) + λ PN−1(t).

Hence by taking the limit t → ∞,

0 = −λ p0 + μ p1,
0 = −(λ + kμ) pk + λ pk−1 + (k+1)μ pk+1,   1 ≤ k ≤ N − 1,
0 = −N μ pN + λ pN−1.

Since λ/μ = 6, we get by a division by μ, followed by a rearrangement, that

0 = 6 p0 − p1,
6 pk − (k+1) pk+1 = 6 pk−1 − k pk,   1 ≤ k ≤ N − 1,
0 = 6 pN−1 − N pN.

Then by recursion, 6 pk−1 − k pk = 0, thus

k pk = 6 pk−1,   1 ≤ k ≤ N.

The easiest way to solve this recursion formula is to multiply by (k−1)!/6^k ≠ 0, and then do the recursion,

(k!/6^k) pk = ((k−1)!/6^{k−1}) pk−1 = · · · = (0!/6^0) p0 = p0,   k = 0, 1, . . . , N,

thus

pk = (6^k/k!) p0,   k = 0, 1, . . . , N.


Since p is a probability vector, we get the condition

1 = Σ_{k=0}^{N} pk = p0 Σ_{k=0}^{N} 6^k/k!,   thus   p0 = 1 / Σ_{k=0}^{N} (6^k/k!).
The task is to find N , such that the probability of rejection pN ≤ 1/2. Using

pN = (6^N/N!) / Σ_{k=0}^{N} (6^k/k!) ≤ 1/2   if and only if   6^N/N! ≤ Σ_{k=0}^{N−1} 6^k/k!,

we compute the following table,

k : 0, 1, 2, 3, 4
6^k/k! : 1, 6, 18, 36, 54
Σ_{j=0}^{k−1} 6^j/j! : −, 1, 7, 25, 61

It follows that N ≥ 4 gives pN ≤ 1/2, so we shall apply at least 4 service places.
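The rejection probability pN = (6^N/N!) / Σ 6^k/k! is the Erlang B (loss) formula with offered load λ/μ = 6, so the search for the smallest adequate N can be sketched as:

```python
from math import factorial

def rejection_prob(a, N):
    # Erlang B formula: p_N = (a^N/N!) / sum_{k=0}^N a^k/k!, with a = lam/mu.
    terms = [a**k / factorial(k) for k in range(N + 1)]
    return terms[-1] / sum(terms)

a = 6  # lam/mu
N = 1
while rejection_prob(a, N) > 0.5:
    N += 1
assert N == 4
```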



    
  

Example 4.6 At a university there are two super computers A and B. Computer A is used for university tasks, while computer B is restricted to external tasks. Both systems allow forming queues, and the service times (i.e. the times used for computation of each task) are approximately exponentially distributed of mean 1/μ = 3 minutes. The university tasks arrive at computer A approximately as a Poisson process of intensity λA = 1/5 min^{−1}, while the tasks of computer B arrive as a Poisson process of intensity λB = 3/10 min^{−1}. Apply the stationary probabilities for the two computers A and B to compute

1. The fraction of time A (resp. B) is vacant.

2. The average waiting time at A (resp. B).

It is suggested to join the two systems into one, such that each computer can be used for university tasks as well as external tasks. This means that we have a queueing system with two "shop assistants". Use again the stationary probabilities of this system to compute

3. The fraction of time both computers are vacant.

4. The fraction of time both computers are busy.

5. The average waiting time.

1) In both cases, N = 1.
For A we have the traffic intensity

ρA = λA/(N μ) = 3/5,   thus   p0,A = 1 − ρA = 2/5.

For B we have the traffic intensity

ρB = λB/(N μ) = 9/10,   thus   p0,B = 1 − ρB = 1/10.

These probabilities indicate the fraction of time in which the given computer is vacant.

2) Since N = 1, the respective average waiting times are

VA = ρA / (μ (1 − ρA)) = 3 · (3/5)/(1 − 3/5) = 9/2 minutes,

and

VB = ρB / (μ (1 − ρB)) = 3 · (9/10)/(1 − 9/10) = 27 minutes.

3) The sum of two independent Poisson processes is again a Poisson process, here with the parameter

λ = λA + λB = 1/5 + 3/10 = 1/2.

Hence the traffic intensity is

ρ = λ/(N μ) = (1/2) · (1/2) · 3 = 3/4.

The fraction of time in which none of the computers is busy is

p0 = (1 − ρ)/(1 + ρ) = (1/4)/(7/4) = 1/7.

4) The probability that both computers are busy is

1 − p0 − p1 = 1 − 1/7 − 2ρ (1 − ρ)/(1 + ρ) = 1 − 1/7 − 2 · (3/4) · (1/7) = (14 − 2 − 3)/14 = 9/14.

5) The average waiting time is

V = p0 N^{N−1} ρ^N / (μ N!(1 − ρ)^2) = ((1/7) · 2 · (3/4)^2) / ((1/3) · 2 · (1/4)^2) = 27/7 minutes.
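The example compares two separate M/M/1 queues with one pooled M/M/2 queue; redoing the computations in exact arithmetic (a sketch) confirms that pooling reduces the waiting times dramatically:

```python
from fractions import Fraction as F

mu = F(1, 3)                    # service rate; 1/mu = 3 minutes
lamA, lamB = F(1, 5), F(3, 10)

def wait_mm1(lam):
    # Mean waiting time in an M/M/1 queue: rho/(mu (1 - rho)).
    rho = lam / mu
    return rho / (mu * (1 - rho))

assert wait_mm1(lamA) == F(9, 2)    # 4.5 minutes at A
assert wait_mm1(lamB) == 27         # 27 minutes at B

# Pooled M/M/2 system:
lam = lamA + lamB               # 1/2
rho = lam / (2 * mu)            # 3/4
p0 = (1 - rho) / (1 + rho)      # 1/7
V = p0 * 2 * rho**2 / (mu * 2 * (1 - rho) ** 2)
assert p0 == F(1, 7) and V == F(27, 7)
```

Note that 27/7 ≈ 3.9 minutes is smaller than the waiting time at either of the separate queues.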


Example 4.7 Given a birth and death process of the states E0, E1, E2, . . . , where the birth intensity λk in state Ek decreases with increasing k as follows,

λk = α/(k + 1),

where α is a positive constant, while the death intensities μk are given by

μk = μ for k ∈ N,   and μ0 = 0,   where μ > 0.

1. Find the stationary probabilities.


The above may be viewed as a model of a queueing process, where
a. it is possible to form a queue,
b. there is only 1 channel,
c. the service time is exponentially distributed of mean 1/μ,
d. the arrival frequency decreases with increasing queue length according to the given formula. (Some
customers will avoid a long queue and immediately leave the queue ).
2. Compute for α = μ the probability that there are at most 3 customers in the system (3 dec.).
3. Compare the probability of 2. with the corresponding probability in the case of one shop assistant
and λk = α constant and μ = 3α (3 dec.).

1) The system of differential equations for λk = α/(k+1) and μk = μ, k ∈ N, is given by

P0'(t) = −α P0(t) + μ P1(t),
Pk'(t) = −(α/(k+1) + μ) Pk(t) + (α/k) Pk−1(t) + μ Pk+1(t),   k ∈ N.

By taking the limit t → ∞ we get

0 = −α p0 + μ p1,
0 = −(α/(k+1) + μ) pk + (α/k) pk−1 + μ pk+1,   k ∈ N,

thus

−(α/(k+1)) pk + μ pk+1 = −(α/k) pk−1 + μ pk = · · · = −α p0 + μ p1 = 0,   k ∈ N,

and hence

μ pk = (α/k) pk−1,   k ∈ N.
k


When this equation is multiplied by

k! μ^{k−1}/α^k ≠ 0,

it follows by a recursion that

k! (μ/α)^k pk = (k−1)! (μ/α)^{k−1} pk−1 = · · · = 0! (μ/α)^0 p0 = p0,

hence

pk = (1/k!) (α/μ)^k p0,   k ∈ N0.

It follows from

1 = Σ_{k=0}^{∞} pk = p0 Σ_{k=0}^{∞} (1/k!) (α/μ)^k = p0 exp(α/μ),

that p0 = exp(−α/μ), thus

pk = (1/k!) (α/μ)^k exp(−α/μ),   k ∈ N0,

i.e. a Poisson distribution of parameter α/μ.

2) Put α = μ. The probability that there are at most 3 customers in the system is

p0 + p1 + p2 + p3 = (1/e) (1 + 1/1! + 1/2! + 1/3!) = 16/(6e) ≈ 0.981.

3) The differential equations of the new system are

P0'(t) = −α P0(t) + 3α P1(t),
Pk'(t) = −4α Pk(t) + α Pk−1(t) + 3α Pk+1(t),   k ∈ N.

By taking the limit t → ∞ we get the equations of the stationary probabilities,

0 = −α p0 + 3α p1,
0 = −4α pk + α pk−1 + 3α pk+1,   k ∈ N.

We rewrite these and get by a reduction,

3 pk+1 − pk = 3 pk − pk−1 = · · · = 3 p1 − p0 = 0,   k ∈ N,

thus 3 pk = pk−1. Multiply this equation by 3^{k−1} in order to get

3^k pk = 3^{k−1} pk−1 = · · · = 3^0 p0 = p0,


hence

pk = (1/3^k) p0,   k ∈ N0.

It follows from

1 = Σ_{k=0}^{∞} pk = p0 Σ_{k=0}^{∞} (1/3)^k = p0 · 1/(1 − 1/3) = (3/2) p0,

that p0 = 2/3, and the probability that there are at most three customers in this system is

p0 + p1 + p2 + p3 = p0 (1 + 1/3 + 1/3^2 + 1/3^3) = (2/3) · ((27 + 9 + 3 + 1)/27) = 80/81 ≈ 0.9877.

There is a slightly higher probability in this case that there are at most three customers in this
system than in the system which was considered in 2..
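The two tail computations can be checked side by side; a sketch:

```python
from math import exp, factorial

# System of 2.: Poisson stationary distribution with parameter alpha/mu = 1.
poisson_at_most_3 = sum(exp(-1) / factorial(k) for k in range(4))

# System of 3.: geometric stationary distribution, p_k = (2/3) (1/3)^k.
geom_at_most_3 = sum((2 / 3) * (1 / 3) ** k for k in range(4))

assert abs(poisson_at_most_3 - 16 / (6 * exp(1))) < 1e-12
assert abs(geom_at_most_3 - 80 / 81) < 1e-12
assert geom_at_most_3 > poisson_at_most_3
```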

Example 4.8 Given the following queueing model: M machines are working mutually independently
of each other and they need no operation by men, except in the case when they break down. There are
in total N service mechanics (where N < M ) for making repairs. If a machine is working at time t,
it is the probability λh + hε(h) that it breaks down before time t + h, and probability 1 − λh + hε(h)
that it is still working. Analogously, if it is repaired at time t, then there is the probability μh + hε(h)
that it is working again before t + h, and probability 1 − μh + hε(h) that it is not working. When a
machine breaks down, it is immediately repaired by a service mechanic, if he is vacant. Otherwise, the
machine is waiting in a queue, until a service mechanic becomes vacant. We define the coefficient of
loss of a machine as
(1/M) · average number of machines in the queue,
and the coefficient of loss of a service mechanic as
(1/N) · average number of vacant service mechanics.
Denote by Ek the state that k machines do not work, k = 0, 1, . . . , M .

1) Prove that the constants λk and μk are given by

λk = (M − k)λ, μk = kμ, 0 ≤ k ≤ N,

λk = (M − k)λ, μk = N μ, N ≤ k ≤ M.

2) Find a recursion formula for pk (express pk+1 by pk ).


3) Find the average number of machines in the queue (expressed by the pk ), and prove in particular that if N = 1 this can be written

M − ((λ + μ)/λ) (1 − p0).

4) Find the probability that there are precisely 0, 1, 2, . . . , N vacant service mechanics.

5) Find the coefficients of loss of a machine and of a service mechanic in the case of

λ/μ = 0.1;   M = 6;   N = 1.

It should be mentioned for comparison that in the case when

λ/μ = 0.1;   M = 20;   N = 3,

the coefficient of loss of a machine is 0.0169 and the coefficient of loss of a service mechanic is 0.4042. Which one of the two systems is best?

This problem of machines was first applied in the Swedish industry.


1) Let 0 ≤ k ≤ M , and assume that we are in state Ek , thus k machines are being repaired or
are waiting for repair, and M − k machines are working. Each of the latter machines has the
probability
λh + hε(h)
of breaking down in the time interval ]t, t + h] of length h. Since M − k machines are working, we
get
λk = (M − k)λ for 0 ≤ k ≤ M.
If we are in state Ek , where 0 ≤ k ≤ N , then all k machines are being repaired. Each of these
have the probability
μh + hε(h)
for being repaired before time t + h, thus
μk = kμ, for 0 ≤ k ≤ N.
If instead N < k ≤ M , then all service mechanics are working, so
μk = N μ, for N < k ≤ M.

2) By a known formula,

μk+1 pk+1 = λk pk,

thus

pk+1 = (λk/μk+1) pk,   for k = 0, 1, . . . , M − 1.
When we insert the results of 1., we get

pk+1 = ((M − k)λ/((k + 1)μ)) pk   for k = 0, 1, . . . , N − 1,
pk+1 = ((M − k)λ/(N μ)) pk   for k = N, . . . , M − 1.

When the first equation is multiplied by

(1/C(M, k+1)) (μ/λ)^{k+1},

we get

pk+1 / (C(M, k+1) (λ/μ)^{k+1}) = pk / (C(M, k) (λ/μ)^k) = · · · = p0 / (C(M, 0) (λ/μ)^0) = p0,


hence

pk = C(M, k) (λ/μ)^k p0   for k = 0, 1, . . . , N.

We put k = N + m, m = 0, 1, . . . , M − N − 1, into the second equation. Then

pN+m+1 = ((M − N − m)λ/(N μ)) pN+m = (1/N^{m+1}) (λ/μ)^{m+1} (M − N − m) · · · (M − N) pN
       = (1/N^{m+1}) (λ/μ)^{m+1} ((M − N)!/(M − N − m − 1)!) pN,

hence

pN+m = (1/N^m) (λ/μ)^m ((M − N)!/(M − N − m)!) pN = (M!/(N!(M − N − m)!)) (1/N^m) (λ/μ)^{N+m} p0,

for m = 0, 1, . . . , M − N.
3) The average number of machines in the queue is

Σ_{k=N+1}^{M} (k − N) pk = Σ_{k=N}^{M} (k − N) pk.

We get in particular for N = 1,

Σ_{k=1}^{M} (k − 1) pk = Σ_{k=1}^{M} k pk − Σ_{k=1}^{M} pk = Σ_{k=1}^{M} k pk − (1 − p0).

Then by the recursion formula of 2.,

pk+1 = (M − k) (λ/μ) pk = M (λ/μ) pk − k (λ/μ) pk,   k = 1, . . . , M − 1,

i.e. k pk = M pk − (μ/λ) pk+1. Hence

Σ_{k=1}^{M} k pk = Σ_{k=1}^{M−1} k pk + M pM = Σ_{k=1}^{M−1} {M pk − (μ/λ) pk+1} + M pM
 = M Σ_{k=1}^{M} pk − (μ/λ) Σ_{k=2}^{M} pk = M (1 − p0) − (μ/λ) (1 − p0 − p1).

It follows from

p1 = ((M − 0)/(0 + 1)) (λ/μ) p0 = M (λ/μ) p0,

i.e. (μ/λ) p1 = M p0, by insertion that the average number of machines in the queue is for N = 1 given by

Σ_{k=1}^{M} (k − 1) pk = M (1 − p0) − (μ/λ) (1 − p0) + M p0 − (1 − p0)
 = M − (1 + μ/λ) (1 − p0) = M − ((λ + μ)/λ) (1 − p0).


4) If there are n ∈ {1, 2, . . . , N } vacant service mechanics, the system is in state EN−n, so the probability is

pN−n = C(M, N−n) (λ/μ)^{N−n} p0,   n = 1, 2, . . . , N.

If there is no vacant service mechanic, we get the probability

1 − Σ_{n=1}^{N} C(M, N−n) (λ/μ)^{N−n} p0 = 1 − Σ_{n=0}^{N−1} C(M, n) (λ/μ)^n p0.

5) If λ/μ = 1/10, M = 6 and N = 1, then the coefficient of loss of the machines is by 3. given by

    (1/M)·{M − (1 + μ/λ)(1 − p_0)} = 1 − (11/6)(1 − p_0).

We shall only find p_0. We get by using the recursion formulae

    p_1 = (6/10) p_0,  p_2 = (5/10) p_1,  p_3 = (4/10) p_2,
    p_4 = (3/10) p_3,  p_5 = (2/10) p_4,  p_6 = (1/10) p_5,

hence

    1 = Σ_{k=0}^6 p_k = p_0 (1 + (6/10)(1 + (5/10)(1 + (4/10)(1 + (3/10)(1 + (2/10)(1 + 1/10))))))
      ≈ p_0 · 2.0639,

so

p0 ≈ 0.4845.

We also get by insertion the coefficient of loss of the machine,

    1 − (11/6)(1 − p_0) ≈ 0.0549.

The loss coefficient of the service mechanic is

    (1/N)·p_0 = p_0 ≈ 0.4845.
By comparison we see that the coefficients of loss are smallest in the system, where
λ 1
= , M = 20, N = 3,
μ 10
so this system is the best.
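The recursions above are easy to check numerically. The sketch below (plain Python; the function name and structure are my own) builds the stationary distribution for M = 6, N = 1, λ/μ = 1/10 and evaluates both loss coefficients.

```python
def machine_repair_stationary(M, N, rho):
    """Stationary distribution of the machine-repair model with M machines
    and N mechanics, rho = lambda/mu, from the balance equations
    min(k+1, N) * mu * p_{k+1} = (M - k) * lambda * p_k."""
    w = [1.0]
    for k in range(M):
        w.append(w[-1] * (M - k) * rho / min(k + 1, N))
    s = sum(w)
    return [x / s for x in w]

p = machine_repair_stationary(6, 1, 0.1)
p0 = p[0]                                                     # ~ 0.4845
machine_loss = sum((k - 1) * p[k] for k in range(2, 7)) / 6   # ~ 0.0549
mechanic_loss = p0                # single mechanic idle fraction: ~ 0.4845
```

The direct sum for the machine loss coefficient agrees with the closed form 1 − (11/6)(1 − p_0) derived in 3.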


Example 4.9 In a shop the service time is exponentially distributed of mean 1/μ, thus the frequency is given by

    f(x) = μ e^{−μx} for x > 0, and f(x) = 0 for x ≤ 0.

Let X1 , X2 , . . . denote the service times of customer number 1, 2, . . . . We assume that the X i are
mutually independent and that they all have the frequency f (x) above.
In total there arrive to the shop N customers, where N is a random variable, which is independent of
all the Xi , and N can have the values 1, 2, . . . , of the probabilities

P {N = k} = p q k−1 , k ∈ N,

where p > 0, q > 0, and p + q = 1.



1) Prove that Y_n = Σ_{i=1}^n X_i has the frequency

    f_n(x) = μ·(μx)^{n−1}/(n − 1)!·e^{−μx} for x > 0, and f_n(x) = 0 for x ≤ 0.


2) Find the frequency and the distribution function of Y = Σ_{i=1}^N X_i by using that

    P{Y ≤ x} = Σ_{k=1}^∞ P{N = k ∧ Y_k ≤ x}.

3) Find mean and variance of Y .



1) Since X_i ∈ Γ(1, 1/μ), it follows that

    Y_n = Σ_{k=1}^n X_k ∈ Γ(n, 1/μ),

and the frequency is

    f_n(x) = μ·(μx)^{n−1}/(n − 1)!·e^{−μx} for x > 0, and f_n(x) = 0 for x ≤ 0.

2) It follows immediately (without using generating functions) that

    P{Y ≤ x} = Σ_{k=1}^∞ P{N = k, Y_k ≤ x} = Σ_{k=1}^∞ P{N = k}·P{Y_k ≤ x} = Σ_{n=1}^∞ p q^{n−1} ∫_0^x f_n(t) dt.


Thus we get for x > 0 the frequency

    g(x) = Σ_{n=1}^∞ p q^{n−1} f_n(x) = p Σ_{n=1}^∞ q^{n−1}·μ·(μx)^{n−1}/(n − 1)!·e^{−μx}
         = pμ e^{−μx} Σ_{n=0}^∞ (qμx)^n/n! = pμ e^{qμx}·e^{−μx} = pμ e^{−pμx},

so Y ∈ Γ(1, 1/(pμ)) is exponentially distributed of frequency

    g(x) = pμ e^{−pμx} for x > 0, and g(x) = 0 for x ≤ 0,

and distribution function

    G(x) = 1 − e^{−pμx} for x > 0, and G(x) = 0 for x ≤ 0.
 
3) Since Y ∈ Γ(1, 1/(pμ)), we have

    E{Y} = 1/(pμ) and V{Y} = 1/(p²μ²).
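The collapse of the series in 2. to an exponential density can also be checked numerically; the following sketch (parameter values and truncation level are arbitrary choices of mine) compares a truncated version of the series with pμ e^{−pμx}.

```python
from math import exp, factorial

def g_series(x, p, mu, terms=150):
    """Truncated series sum_{n>=1} p q^{n-1} f_n(x), where f_n is the
    Erlang (Gamma(n, 1/mu)) density mu (mu x)^{n-1}/(n-1)! e^{-mu x}."""
    q = 1.0 - p
    return sum(p * q ** (n - 1) * mu * (mu * x) ** (n - 1) / factorial(n - 1)
               * exp(-mu * x) for n in range(1, terms + 1))

p, mu, x = 0.3, 2.0, 1.5
series = g_series(x, p, mu)
closed_form = p * mu * exp(-p * mu * x)   # the density pμ e^{-pμx} found above
```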

Example 4.10 An old-fashioned shop with one shop assistant to serve the customers can be considered as a queueing system of one channel with the possibility of forming a queue. The customers arrive according to a Poisson process of intensity λ, and the service time is exponentially distributed of parameter μ. It has been noticed that when the system is in its equilibrium, then the shop assistant is in mean busy 3/4 of the time, and the average staying time of customers is 10 minutes.
1. Prove that 1/λ = 1/18 hour and 1/μ = 1/24 hour.
2. Find the probability that a customer is served immediately.
3. Find the average queue length.
The shop is closed at 1730, and only the customers who are already in the shop are served by the shop assistant before he leaves for his home.
4. Find the probability that there at 1730 are 0, 1, 2, . . . customers in the shop.
5. Let the random variable T denote the time from 1730 until the shop assistant has served all customers. Find the distribution of T.

It follows from λ_k = λ and μ_k = μ that

    μ p_{k+1} = λ p_k,    k ∈ N_0.

The traffic intensity is

    ρ = λ/(Nμ) = λ/μ,

which we assume satisfies ρ < 1, so p_0 = 1 − ρ. Thus

    p_k = (λ/μ)·p_{k−1} = · · · = (λ/μ)^k·p_0 = ρ^k (1 − ρ).

1) The staying time is

    O = 1/(μ − λ) = 10 minutes = 1/6 hour,

and the shop assistant is busy

    3/4 = 1 − p_0 = ρ = λ/μ.

Hence λ = (3/4)μ and 6 = μ − λ = (1/4)μ, thus μ = 24 and λ = (3/4)·24 = 18, corresponding to

    1/λ = 1/18 hour and 1/μ = 1/24 hour.

2) A customer is immediately served if the system is in state E_0. The probability of this event is

    p_0 = 1 − ρ = 1 − 3/4 = 1/4.


3) The average queue length is

    ρ²/(1 − ρ) = (9/16)/(1 − 3/4) = 9/4.

4) The probability that there are n customers in the shop at 1730 (t ≈ ∞) is

    p_n = ρ^n (1 − ρ) = (1/4)·(3/4)^n.

5) Assume that there are k customers in the shop. Then the service time is Erlang distributed, Γ(k, 1/μ), of frequency

    μ·(μx)^{k−1}/(k − 1)!·e^{−μx},    x > 0, k ∈ N.

It follows that the distribution of T is given by

    P{T = 0} = 1/4,

and, for x > 0, the frequency

    f_T(x) = Σ_{k=1}^∞ (1/4)(3/4)^k·μ·(μx)^{k−1}/(k − 1)!·e^{−μx}
           = (3/16)·μ e^{−μx} Σ_{k=1}^∞ ((3/4)μx)^{k−1}/(k − 1)!
           = (3/16)·μ e^{−μx}·exp((3/4)μx) = (3/16)·μ·exp(−(1/4)μx).

Then by an integration,

    P{T ≤ x} = 1 − (3/4)·exp(−(μ/4)x) for x ≥ 0, and P{T ≤ x} = 0 for x < 0.

When we insert μ = 24, found above, we get

    P{T ≤ x} = 1 − (3/4)·e^{−6x} for x ≥ 0, and P{T ≤ x} = 0 for x < 0.

Alternatively, T has the Laplace transform

    L_T(λ) = P(L(λ)),

where

    L(λ) = μ/(λ + μ)

and

    P(s) = Σ_{k=0}^∞ p_k s^k = (1/4) Σ_{k=0}^∞ ((3/4)s)^k = (1/4)·1/(1 − (3/4)s) = 1/(4 − 3s).

Hence by insertion,

    L_T(λ) = 1/(4 − 3μ/(λ + μ)) = (λ + μ)/(4λ + μ) = (1/4)·1 + (3/4)·(μ/4)/(λ + μ/4).

We recognize this Laplace transform as corresponding to

    F_T(x) = 1 − (3/4)·exp(−(μ/4)x) for x ≥ 0, and F_T(x) = 0 for x < 0.


Example 4.11 Given a service, where we assume:


a. There are two channels.
b. The customers arrive by a Poisson process of intensity 1 min−1 .
c. The service time is at each of the two channels exponentially distributed of mean 1 minute.
d. It is possible to form a queue.
1. Compute the average waiting time.
2. Find the fraction of time, in which both channels are vacant, and the fraction of time, in which
both channels are busy.
The flow of customers is then increased such that the customers now arrive according to a Poisson
process of intensity λ = 2 min−1 (the other assumptions are unchanged).
3. What is the impact of this change on the service?
The service is then augmented by another channel of the same type as the old ones.
4. Compute in this system for λ = 2 the average waiting time.

1) The process is described by a birth and death process with

    λ_k = 1 and μ_1 = 1, μ_k = 2 for k ≥ N = 2, thus μ = 1.

The traffic intensity is

    ρ = λ/(Nμ) = 1/2.

We have

    p_0 = (1 − ρ)/(1 + ρ) = 1/3 and p_k = 2ρ^k·(1 − ρ)/(1 + ρ) = (1/3)·(1/2)^{k−1} for k ∈ N.

The waiting time is given by

    V = p_0·N^{N−1}·ρ^N/(μ·N!(1 − ρ)²) = (1/3)·2·(1/2)²/(1·2!·(1 − 1/2)²) = 1/3.

2) Both channels are vacant in the fraction of time

    p_0 = 1/3.

Both channels are busy in the fraction of time

    Σ_{k=2}^∞ p_k = 1 − p_0 − p_1 = 1 − 1/3 − 1/3 = 1/3.


3) The only change in the new system is λ_k = 2, thus

    λ_k = 2 and μ_1 = 1, μ_k = 2 for k ≥ 2, and μ = 1.

The traffic intensity is

    ρ = λ/(Nμ) = 2/2 = 1.

The queue will increase indefinitely.

4) Then we shift to N = 3 with λ = 2 and μ = 1, so

    λ_k = 2, μ_1 = 1, μ_2 = 2 and μ_k = 3 for k ≥ 3.

The traffic intensity is

    ρ = λ/(Nμ) = 2/(3·1) = 2/3.

It follows from

    p_k = ρ^k·(N^k/k!)·p_0 for k < N, and p_k = ρ^k·(N^N/N!)·p_0 for k ≥ N,

that

    p_1 = (2/3)·3·p_0 = 2p_0 and p_2 = (2/3)²·(3²/2!)·p_0 = 2p_0,

and

    p_k = (2/3)^k·(3³/3!)·p_0 = 2·(2/3)^{k−2}·p_0 for k ≥ 3.

The sum is

    1 = Σ_{k=0}^∞ p_k = p_0 {1 + 2 + 2 + 2 Σ_{k=3}^∞ (2/3)^{k−2}} = p_0 {5 + 2·2} = 9p_0,

from which p_0 = 1/9. The waiting time is obtained by insertion,

    V = p_0·N^{N−1}·ρ^N/(μ·N!(1 − ρ)²) = (1/9)·3²·(2/3)³/(1·3!·(1 − 2/3)²) = 4/9.
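The waiting-time formula used in 1. and 4. can be wrapped in a small helper (a sketch with my own naming); it reproduces V = 1/3 for (N, λ, μ) = (2, 1, 1) and V = 4/9 for (3, 2, 1).

```python
from math import factorial

def mmn_wait(N, lam, mu):
    """Average waiting time in an M/M/N queue with rho = lam/(N mu) < 1:
    p0 = 1 / (sum_{k<N} (N rho)^k/k! + (N rho)^N/(N!(1-rho))),
    V  = p0 * N^(N-1) * rho^N / (mu * N! * (1-rho)^2)."""
    rho = lam / (N * mu)
    p0 = 1.0 / (sum((N * rho) ** k / factorial(k) for k in range(N))
                + (N * rho) ** N / (factorial(N) * (1.0 - rho)))
    return p0 * N ** (N - 1) * rho ** N / (mu * factorial(N) * (1.0 - rho) ** 2)
```

For ρ ≥ 1 the formula is meaningless, matching the observation in 3. that the queue then grows indefinitely.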


Example 4.12 Given a service for which


a. There are three channels.
b. The customers arrive according to a Poisson process of intensity 1 min −1 .
c. The service time for each channel is exponentially distributed of mean 1 minute.
d. It is possible to form a queue.
1. Prove that the stationary probabilities are given by

    p_k = (4/11)·(1/k!) for k < 3, and p_k = (2/33)·(1/3)^{k−3} for k ≥ 3.

2. Find the fraction of time, in which all three channels are busy.
3. Compute the average length of the queue.
Decrease the number of channels to two while the other assumptions are unchanged. Compute in this
system,
4. the stationary probabilities,
5. the fraction of time, in which both channels are busy,
6. the average length of the queue.
Finally, decrease the number of channels to one, while the other assumptions are unchanged.
7. How will this system function?

1) The traffic intensity is

    ρ = λ/(Nμ) = 1/(3·1) = 1/3.

It follows from

    p_k = ρ^k·(N^k/k!)·p_0 for k < N, and p_k = ρ^k·(N^N/N!)·p_0 for k ≥ N,

that

    p_k = (1/3)^k·(3^k/k!)·p_0 = (1/k!)·p_0 for k = 0, 1, 2, 3,

and

    p_k = (1/3)^k·(3³/3!)·p_0 = (1/6)·(1/3)^{k−3}·p_0 for k ≥ 3,


hence

    1 = Σ_{k=0}^∞ p_k = p_0 {1 + 1 + 1/2 + (1/6) Σ_{k=3}^∞ (1/3)^{k−3}} = p_0 {5/2 + (1/6)·(3/2)} = p_0 {5/2 + 1/4} = (11/4)·p_0,

from which p_0 = 4/11, thus

    p_k = (4/11)·(1/k!) for k = 0, 1, 2, and p_k = (2/33)·(1/3)^{k−3} for k ≥ 3.

2) The fraction of time, in which all three channels are busy, is given by

    Σ_{k=3}^∞ p_k = (2/33) Σ_{k=3}^∞ (1/3)^{k−3} = (2/33)·(3/2) = 1/11.

Alternatively, it is given by

    1 − p_0 − p_1 − p_2 = 1 − 4/11 − 4/11 − 2/11 = 1/11.


3) The average length of the queue is

    Σ_{k=4}^∞ (k − 3) p_k = (2/33) Σ_{k=4}^∞ (k − 3)·(1/3)^{k−3} = (2/33) Σ_{k=1}^∞ k (1/3)^k = (2/33)·(1/3)·1/(1 − 1/3)² = (2/33)·(3/4) = 1/22.

4) If N = 2, then ρ = 1/2. The stationary probabilities are

    p_0 = (1 − ρ)/(1 + ρ) = 1/3, and p_k = 2ρ^k·(1 − ρ)/(1 + ρ) = (1/3)·(1/2)^{k−1}, k ∈ N.

5) The fraction of time, in which both channels are busy, is

    1 − p_0 − p_1 = 1 − 1/3 − 1/3 = 1/3.
3 3 3


6) The average length of the queue is

    Σ_{k=3}^∞ (k − 2) p_k = Σ_{k=3}^∞ (k − 2)·(1/3)(1/2)^{k−1} = (1/12) Σ_{k=1}^∞ k (1/2)^{k−1} = (1/12)·1/(1 − 1/2)² = 1/3.

7) If there is only one channel, the traffic intensity becomes ρ = 1, and the queue is increasing indefinitely.
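The queue lengths in 3. and 6. can be confirmed by truncating the stationary distribution (a sketch; K = 500 terms is far more than needed for ρ < 1):

```python
from math import factorial

def mmn_queue_length(N, rho, K=500):
    """Average queue length of M/M/N with rho = lam/(N mu) < 1, using
    p_k proportional to (N rho)^k/k! for k < N and rho^k N^N/N! for k >= N."""
    w = [(N * rho) ** k / factorial(k) if k < N
         else rho ** k * N ** N / factorial(N) for k in range(K)]
    return sum((k - N) * w[k] for k in range(N + 1, K)) / sum(w)
```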

Example 4.13 A shop serves M customers, and there is one shop assistant in the shop. It is possible
to form a queue. We assume that the service time is exponentially distributed of mean 1/μ. Assume
also that if a customer is not in the shop at time t, then there is the probability λh + hε(h) [where λ
is a positive constant] that this customer arrives to the shop before the time t + h. Finally, assume
that the customers arrive to the shop mutually independent of each other. Thus we have a birth and
death process {X(t), t ∈ [0, ∞[} of the states E0 , E1 , . . . , EM , where Ek denotes the state that there
are k customers in the shop, k = 0, 1, 2, . . . , M .
1) Prove that the birth intensities λk and death intensities μk , k = 0, 1, 2, . . . , M , are given by

    λ_k = (M − k)λ, and μ_k = 0 for k = 0, μ_k = μ for k = 1, 2, . . . , M.

2) Find the equations of the stationary probabilities pk , k = 0, 1, 2, . . . , M .


3) Express the stationary probabilities pk , k = 0, 1, 2, . . . , M , by means of p0 .
4) Compute the stationary probabilities pk , k = 0, 1, 2, . . . M .
5) Find, expressed by the stationary probability p0 , the average number of customers, who are not in
the shop.
6) Compute the stationary probabilities, first in the case, when λ/μ = 1 and M = 5, and then in the case, when λ/μ = 1/2 and M = 5.

1) If we are in state Ek , then M − k of the customers are not in the shop. They arrive to the shop
before time t + h of probability
(M − k){λ + ε(h)}h,
(a time interval of length h, and we divide by h before we go to the limit h → 0). Hence, the birth
intensity is
λk = (M − k)λ, k = 0, 1, . . . , M.
If we are in state E0 , then no customer is served, so μ0 = 0.
In any other state precisely one customer is served with the intensity μ, so
μk = μ, k = 1, 2, . . . , M.


2) The equations of the stationary probabilities are

    μ_{k+1} p_{k+1} = λ_k p_k.

Thus, in the explicit case,

    p_{k+1} = (M − k)·(λ/μ)·p_k.

3) We get successively

    p_0 = p_0,  p_1 = M·(λ/μ)·p_0,  p_2 = M(M − 1)·(λ/μ)²·p_0,

and in general

    p_k = (M!/(M − k)!)·(λ/μ)^k·p_0,    k = 0, 1, 2, . . . , M.

4) It follows from the equation

    1 = Σ_{k=0}^M p_k = M! p_0 Σ_{k=0}^M (1/(M − k)!)·(λ/μ)^k = M!·(λ/μ)^M·p_0 Σ_{k=0}^M (1/k!)·(μ/λ)^k

that

    p_0 = (μ/λ)^M / (M! Σ_{k=0}^M (1/k!)·(μ/λ)^k),

and hence

    p_k = (M!/(M − k)!)·(λ/μ)^k·p_0 = (1/(M − k)!)·(μ/λ)^{M−k} / (Σ_{j=0}^M (1/j!)·(μ/λ)^j),    k = 0, 1, . . . , M.

5) The average number of customers who are not in the shop is by e.g. 3.,

    Σ_{k=0}^M (M − k) p_k = Σ_{k=0}^{M−1} (M!/(M − k − 1)!)·(λ/μ)^k·p_0 = (μ/λ) Σ_{k=1}^M (M!/(M − k)!)·(λ/μ)^k·p_0
                          = (μ/λ) Σ_{k=1}^M p_k = (μ/λ)(1 − p_0).
k=1

Download free ebooks at bookboon.com

85
Stochastic Processes 2 4. Queueing theory

6) If λ/μ = 1 and M = 5, then

    1 = Σ_{k=0}^5 (5!/(5 − k)!)·p_0 = {1 + 5 + 20 + 60 + 120 + 120} p_0 = 326 p_0,

and

    p = (1/326)·(1, 5, 20, 60, 120, 120).

7) When λ/μ = 1/2 and M = 5, then

    1 = Σ_{k=0}^5 (5!/(5 − k)!)·(1/2)^k·p_0 = {1 + 5/2 + 5 + 15/2 + 15/2 + 15/4} p_0 = (109/4)·p_0,

and

    p = (4/109)·(1, 5/2, 5, 15/2, 15/2, 15/4) = (1/109)·(4, 10, 20, 30, 30, 15).
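Both cases in 6. and 7. follow from one small routine (a sketch; the function name is my own):

```python
from math import factorial

def closed_shop(M, rho):
    """Stationary distribution p_k proportional to M!/(M-k)! * rho^k for the
    closed system with M customers and one shop assistant, rho = lambda/mu."""
    w = [factorial(M) / factorial(M - k) * rho ** k for k in range(M + 1)]
    s = sum(w)
    return [x / s for x in w]

p_one = closed_shop(5, 1.0)    # (1, 5, 20, 60, 120, 120)/326
p_half = closed_shop(5, 0.5)   # (4, 10, 20, 30, 30, 15)/109
```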


Example 4.14 Given two queueing systems, A and B, which are mutually independent. We assume
for each of the two systems:
a. there is one channel,
b. it is possible to form a queue,
c. the customers arrive according to a Poisson process of intensity λ,
d. the service times are exponentially distributed of parameter μ,
e. the traffic intensity is ρ = λ/μ = 1/2.
Denote by X1 the random variable which indicates the number of customers in system A, and by X 2
the random variables which indicates the number of customers in system B.
1. Compute by using the stationary probabilities,

P {X1 = k} and P {X2 = k} , k ∈ N0 .

Let Z = X1 + X2 denote the total number of customers in the two systems.


2. Compute P {Z = k}, k ∈ N0 .
3. Compute the mean of Z
Consider another queuing system C, in which we assume,
a. there are two channels,
b. it is possible to form a queue,
c. the customers arrive according to a Poisson process of intensity 2λ,
d. the service times are exponentially distributed of the parameter μ,
e. the traffic intensity is ρ = 2λ/(2μ) = 1/2.
Let the random variable Y denote the number of customers in system C.
4. Compute by using the stationary probabilities,

P {Y = k} and P {Y > k}, k ∈ N0 .

5. Compute the mean of Y .


6. Prove for all k ∈ N0 that

P {Z > k} > P {Y > k}.

Hint to 6.: One may without proof use the formula,

    Σ_{i=N}^∞ i x^{i−1} = x^{N−1}·{N − (N − 1)x}/(1 − x)²,    |x| < 1, N ∈ N.


1) The two queueing systems follow the same distribution, and N = 1 and ρ = 1/2, so we get by a known formula,

    P{X_1 = k} = P{X_2 = k} = p_k = ρ^k (1 − ρ) = (1/2)^{k+1},    k ∈ N_0.

2) A straightforward computation gives

    P{Z = k} = Σ_{j=0}^k P{X_1 = j}·P{X_2 = k − j} = Σ_{j=0}^k (1/2)^{j+1}·(1/2)^{k−j+1} = (k + 1)·(1/2)^{k+2},    k ∈ N_0.

3) It follows from

    E{X_1} = E{X_2} = Σ_{k=1}^∞ k (1/2)^{k+1} = (1/4) Σ_{k=1}^∞ k (1/2)^{k−1} = (1/4)·1/(1 − 1/2)² = 1,

that

    E{Z} = Σ_{k=1}^∞ k(k + 1)·(1/2)^{k+2} = Σ_{k=2}^∞ k(k − 1)·(1/2)^{k+1} = (1/8) Σ_{k=2}^∞ k(k − 1)·(1/2)^{k−2} = (1/8)·2!/(1 − 1/2)³ = 2.

4) Roughly speaking, A and B are joined to get C, so we have N = 2 and ρ = 1/2. Then it follows that

    P{Y = 0} = p_0 = (1 − ρ)/(1 + ρ) = 1/3,

and

    P{Y = k} = 2ρ^k·(1 − ρ)/(1 + ρ) = (1/3)·(1/2)^{k−1},    k ∈ N.

Thus

    P{Y > k} = Σ_{j=k+1}^∞ (1/3)·(1/2)^{j−1} = (1/3)·(1/2)^k·1/(1 − 1/2) = (1/3)·(1/2)^{k−1},    k ∈ N_0.

5) The mean is

    E{Y} = Σ_{k=1}^∞ k·(1/3)·(1/2)^{k−1} = (1/3)·1/(1 − 1/2)² = 4/3.


6) It follows from 2. that

    P{Z > k} = Σ_{j=k+1}^∞ (j + 1)·(1/2)^{j+2} = (1/4) Σ_{j=k+2}^∞ j·(1/2)^{j−1}
             = (1/4)·(1/2)^{k+1}·{(k + 2) − (k + 1)·(1/2)}/(1 − 1/2)²
             = (1/2)^{k+2}·{2k + 4 − k − 1} = ((k + 3)/8)·(1/2)^{k−1} > (1/3)·(1/2)^{k−1} = P{Y > k}.

We notice that P {Y = k} = P {Y > k} for k ∈ N, and that this is not true for k = 0.
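The inequality in 6. is easy to confirm numerically (a sketch; the range of k and the truncation are my own choices):

```python
def tail_Z(k):
    """P{Z > k} = ((k + 3)/8) (1/2)^(k-1), from 6."""
    return (k + 3) / 8 * 0.5 ** (k - 1)

def tail_Y(k):
    """P{Y > k} = (1/3) (1/2)^(k-1), from 4."""
    return (1 / 3) * 0.5 ** (k - 1)

# cross-check tail_Z against the direct sum of P{Z = j} = (j+1)(1/2)^{j+2}
direct_Z5 = sum((j + 1) * 0.5 ** (j + 2) for j in range(6, 200))  # P{Z > 5}
```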

Example 4.15 Given two mutually independent queueing systems A and B. We assume for each of
the two systems,
a. there is one channel,
b. it is possible to form a queue,
c. customers arrive to A according to a Poisson process of intensity λ_A = 1/3 minute⁻¹, and they arrive to B according to a Poisson process of intensity λ_B = 2/3 minute⁻¹,
d. the service times of both A and B are exponentially distributed of the parameter μ = 1 minute −1 .
Let the random variable XA denote the number of customers in system A, and let the random variable
XB denote the number of customers in system B. Furthermore, we let Y A and YB , resp., denote the
number of customers in the queue at A and B, resp..
1. Find by using the stationary probabilities,
P {XA = k} and P {XB = k} , k ∈ N0 .

2. Find the average waiting times at A and B, resp..


3. Find by using the stationary probabilities,
P {YA = k} and P {YB = k} , k ∈ N0 .

4. Find the means E {XA + XB } and E {YA + YB }.


5. Compute P {XA + XB = k}, k ∈ N0 .
The two queueing systems are now joined to one queueing system of two channels, where the customers
arrive according to a Poisson process of intensity λ = λA + λB , and where the serving times are
exponentially distributed of parameter μ = 1 minute−1 . Let X denote the number of customers in the
system, and let Y denote the number of customers in the queue.
6. Find by using the stationary probabilities,
P {X = k} and P {Y = k}, k ∈ N0 .

7. Find the means E{X} and E{Y }.

1A. Since λ_A = 1/3 minute⁻¹ and μ = 1 minute⁻¹, and N = 1, we get the traffic intensity ρ_A = 1/3. The stationary probabilities are

    P{X_A = k} = p_{A,k} = 2·(1/3)^{k+1},    k ∈ N_0.

1B. Analogously, λ_B = 2/3 minute⁻¹ and μ = 1 minute⁻¹, and N = 1, so ρ_B = 2/3, and

    P{X_B = k} = p_{B,k} = (1/3)·(2/3)^k = (1/2)·(2/3)^{k+1},    k ∈ N_0.


2A. The waiting time at A is given by

    V_A = ρ_A/(μ(1 − ρ_A)) = (1/3)/(1·(2/3)) = 1/2.

2B. Analogously, the waiting time at B is

    V_B = ρ_B/(μ(1 − ρ_B)) = (2/3)/(1·(1/3)) = 2.

3A. Assume that there is no queue at A. Then either there is no customer at all in the system, or there is precisely one customer, who is served for the time being,

    P{Y_A = 0} = P{X_A = 0} + P{X_A = 1} = 2·(1/3 + 1/9) = 8/9.

If k ∈ N, then

    P{Y_A = k} = P{X_A = k + 1} = 2·(1/3)^{k+2}.

3B. Analogously,

    P{Y_B = 0} = P{X_B = 0} + P{X_B = 1} = (1/3)·(1 + 2/3) = 5/9

and

    P{Y_B = k} = P{X_B = k + 1} = (1/2)·(2/3)^{k+2},    k ∈ N.

4. It follows from

    E{X_A} = 2 Σ_{k=1}^∞ k (1/3)^{k+1} = (2/9) Σ_{k=1}^∞ k (1/3)^{k−1} = (2/9)·1/(1 − 1/3)² = 1/2

and

    E{X_B} = (2/9) Σ_{k=1}^∞ k (2/3)^{k−1} = (2/9)·1/(1 − 2/3)² = 2,

that

    E{X_A + X_B} = 1/2 + 2 = 5/2.

It follows from

    E{Y_A} = 2 Σ_{k=1}^∞ k (1/3)^{k+2} = (2/27) Σ_{k=1}^∞ k (1/3)^{k−1} = (2/27)·(9/4) = 1/6


and

    E{Y_B} = (1/2) Σ_{k=1}^∞ k (2/3)^{k+2} = (4/27) Σ_{k=1}^∞ k (2/3)^{k−1} = (4/27)·9 = 4/3,

then

    E{Y_A + Y_B} = 1/6 + 4/3 = 3/2.

5. If k ∈ N_0, then

    P{X_A + X_B = k} = Σ_{j=0}^k P{X_A = j}·P{X_B = k − j} = Σ_{j=0}^k 2·(1/3)^{j+1}·(1/2)·(2/3)^{k−j+1}
                     = (1/3)^{k+2} Σ_{j=0}^k 2^{k−j+1} = (1/3)^{k+2} Σ_{n=1}^{k+1} 2^n = (1/3)^{k+2}·(2^{k+2} − 2)
                     = (2/3)^{k+2} − 2·(1/3)^{k+2} = (2/3^{k+2})·(2^{k+1} − 1).

6. The traffic intensity is

    ρ = (λ_A + λ_B)/(Nμ) = (1/3 + 2/3)/(2·1) = λ/(2μ) = 1/2.

It follows that

    P{X = 0} = p_0 = 1/3, and P{X = k} = p_k = (2/3)·(1/2)^k, k ∈ N.

Since Y = (X − 2) ∨ 0, we get

    P{Y = 0} = P{X = 0} + P{X = 1} + P{X = 2} = 1/3 + 1/3 + 1/6 = 5/6

and

    P{Y = k} = P{X = k + 2} = (2/3)·(1/2)^{k+2} = (1/6)·(1/2)^k,    k ∈ N.

7. By a straightforward computation,

    E{X} = Σ_{k=1}^∞ k·(2/3)·(1/2)^k = (1/3) Σ_{k=1}^∞ k (1/2)^{k−1} = (1/3)·1/(1 − 1/2)² = 4/3


and

    E{Y} = Σ_{k=1}^∞ k·(1/6)·(1/2)^k = (1/4)·E{X} = 1/3.
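The point of the comparison — pooling the two channels shortens the system on average (5/2 customers in the two separate systems versus 4/3 in the pooled one) — shows up directly in a short numerical check (sketch; truncation mine):

```python
# Two separate M/M/1 systems: E{X} = rho/(1 - rho)
E_XA = (1/3) / (1 - 1/3)              # 1/2
E_XB = (2/3) / (1 - 2/3)              # 2
separate = E_XA + E_XB                # 5/2

# Pooled M/M/2 system: P{X = k} = (2/3)(1/2)^k for k >= 1
pooled = sum(k * (2/3) * 0.5 ** k for k in range(1, 200))   # 4/3
```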

Example 4.16 Consider a birth and death process E0 , E1 , E2 , . . . , where the birth intensities λk
are given by
    λ_k = α/(k + 1),    k ∈ N_0,
where α is a positive constant, while the death intensities μk are given by

    μ_k = 0 for k = 0, μ_k = μ for k = 1, and μ_k = 2μ for k ≥ 2,

where μ > 0. We assume that α/μ = 8.
μ
1. Find the equations of the stationary probabilities pk , k ∈ N0 .

2. Prove that
1
pk = 2 · 4k · p0 , k ∈ N,
k!
and find p0 .
The above can be viewed as a model of the forming of a queue in a shop, where
a. there are two shop assistants,
b. the service time is exponentially distributed of mean 1/μ,
c. the frequency of the arrivals is decreasing with increasing number of customers according to the
indicated formula.
3. Compute by means of the stationary probabilities the average number of customers in the shop. (3
dec.).

4. Compute by means of the stationary probabilities the average number of busy shop assistants. (3
dec.).

5. Compute by means of the stationary probabilities the probability that there are more than two
customers in the shop. (3 dec.).

1) We have

μk+1 pk+1 = λk pk , k ∈ N0 ,


thus

    p_1 = (λ_0/μ_1)·p_0 = (α/μ)·p_0 = 8p_0

and

    p_k = (λ_{k−1}/μ_k)·p_{k−1} = (α/k)·(1/(2μ))·p_{k−1} = (4/k)·p_{k−1} for k ≥ 2.

2) If k = 1, then

41
p1 = 8p0 = 2 · p0 ,
1!
and the formula is true for k = 1. Then assume that
1
pk−1 = 2 · 4k−1 · p0 .
(k − 1)!

Then
4 4k
pk = pk−1 = 2 · p0 ,
k k!
and the formula follows by induction.


It follows from

    1 = Σ_{k=0}^∞ p_k = p_0 {1 + 2 Σ_{k=1}^∞ 4^k/k!} = p_0 (2e⁴ − 1)

that

    p_0 = 1/(2e⁴ − 1).

3) The task is now changed to queueing theory. Since p_k is the probability that there are k customers in the shop, the mean of the number of customers in the shop is

    Σ_{k=1}^∞ k p_k = 2·4·p_0 Σ_{k=1}^∞ 4^{k−1}/(k − 1)! = 8e⁴/(2e⁴ − 1) ≈ 4.037.

4) The average number of busy shop assistants is

    0·p_0 + 1·p_1 + 2 Σ_{k=2}^∞ p_k = p_1 + 2(1 − p_0 − p_1) = 2 − 2p_0 − p_1 = 2 − 2p_0 − 8p_0
    = 2 − 10p_0 = 2 − 10/(2e⁴ − 1) ≈ 1.908.

5) The probability that there are more than two customers in the shop is

    Σ_{k=3}^∞ p_k = 1 − p_0 − p_1 − p_2 = 1 − p_0 {1 + 8 + 32/2} = 1 − 25/(2e⁴ − 1) ≈ 0.769.
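The three numbers in 3.–5. can be reproduced by truncating the distribution p_k = 2·4^k/k!·p_0 (a sketch; the truncation level is my own choice):

```python
from math import exp

fact = 1.0
w = [1.0]
for k in range(1, 120):
    fact *= k
    w.append(2 * 4.0 ** k / fact)   # p_k proportional to 2 * 4^k / k!
s = sum(w)                          # should equal 2 e^4 - 1
p = [x / s for x in w]

mean_customers = sum(k * p[k] for k in range(len(p)))   # ~ 4.037
busy_assistants = 2 - 10 * p[0]                         # ~ 1.908
more_than_two = 1 - p[0] - p[1] - p[2]                  # ~ 0.769
```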


Example 4.17 Consider a birth and death process of the states E0 , E1 , E2 , . . . , where the birth
intensities λk are given by

    λ_k = 2λ for k = 0, and λ_k = λ for k ∈ N,

while the death intensities μk are given by



    μ_k = 0 for k = 0, and μ_k = μ for k ∈ N.

Here, λ and μ are positive constants, and we assume everywhere that λ/μ = 3/4.
1. Find the equations of the stationary probabilities, and prove that the stationary probabilities are
given by
    p_k = 2·(3/4)^k·p_0,    k = 1, 2, 3, . . . ,
and finally, find p0 .
The above can be considered as a model of forming queues in a shop, where
a. there is one shop assistant,
b. the service time is exponentially distributed of mean 1/μ,
c. the customers arrive according to a Poisson process of intensity 2λ. However, if there already are
customers in the shop, then half of the arriving customers will immediately leave the shop without
being served.
2. Compute by means of the stationary probabilities the average number of customers in the shop.
3. Compute by means of the stationary probabilities the average number of customers in the queue.
We now assume that instead of one shop assistant there are two shop assistants and that all arriving
customers are served (thus we have the birth intensities λk = 2λ, k ∈ N0 ).
4. Compute in this queueing system the stationary probabilities and then find the average number of
customers in the queue.

1) The equations of the stationary probabilities are

μk+1 pk+1 = λk pk , k ∈ N0 ,

thus

    p_1 = (2λ/μ)·p_0 = (3/2)·p_0 = 2·(3/4)·p_0,

and

    p_k = (λ/μ)·p_{k−1} = (3/4)·p_{k−1},    k ≥ 2,


hence by recursion,

    p_k = (3/4)^{k−1}·p_1 = 2·(3/4)^k·p_0,    k ≥ 2.

We get

    1 = Σ_{k=0}^∞ p_k = p_0 {1 + 2 Σ_{k=1}^∞ (3/4)^k} = p_0 {1 + 2·(3/4)·4} = 7p_0,

so

    p_0 = 1/7 and p_k = (2/7)·(3/4)^k, k ∈ N.

2) Since p_k is the probability that there are k customers in the shop, the average number of customers in the shop is

    Σ_{k=1}^∞ k p_k = (2/7)·(3/4) Σ_{k=1}^∞ k (3/4)^{k−1} = (3/14)·1/(1 − 3/4)² = (3/14)·16 = 24/7.

3) If there are k customers in the queue, there must also be 1 customer, who is being served, so the average is

    Σ_{k=1}^∞ k p_{k+1} = (2/7)·(3/4) Σ_{k=1}^∞ k (3/4)^k = (3/4)·(24/7) = 18/7,

where we have used the result of 2.

4) The traffic intensity is ρ = 2λ/(2μ) = 3/4, and since N = 2, we get

    p_0 = (1 − ρ)/(1 + ρ) = 1/7 and p_k = 2ρ^k·(1 − ρ)/(1 + ρ) = (2/7)·(3/4)^k, k ∈ N.

We see that they are identical with the stationary probabilities found in 1.
The average length of the queue is given by (and here we get to the divergence from the previous case)

    Σ_{k=3}^∞ (k − 2) p_k = (2/7) Σ_{k=3}^∞ (k − 2)·(3/4)^k = (2/7)·(3/4)² Σ_{k=1}^∞ k (3/4)^k = (2/7)·(9/16)·12 = 27/14.
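All three averages — 24/7 customers in the shop, 18/7 in the queue with one assistant, 27/14 with two — come out of the same geometric weights (a sketch; truncation mine):

```python
K = 400
rho = 0.75
w = [1.0] + [2 * rho ** k for k in range(1, K)]   # p_0 ~ 1, p_k ~ 2 (3/4)^k
s = sum(w)                                        # should equal 7
p = [x / s for x in w]

in_shop = sum(k * p[k] for k in range(K))                     # 24/7
queue_one_server = sum((k - 1) * p[k] for k in range(2, K))   # 18/7
queue_two_servers = sum((k - 2) * p[k] for k in range(3, K))  # 27/14
```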


Example 4.18 Consider a birth and death process of states E0 , E1 , E2 , . . . , and with birth intensities
λk given by


    λ_k = α for k = 0, 1, and λ_k = α/k for k ≥ 2,
where α is a positive constant, and where the death intensities are given by

0, k = 0,
μk =
μ, k ∈ N,
where μ > 0.
We assume in the following that α/μ = 2.
1. Find the equations of the stationary probabilities pk , k ∈ N0 .
2. Prove that
    p_k = (2^k/(k − 1)!)·p_0,    k ∈ N,
and find p0 .
The above can be considered as a model of forming a queue in a shop where
a. there is one shop assistant,
b. the serving time is exponentially distributed of mean 1/μ,
c. the frequency of arrivals decreases with increasing number of customers according to the formula for λ_k above.
3. Compute by means of the stationary probabilities the average length of the queue (3 dec.).
4. Compute by means of the stationary probabilities the average number of customers in the shop (3
dec.).

1) We have

    μ_{k+1} p_{k+1} = λ_k p_k,  k ∈ N_0,  and  Σ_{k=0}^∞ p_k = 1.

Hence, successively,

    μ p_1 = α p_0,  μ p_2 = α p_1,  and  μ p_k = (α/(k − 1))·p_{k−1} for k ≥ 3.

It follows from α/μ = 2 that

(6)    p_1 = 2p_0,  p_2 = 2p_1,  p_k = (2/(k − 1))·p_{k−1}, k ≥ 3,  and  Σ_{k=0}^∞ p_k = 1.

Download free ebooks at bookboon.com

98
Stochastic Processes 2 4. Queueing theory

2) We infer from (6) that p_1 = 2p_0 and p_2 = 2p_1 = 4p_0, and for k ≥ 3,

    p_k = (2/(k − 1))·p_{k−1} = (2²/((k − 1)(k − 2)))·p_{k−2} = · · · = (2^{k−2}/(k − 1)!)·p_2 = (2^k/(k − 1)!)·p_0.

A check shows that the latter formula is also true for k = 1 and k = 2, thus

    p_k = (2^k/(k − 1)!)·p_0,    k ∈ N.

Then we find p_0 from

    1 = Σ_{k=0}^∞ p_k = p_0 {1 + Σ_{k=1}^∞ 2^k/(k − 1)!} = p_0 {1 + 2 Σ_{k=1}^∞ 2^{k−1}/(k − 1)!} = p_0 (1 + 2e²),

thus

    p_0 = 1/(1 + 2e²) (≈ 0.0634), and p_k = (2^k/(k − 1)!)·1/(1 + 2e²), k ∈ N.


3) The average length of the queue is (notice that since 1 customer is served, we have here k − 1 instead of k),

    Σ_{k=2}^∞ (k − 1) p_k = Σ_{k=2}^∞ (2^k/(k − 2)!)·p_0 = 4 Σ_{k=2}^∞ (2^{k−2}/(k − 2)!)·p_0 = 4e² p_0 = 4e²/(1 + 2e²) ≈ 1.873.

4) The average number of customers is

    Σ_{k=1}^∞ k p_k = Σ_{k=1}^∞ (k − 1) p_k + Σ_{k=1}^∞ p_k = 4e² p_0 + (1 − p_0)
                    = 4e²/(1 + 2e²) + 2e²/(1 + 2e²) = 6e²/(1 + 2e²) ≈ 2.810.
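A numerical check of 3. and 4. (a sketch; the truncation level is my own choice):

```python
from math import exp

fact = 1.0                      # holds (k-1)!, built incrementally
w = [1.0, 2.0]                  # p_0 ~ 1, p_1 ~ 2
for k in range(2, 120):
    fact *= (k - 1)
    w.append(2.0 ** k / fact)   # p_k proportional to 2^k/(k-1)!
s = sum(w)                      # should equal 1 + 2 e^2
p = [x / s for x in w]

queue = sum((k - 1) * p[k] for k in range(2, len(p)))   # ~ 1.873
in_shop = sum(k * p[k] for k in range(1, len(p)))       # ~ 2.810
```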

Example 4.19 Given a queueing system, for which


a. there is one shop assistant,
b. it is possible to form a queue,
c. the customers arrive according to a Poisson process of intensity λ,
d. the serving times are exponentially distributed of parameter μ,
e. the traffic intensity is λ/μ = 2/3.
Let the random variable X denote the number of customers in the system, and let Y denote the number
of customers in the queue.
1. Find by means of the stationary probabilities,

P {X = k} and P {Y = k}, k ∈ N0 .

2. Find the means E{X} and E{Y }.


The system is changed by introducing another shop assistant, whenever there are 3 or more customers
in the shop; this extra shop assistant is withdrawn after ending his service, if the number of customers
then is smaller than 3. The other assumptions are unchanged.
3. Explain why this new system can be described by a birth and death process of states E 0 , E1 , E2 ,
. . . , birth intensities λk = λ, k ∈ N0 , and death intensities μk given by

    μ_k = 0 for k = 0, μ_k = μ for k = 1, 2, and μ_k = 2μ for k = 3, 4, . . . .

4. Find the stationary probabilities pk of this system.


5. Find the average number of customers in the system,


kpk .
k=1


1) Since N = 1, it follows that

    p_k = (1/3)·(2/3)^k,    k ∈ N_0,

thus

    P{X = k} = p_k = (1/3)·(2/3)^k,    k ∈ N_0,

and

    P{Y = 0} = P{X = 0} + P{X = 1} = (1/3)·(1 + 2/3) = 5/9,
    P{Y = k} = P{X = k + 1} = (1/3)·(2/3)^{k+1},    k ∈ N.

2) The means are

    E{X} = Σ_{k=1}^∞ k p_k = (2/9) Σ_{k=1}^∞ k (2/3)^{k−1} = (2/9)·1/(1 − 2/3)² = 2,

and

    E{Y} = Σ_{k=1}^∞ k·(1/3)·(2/3)^{k+1} = (2/3)·E{X} = 4/3.
3) The birth intensities λk = λ, k ∈ N0 , are clearly not changed, and μ0 = 0, μ1 = μ2 = μ. When


k ≥ 3, another shop assistant is also serving the customers, so μk = 2μ for k ≥ 3.
4) We have

    μ_{k+1} p_{k+1} = λ_k p_k.

Thus we get the equations

    p_1 = (λ/μ)·p_0 = (2/3)·p_0,  p_2 = (λ/μ)·p_1 = (2/3)·p_1,

and

    p_{k+1} = (λ/(2μ))·p_k = (1/3)·p_k,    k ≥ 2.

Hence

    p_1 = (2/3)·p_0,  p_2 = (4/9)·p_0,

and

    p_k = (1/3)^{k−2}·p_2 = 4·(1/3)^k·p_0 for k ≥ 3.


It follows from

   1 = Σ_{k=0}^∞ pk = p0 { 1 + 2/3 + 4/9 + 4 Σ_{k=3}^∞ (1/3)^k } = p0 { 5/3 + (4/9) Σ_{j=0}^∞ (1/3)^j }
     = p0 { 5/3 + (4/9) · 1/(1 - 1/3) } = p0 { 5/3 + (4/9)(3/2) } = p0 { 5/3 + 2/3 } = (7/3) p0 ,

that

   p0 = 3/7 ,    p1 = 2/7 ,    p2 = 4/21 ,

and

   pk = (4/7) (1/3)^{k-1} ,   k ≥ 3.

5) The average number of customers is

   Σ_{k=1}^∞ k pk = 2/7 + 8/21 + (4/7) Σ_{k=3}^∞ k (1/3)^{k-1}
                  = (6 + 8)/21 + (4/7) { Σ_{k=1}^∞ k (1/3)^{k-1} - 1 - 2/3 }
                  = 2/3 + (4/7) { 1/(1 - 1/3)^2 - 5/3 } = 2/3 + (4/7) { 9/4 - 5/3 }
                  = 2/3 + (4/7) · (27 - 20)/12 = 2/3 + 1/3 = 1.

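The recursion μ_{k+1} pk+1 = λ_k pk used above also lends itself to a direct numerical check. The sketch below (an illustration only) truncates the state space at a large K, where the tail 4 (1/3)^k p0 is negligible; λ = 2, μ = 3 is an assumed concrete pair with λ/μ = 2/3:

```python
from fractions import Fraction

def stationary(lam, mu, K=200):
    """Stationary probabilities of the birth and death process with
    lambda_k = lam, mu_k = mu for k = 1, 2 and mu_k = 2*mu for k >= 3,
    obtained from mu_{k+1} p_{k+1} = lambda_k p_k and truncated at K."""
    p = [Fraction(1)]                     # p_0, up to normalisation
    for k in range(K):
        death = mu if k + 1 <= 2 else 2 * mu
        p.append(p[-1] * Fraction(lam, death))
    total = sum(p)
    return [x / total for x in p]

p = stationary(2, 3)                      # assumed pair with lam/mu = 2/3
p0 = float(p[0])                          # close to 3/7
mean = float(sum(k * q for k, q in enumerate(p)))   # close to 1
```

The truncation error is of order (1/3)^K, so p0 and the mean agree with 3/7 and 1 to machine precision.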

Example 4.20 Given a queueing system for which

a. there is one channel,

b. there is the possibility of an (unlimited) queue,

c. the customers arrive according to a Poisson process of intensity λ,

d. the service times are exponentially distributed of parameter μ,


e. the traffic intensity is λ/μ = 4/5.
Let the random variable X denote the number of customers in the system.

1. Find by using the stationary probabilities,

P {X = k} and P {X > k}, k ∈ N0 .

2. Find the mean E{X}.


We then change the system, such that there is only room for at most 3 waiting customers, thus only
room for 4 customers in total in the system (1 being served and 3 waiting). The other conditions are
unchanged. This system can be described by a birth and death process of the states E 0 , E1 , E2 , E3 ,
E4 and

   birth intensities:  λk = λ for k = 0, 1, 2, 3,    λk = 0 for k = 4,

   death intensities:  μk = 0 for k = 0,    μk = μ for k = 1, 2, 3, 4.
Let the random variable Y denote the number of customers in this system.

3. Find by means of the stationary probabilities,

P {Y = k}, k = 0, 1, 2, 3, 4, (3 dec.).

4. Find the mean E{Y} (3 dec.).

Now the intensity of arrivals λ is doubled, while the other assumptions are the same as above. This will
imply that the probability of rejection becomes too big, so one decides to hire another shop assistant.
Then the system can be described by a birth and death process with states E 0 , E1 , E2 , E3 , E4 , E5 ,
(where E5 corresponds to 2 customers being served and 3 waiting).
5. Find the equations of this system of the stationary probabilities p0 , p1 , p2 , p3 , p4 , p5 .

6. Find the stationary probabilities (3 dec.).

1) We have
   P{X = k} = pk = ϱ^k (1 - ϱ) = (1/5) (4/5)^k ,   k ∈ N0 ,


hence
   P{X > k} = Σ_{j=k+1}^∞ (1/5) (4/5)^j = (1/5) · (4/5)^{k+1} / (1 - 4/5) = (4/5)^{k+1} ,   k ∈ N0 .

2) The mean is
   E{X} = (1/5)(4/5) Σ_{k=1}^∞ k (4/5)^{k-1} = (4/25) · 1/(1 - 4/5)^2 = 4.

3) It follows from

   μk+1 pk+1 = λk pk ,

that

   p1 = (λ/μ) p0 = (4/5) p0 ,    p2 = (4/5)^2 p0 ,    p3 = (4/5)^3 p0 ,    p4 = (4/5)^4 p0 ,

hence

   1 = p0 { 1 + 4/5 + (4/5)^2 + (4/5)^3 + (4/5)^4 } = p0 · (1 - (4/5)^5)/(1 - 4/5) = p0 { 5 - 4 (4/5)^4 } ,

and

   P{Y = 0} = p0 = 1/(5 - 4 (4/5)^4) ≈ 0.297,
   P{Y = 1} = p1 = (4/5) p0 ≈ 0.238,
   P{Y = 2} = p2 = (4/5) p1 ≈ 0.190,
   P{Y = 3} = p3 = (4/5) p2 ≈ 0.152,
   P{Y = 4} = p4 = (4/5) p3 ≈ 0.122.

4) The mean is
   E{Y} = 1 · p1 + 2 p2 + 3 p3 + 4 p4 = { 4/5 + 2 (4/5)^2 + 3 (4/5)^3 + 4 (4/5)^4 } p0 ≈ 1.563.


5) The birth intensities are

   λk = 2λ for k = 0, 1, 2, 3, 4,    λk = 0 for k = 5,

and the death intensities are

   μk = 0 for k = 0,    μk = μ for k = 1,    μk = 2μ for k = 2, 3, 4, 5.

It follows from

   μk+1 pk+1 = λk pk ,

that

   p1 = (2λ/μ) p0 = (8/5) p0 ,

and

   pk = (2λ/(2μ)) pk-1 = (4/5) pk-1   for k = 2, 3, 4, 5.

6) Now

   pk = 2 (4/5)^k p0   for k = 1, 2, 3, 4, 5,

thus

   1 = p0 { 1 + (8/5) ( 1 + 4/5 + (4/5)^2 + (4/5)^3 + (4/5)^4 ) }
     = p0 { 1 + (8/5) · (1 - (4/5)^5)/(1 - 4/5) } = p0 { 9 - 8 (4/5)^5 } ,

and hence

   p0 = 1/(9 - 8 (4/5)^5) ≈ 0.157,
   p1 = 2 · (4/5) p0 ≈ 0.251,
   p2 = (4/5) p1 ≈ 0.201,
   p3 = (4/5) p2 ≈ 0.161,
   p4 = (4/5) p3 ≈ 0.128,
   p5 = (4/5) p4 ≈ 0.103.

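Both finite systems of this example follow from one small routine; the sketch below writes out the intensity lists from the assumptions (λ = 4, μ = 5 is an assumed concrete pair with λ/μ = 4/5):

```python
from fractions import Fraction

def stationary_finite(birth, death):
    """Stationary distribution of a finite birth and death chain from
    mu_{k+1} p_{k+1} = lambda_k p_k; birth[k], death[k] refer to state k."""
    p = [Fraction(1)]
    for k in range(len(birth) - 1):
        p.append(p[-1] * Fraction(birth[k], death[k + 1]))
    total = sum(p)
    return [x / total for x in p]

lam, mu = 4, 5
# One assistant, at most 4 customers in the system (states E0, ..., E4):
pY = stationary_finite([lam] * 4 + [0], [0] + [mu] * 4)
EY = float(sum(k * q for k, q in enumerate(pY)))
# Doubled arrivals, a second assistant from two customers on (states E0, ..., E5):
pZ = stationary_finite([2 * lam] * 5 + [0], [0, mu] + [2 * mu] * 4)
```

The two distributions reproduce the rounded values found in 3. and 6. above.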

Example 4.21 Given two queueing systems A and B, which are independent of each other. We
assume for each of the systems,
a. there is one shop assistant,
b. it is possible to have a queue,

c. customers arrive at A according to a Poisson process of intensity λA = 3/4 minute^{-1}, and at B
   according to a Poisson process of intensity λB = 1/2 minute^{-1},

d. the service times at both A and B are exponentially distributed of parameter μ = 1 minute^{-1}.
Let the random variable XA denote the number of customers in system A, and let XB denote the
number of customers in system B.
1. Find by means of the stationary probabilities,

P {XA = k} and P {XB = k} , k ∈ N0 .

2. Find the average waiting times at A and B, resp.


3. Compute the probabilities P {XB > k}, k ∈ N0 , and then find

P {XA < XB } .

The arrival rate of customers at A is now increased, such that the customers arrive according to a
Poisson process of intensity 1 minute^{-1}. For that reason the two systems are joined to one queueing
system with two shop assistants, thus the customers now arrive according to a Poisson process of
intensity

   λ = (1 + 1/2) minute^{-1} = 3/2 minute^{-1},

and the service times are still exponentially distributed with the parameter

   μ = 1 minute^{-1}.

Let Y denote the number of customers in this new system.


4. Find by means of the stationary probabilities,

P {Y = k}, k ∈ N0 .

5. Prove that the average number of customers in the new system, E{Y }, is smaller than E {X A + XB }.

1A. We get from ϱA = λA/μ = 3/4 and N = 1 that

   P{XA = k} = pA,k = (1/4) (3/4)^k ,   k ∈ N0 .


1B. Analogously, ϱB = 1/2, so

   P{XB = k} = pB,k = (1/2)^{k+1} ,   k ∈ N0 .

2. Since N = 1, the waiting times are


   VA = ϱA / (μ (1 - ϱA)) = (3/4)/(1/4) = 3    and    VB = ϱB / (μ (1 - ϱB)) = (1/2)/(1/2) = 1.
3. We get
   P{XB > k} = Σ_{j=k+1}^∞ (1/2)^{j+1} = (1/2)^{k+2} / (1 - 1/2) = (1/2)^{k+1} ,   k ∈ N0 ,

so

   P{XA < XB} = Σ_{k=0}^∞ P{XA = k} · P{XB > k} = Σ_{k=0}^∞ (1/4) (3/4)^k · (1/2)^{k+1}
              = (1/8) Σ_{k=0}^∞ (3/8)^k = (1/8) · 1/(1 - 3/8) = (1/8)(8/5) = 1/5.

The new traffic intensity is


   ϱ = λ/(2μ) = (3/2)/(2 · 1) = 3/4,

and since N = 2, we get

   p0 = (1 - ϱ)/(1 + ϱ) = 1/7,    pk = 2 ϱ^k · (1 - ϱ)/(1 + ϱ) = (2/7) (3/4)^k ,   k ∈ N,

thus

   P{Y = 0} = 1/7    and    P{Y = k} = (2/7) (3/4)^k ,   k ∈ N.

Then

   E{Y} = (2/7)(3/4) Σ_{k=1}^∞ k (3/4)^{k-1} = (3/14) · 1/(1 - 3/4)^2 = (3 · 16)/14 = 24/7,

and

   E{XA} = (1/4)(3/4) Σ_{k=1}^∞ k (3/4)^{k-1} = (3/16) · 16 = 3,

and

   E{XB} = (1/4) Σ_{k=1}^∞ k (1/2)^{k-1} = (1/4) · 1/(1 - 1/2)^2 = 1,

hence

   E{XA + XB} = 3 + 1 = 4 > 24/7 = E{Y}.
7
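The comparison in 5. reduces to two closed forms, E{X} = ϱ/(1 - ϱ) for one assistant and E{Y} = 2ϱ/((1 - ϱ)(1 + ϱ)) for two; a minimal sketch:

```python
def mean_one_assistant(rho):
    """E{X} = rho/(1 - rho) for a single-channel queue."""
    return rho / (1 - rho)

def mean_two_assistants(rho):
    """E{Y} for p0 = (1-rho)/(1+rho), p_k = 2 rho^k p0;
    the series sums to 2 rho / ((1 - rho)(1 + rho))."""
    return 2 * rho / ((1 - rho) * (1 + rho))

EX_A = mean_one_assistant(3 / 4)    # system A, rho_A = 3/4, gives 3
EX_B = mean_one_assistant(1 / 2)    # system B, rho_B = 1/2, gives 1
EY = mean_two_assistants(3 / 4)     # joined system, rho = (3/2)/(2*1) = 3/4
```

Pooling the two channels lowers the mean from 4 to 24/7 ≈ 3.43, as proved above.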

Example 4.22 Given two independent queueing systems A and B, where we assume for each of them,
a. there is one shop assistant,
b. it is possible to create a queue,
c. the customers arrive according to a Poisson process of intensity λ = 3/5 min^{-1},

d. the service times are exponentially distributed of parameter μ = 1 min^{-1}.
Let the random variable XA denote the number of customers in system A, and let XB denote the
number of customers in system B, and put Z = XA + XB .
1. Compute by means of the stationary probabilities,
P {XA = k} and P {XB = k} , k ∈ N0 .

2. Find the means E {XA }, E {XB } and E{Z}.


3. Compute P {Z = k}, k ∈ N0 .
The number of arrivals of customers at A is increased, so the customers arrive according to a
Poisson process of intensity 1 minute^{-1}. Therefore, the two systems are joined to one system
with two shop assistants, so the customers now arrive according to a Poisson process of intensity
(1 + 3/5) minute^{-1}, and the service times are still exponentially distributed of parameter
μ = 1 minute^{-1}.
Let Y denote the number of customers in this system.
4. Compute by means of the stationary probabilities,
P {Y = k} and P {Y > k}, k ∈ N0 .

5. Find the mean E{Y }.

1) The traffic intensities are


   ϱA = ϱB = λ/(N μ) = 3/5,

and since N = 1, we get

   P{XA = k} = P{XB = k} = (2/5) (3/5)^k ,   k ∈ N0 .


2) The means are


   E{XA} = E{XB} = (2/5)(3/5) Σ_{k=1}^∞ k (3/5)^{k-1} = (6/25) · 1/(1 - 3/5)^2 = (6/25)(25/4) = 3/2,

thus

   E{Z} = E{XA} + E{XB} = 3.

3) The probabilities are

   P{Z = k} = Σ_{j=0}^k P{XA = j} · P{XB = k - j} = Σ_{j=0}^k (2/5) (3/5)^j · (2/5) (3/5)^{k-j}
            = (k + 1) (4/25) (3/5)^k ,   k ∈ N0 .

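The convolution in 3. can be confirmed term by term with exact rational arithmetic (a quick sketch):

```python
from fractions import Fraction

p, q = Fraction(2, 5), Fraction(3, 5)       # P{X = k} = (2/5)(3/5)^k

def conv(k):
    """P{Z = k} by direct convolution of the two geometric distributions."""
    return sum(p * q**j * p * q**(k - j) for j in range(k + 1))

direct = [conv(k) for k in range(12)]
closed = [(k + 1) * Fraction(4, 25) * q**k for k in range(12)]
```

Both lists agree exactly, since every term of the convolution equals (2/5)^2 (3/5)^k.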

4) The traffic intensity of the new system is


   ϱ = λ/(N μ) = (1 + 3/5)/(2 · 1) = 4/5,

and since N = 2, we get

   p0 = (1 - ϱ)/(1 + ϱ) = 1/9    and    pk = 2 ϱ^k (1 - ϱ)/(1 + ϱ) = (2/9) (4/5)^k ,   k ∈ N.

Thus

   P{Y = 0} = 1/9    and    P{Y = k} = (2/9) (4/5)^k ,   k ∈ N,

and hence

   P{Y > k} = Σ_{j=k+1}^∞ P{Y = j} = (2/9) Σ_{j=k+1}^∞ (4/5)^j = (2/9) · (4/5)^{k+1} / (1 - 4/5) = (8/9) (4/5)^k ,   k ∈ N0 .


5) The mean is
   E{Y} = (2/9)(4/5) Σ_{k=1}^∞ k (4/5)^{k-1} = (8/45) · 1/(1 - 4/5)^2 = 40/9.

Example 4.23 Given a queueing system, for which


a. There are two shop assistants.
b. The customers arrive according to a Poisson process of intensity λ = 3 min^{-1}.
c. The service times are exponentially distributed of parameter μ = 2 min^{-1}.
d. It is possible to queue up.
1. Find the stationary probabilities.
2. Find by means of the stationary probabilities the probability that we have more than two customers
in the shop.
3. Find by means of the stationary probabilities the average length of the queue.
Then change the system, such that it becomes a rejection system, while the other assumptions a.–c.
are unchanged.
4. Find the probability of rejection of this system.

1) We get from

   λ = 3,    μ = 2    and    N = 2,

that the traffic intensity is

   ϱ = λ/(N μ) = 3/(2 · 2) = 3/4.

From N = 2 we find the pk by a known formula,

   p0 = (1 - ϱ)/(1 + ϱ) = 1/7    and    pk = 2 ϱ^k · (1 - ϱ)/(1 + ϱ) = (2/7) (3/4)^k ,   k ∈ N.

In particular,

   p1 = (2/7) · (3/4) = 3/14    and    p2 = (2/7) · (9/16) = 9/56.
2) The probability that there are more than two customers in the shop is

   Σ_{k=3}^∞ pk = 1 - p0 - p1 - p2 = 1 - (8 + 12 + 9)/56 = 1 - 29/56 = 27/56.

   Alternatively,

   Σ_{k=3}^∞ pk = (2/7) Σ_{k=3}^∞ (3/4)^k = (2/7) · (3/4)^3 · 1/(1 - 3/4) = (2 · 3 · 3 · 3 · 4)/(7 · 4 · 4 · 4) = 27/56.


3) The average length of the queue is again given by a known formula,

   Σ_{k=3}^∞ (k - 2) pk = (2/7) (3/4)^3 Σ_{k=3}^∞ (k - 2) (3/4)^{k-3} = (2/7) · (3/4)^3 · 1/(1 - 3/4)^2 = 27/14.

4) The probability of rejection is p2 , because N = 2. It is given by some known formula in any
   textbook,

   p2 = ( (3/2)^2 · 1/2! ) / ( Σ_{j=0}^2 (1/j!) (3/2)^j ) = (9/8) / (1 + 3/2 + 9/8) = (9/8) / ((8 + 12 + 9)/8) = 9/29.

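The rejection probability in 4. is Erlang's loss formula with offered traffic λ/μ = 3/2 and N = 2 channels; a small sketch in exact fractions:

```python
from fractions import Fraction
from math import factorial

def erlang_b(a, n):
    """Erlang's loss formula: the probability that all n channels are busy
    in a rejection system with offered traffic a = lambda/mu."""
    terms = [a**j / factorial(j) for j in range(n + 1)]
    return terms[n] / sum(terms)

reject = erlang_b(Fraction(3, 2), 2)        # lambda = 3, mu = 2
```

The exact result is 9/29, matching the computation above.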
Example 4.24 Given a queueing system, for which


a. There are two shop assistants.

b. The customers arrive according to a Poisson process of intensity λ = 5 quarter^{-1}.

c. The service times are exponentially distributed of parameter μ = 3 quarter^{-1}.
d. It is possible to queue up.
1. Prove that the stationary probabilities are given by

   pk = 1/11 for k = 0,    pk = (2/11) (5/6)^k for k > 0.

2. Find by means of the stationary probabilities the average waiting time.


3. Find by means of the stationary probabilities the average length of the queue.
Then the service is rationalized, such that the average service time is halved. At the same time one
removes one of the shop assistants for other work in the shop.
4. Check if the average waiting time is bigger or smaller in the new system than in the old system.

1) It follows from N = 2, λ = 5 and μ = 3 that the traffic intensity is


   ϱ = λ/(N μ) = 5/(2 · 3) = 5/6.

Since N = 2, we may use a known formula, so

   p0 = (1 - ϱ)/(1 + ϱ) = 1/11    and    pk = 2 ϱ^k p0 = (2/11) (5/6)^k ,

and hence

   pk = 1/11 for k = 0,    pk = (2/11) (5/6)^k for k ∈ N.

2) The average waiting time V is again found by some known formula,

   V = p0 ϱ^N N^{N-1} / (μ · N! (1 - ϱ)^2) = ( (5/6)^2 · 2 · (1/11) ) / ( 3 · 2 · (1/6)^2 ) = (5^2 · 2)/(11 · 3 · 2) = 25/33 quarter.

3) Also the average length of the queue is found by a given formula,

   Σ_{k=3}^∞ (k - 2) pk = (2/11) Σ_{k=3}^∞ (k - 2) (5/6)^k = (2/11) (5/6)^3 Σ_{k=1}^∞ k (5/6)^{k-1}
                        = (2/11) · (5/6)^3 · 1/(1 - 5/6)^2 = (2 · 5^3)/(11 · 6) = 125/33   (= λ V).

4) We have in the new system that N = 1, λ = 5, μ = 6 and ϱ = 5/6.

   Then the average waiting time is, because N = 1, given by a known formula,

   V = ϱ / (μ (1 - ϱ)) = (5/6)/(6 · (1/6)) = 5/6 quarter.

It is seen that the average waiting time is larger in the new system than in the old one.

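The conclusion in 4. can be reproduced directly from the two waiting-time formulas quoted in the solution (a sketch in exact fractions):

```python
from fractions import Fraction

def wait_two_assistants(lam, mu):
    """V = p0 * rho^2 * 2 / (mu * 2! * (1 - rho)^2), rho = lam/(2 mu)."""
    rho = Fraction(lam, 2 * mu)
    p0 = (1 - rho) / (1 + rho)
    return 2 * p0 * rho**2 / (mu * 2 * (1 - rho)**2)

def wait_one_assistant(lam, mu):
    """V = rho / (mu (1 - rho)), rho = lam/mu."""
    rho = Fraction(lam, mu)
    return rho / (mu * (1 - rho))

V_old = wait_two_assistants(5, 3)   # two assistants, mu = 3 quarter^{-1}
V_new = wait_one_assistant(5, 6)    # one assistant, halved service time
```

The exact values are 25/33 and 5/6 of a quarter, so the rationalized system waits longer.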
Example 4.25 Given a queueing system, for which


a. There are two shop assistants.

b. The customers arrive according to a Poisson process of intensity λ = 8 quarter^{-1}.

c. The service times are exponentially distributed of parameter μ = 6 quarter^{-1}.
d. It is possible to queue up.
1. Prove that the stationary probabilities are given by

   pk = 1/5 for k = 0,    pk = (2/5) (2/3)^k for k ∈ N.

2. Find by means of the stationary probabilities the average number of customers in the shop.
3. Find by means of the stationary probabilities the average waiting time.
4. Find by means of the stationary probabilities the probability that both shop assistants are busy.
5. Find the median in the stationary distribution.

1) The traffic intensity is

   ϱ = λ/(N μ) = 8/(2 · 6) = 2/3.

Then by a known formula,

   p0 = (1 - ϱ)/(1 + ϱ) = 1/5 ,    pk = 2 ϱ^k p0 = (2/5) (2/3)^k ,   k ∈ N.

2) By computing the mean it follows that the average number of customers is

   Σ_{k=1}^∞ k pk = (2/5)(2/3) Σ_{k=1}^∞ k (2/3)^{k-1} = (4/15) · 1/(1 - 2/3)^2 = (4/15) · 9 = 12/5.


3) The average waiting time is also found by a standard formula,

   V = p0 ϱ^2 · 2 / (μ · 2! (1 - ϱ)^2) = ( (2/3)^2 · 2 · (1/5) ) / ( 6 · 2 · (1/3)^2 ) = 2/15 quarter (= 2 minutes).

   Supplement. The average length of the queue is also easily found by some known formula,

   Σ_{k=3}^∞ (k - 2) pk = (2/5) Σ_{k=3}^∞ (k - 2) (2/3)^k = (2/5) (2/3)^3 Σ_{ℓ=1}^∞ ℓ (2/3)^{ℓ-1}
                        = (2/5) · (2/3)^3 · 1/(1 - 2/3)^2 = 16/15   (= λ V = 8 · (2/15)).

4) The complementary event: both shop assistants are busy with the probability

   1 - (p0 + p1) = 1 - (1/5 + 4/15) = 1 - 7/15 = 8/15.

   Alternatively, the probability is given by

   Σ_{k=2}^∞ pk = (2/5) Σ_{k=2}^∞ (2/3)^k = (2/5) · (2/3)^2 · 3 = 8/15.

5) The distribution is discrete, and

   Σ_{k=2}^∞ pk = 8/15 > 1/2,

cf. 4. Thus

   p0 = 1/5 ,    p1 = 4/15 ,    p2 = 8/45.

Finally,

   P{X ≥ 2} = Σ_{k=2}^∞ pk = 8/15 > 1/2,

and

   P{X ≤ 2} = p0 + p1 + p2 = 1/5 + 4/15 + 8/45 = (9 + 12 + 8)/45 = 29/45 > 1/2.

Since both probabilities are ≥ 1/2, the median is med(X) = 2.

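The median argument in 5. can be automated for any discrete distribution on N0: m is a median when both P{X ≤ m} ≥ 1/2 and P{X ≥ m} ≥ 1/2 (a sketch):

```python
from fractions import Fraction

rho = Fraction(2, 3)
p0 = (1 - rho) / (1 + rho)                  # 1/5

def p(k):
    """Stationary probabilities p_0 = 1/5, p_k = (2/5)(2/3)^k."""
    return p0 if k == 0 else 2 * rho**k * p0

half = Fraction(1, 2)
cdf_below = Fraction(0)                     # P{X <= m - 1}
m = 0
while not (cdf_below + p(m) >= half and 1 - cdf_below >= half):
    cdf_below += p(m)
    m += 1
```

The loop stops at m = 2, with P{X ≤ 2} = 29/45 and P{X ≥ 2} = 8/15, exactly as above.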

5 Other types of stochastic processes


Example 5.1 An aeroplane has 4 engines (2 on each wing), and it can carry through a flight if just
1 motor from each wing is working. At start (t = 0) all 4 engines are intact, but they may break down
during the flight. We assume (as a crude approximation) that the operating times of the 4 engines are
mutually independent and exponentially distributed of mean 1/λ (which hopefully is much larger than
the flight time). The system can be described as a Markov process of 4 states:

E4 : all 4 engines are working,

E3 : 3 engines are working,

E2 : 1 engine in each wing is working,

E1 : the aeroplane has crashed.

1. Derive the system of differential equations of the probabilities

Pi (t) = P {the process is in state Ei at time t} , i = 1, 2, 3, 4.

(Notice that this is not a birth and death process, because the probability of transition from E 3 to
E1 in a small time interval of length h is almost proportional to h.)
2. Find Pi (t), i = 1, 2, 3, 4.

1) It follows from the diagram

   E4 -(4λ)-> E3 -(2λ)-> E2 -(2λ)-> E1 ,    E3 -(λ)-> E1 ,

that we have the conditions

   P4(t + h) = (1 - 4λh) P4(t) + h ε(h),
   P3(t + h) = (1 - 3λh) P3(t) + 4λh P4(t) + h ε(h),
   P2(t + h) = (1 - 2λh) P2(t) + 2λh P3(t) + h ε(h),
   P1(t + h) = P1(t) + 2λh P2(t) + λh P3(t) + h ε(h),

hence by a rearrangement and taking the limit h → 0 we get the system of differential equations,

   P4'(t) = -4λ P4(t),                 P4(0) = 1,
   P3'(t) = -3λ P3(t) + 4λ P4(t),      P3(0) = 0,
   P2'(t) = -2λ P2(t) + 2λ P3(t),      P2(0) = 0,
   P1'(t) = 2λ P2(t) + λ P3(t),        P1(0) = 0.


Figure 3: The graphs of P1(t), . . . , P4(t) for λ = 1.

2) It follows immediately that

   P4(t) = e^{-4λt}.

By insertion into the next differential equation we get

   P3'(t) + 3λ P3(t) = 4λ e^{-4λt},

hence

   P3(t) = e^{-3λt} ∫_0^t e^{3λτ} · 4λ e^{-4λτ} dτ = e^{-3λt} ∫_0^t 4λ e^{-λτ} dτ = e^{-3λt} (4 - 4 e^{-λt}) = 4 e^{-3λt} - 4 e^{-4λt}.

Then by insertion into the next equation and a rearrangement,

   P2'(t) + 2λ P2(t) = 8λ e^{-3λt} - 8λ e^{-4λt},

the solution of which is

   P2(t) = e^{-2λt} ∫_0^t e^{2λτ} (8λ e^{-3λτ} - 8λ e^{-4λτ}) dτ = e^{-2λt} ∫_0^t (8λ e^{-λτ} - 8λ e^{-2λτ}) dτ
         = e^{-2λt} (4 - 8 e^{-λt} + 4 e^{-2λt}) = 4 e^{-2λt} - 8 e^{-3λt} + 4 e^{-4λt}.

Finally, P1(t) is found from the condition

   Σ_{k=1}^4 Pk(t) = 1,    thus    P1(t) = 1 - P2(t) - P3(t) - P4(t),

and we get, summing up,

   P4(t) = e^{-4λt},
   P3(t) = 4 e^{-3λt} - 4 e^{-4λt},
   P2(t) = 4 e^{-2λt} - 8 e^{-3λt} + 4 e^{-4λt},
   P1(t) = 1 - 4 e^{-2λt} + 4 e^{-3λt} - e^{-4λt}.

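The closed forms can be sanity-checked numerically: the four probabilities must sum to 1 for every t, and each must satisfy its differential equation. A sketch with λ = 1 and a central difference for the derivative:

```python
from math import exp, isclose

lam = 1.0

def P4(t): return exp(-4 * lam * t)
def P3(t): return 4 * exp(-3 * lam * t) - 4 * exp(-4 * lam * t)
def P2(t): return 4 * exp(-2 * lam * t) - 8 * exp(-3 * lam * t) + 4 * exp(-4 * lam * t)
def P1(t): return 1 - 4 * exp(-2 * lam * t) + 4 * exp(-3 * lam * t) - exp(-4 * lam * t)

def ddt(f, t, h=1e-6):
    """Central-difference approximation of f'(t)."""
    return (f(t + h) - f(t - h)) / (2 * h)

t = 0.7
total = P1(t) + P2(t) + P3(t) + P4(t)                      # should be 1
res3 = ddt(P3, t) - (-3 * lam * P3(t) + 4 * lam * P4(t))   # residual, ~0
res2 = ddt(P2, t) - (-2 * lam * P2(t) + 2 * lam * P3(t))   # residual, ~0
```

Both residuals vanish up to the O(h^2) error of the difference quotient.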

Example 5.2 Let Y and Z be independent N (0, 1) distributed random variables, and let the process
{X(t), t ∈ R} be defined by

X(t) = Y cos t + Z sin t.

Find the mean value function m(t) and the autocorrelation R(s, t).

The mean value function is

   m(t) = E{X(t)} = E{Y cos t} + E{Z sin t} = cos t · E{Y} + sin t · E{Z} = 0.

The autocorrelation is

   R(s, t) = E{X(s) X(t)} = E{(Y cos s + Z sin s)(Y cos t + Z sin t)}
           = cos s · cos t · E{Y^2} + sin s · sin t · E{Z^2} + (cos s · sin t + sin s · cos t) E{Y Z}.

Here E{Y Z} = E{Y} E{Z} = 0 by independence, and

   E{Y^2} = E{Z^2} = V{Y} + (E{Y})^2 = 1,

so

   R(s, t) = cos s · cos t + sin s · sin t = cos(s - t).

Example 5.3 Let {X(t), t ≥ 0} denote a Poisson process of intensity a, and let {Y (t), t ≥ 0} be
given by

Y (t) = X(t + 1) − X(t).

Compute the mean value function and the autocovariance of {Y (t), t ≥ 0}.

We have

   P{X(t) = n} = ((at)^n / n!) e^{-at} ,   n ∈ N0 .

The mean value function is obtained by first noticing that

   P{Y(t) = n} = P{X(t + 1) - X(t) = n} = P{X(1) = n} = (a^n / n!) e^{-a} ,

thus Y(t) is distributed as X(1) (the Poisson process is “forgetful”), and

   m(t) = E{Y(t)} = Σ_{n=1}^∞ n (a^n / n!) e^{-a} = a.

If s ≤ t, then

   Cov(Y(s), Y(t)) = Cov(X(s + 1) - X(s), X(t + 1) - X(t)) = a (s + 1 - min{s + 1, t} - s + s)
                   = a (s + 1 - min{s + 1, t}).

If therefore s + 1 ≤ t, then

   Cov(Y(s), Y(t)) = 0,

and if s + 1 > t, then

   Cov(Y(s), Y(t)) = a (s + 1 - t).

Summing up,

   Cov(Y(s), Y(t)) = a (1 - |s - t|)   for |s - t| < 1,    Cov(Y(s), Y(t)) = 0   for |s - t| ≥ 1.

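Since Cov(X(u), X(v)) = a min{u, v} for a Poisson process, the result above can be checked by pure bilinearity, without any simulation (a sketch; a = 2 is an arbitrary choice):

```python
def cov_X(u, v, a=2.0):
    """Cov(X(u), X(v)) = a * min(u, v) for a Poisson process of intensity a."""
    return a * min(u, v)

def cov_Y(s, t, a=2.0):
    """Cov(Y(s), Y(t)) for Y(t) = X(t+1) - X(t), expanded by bilinearity."""
    return (cov_X(s + 1, t + 1, a) - cov_X(s + 1, t, a)
            - cov_X(s, t + 1, a) + cov_X(s, t, a))

def closed_form(s, t, a=2.0):
    return a * max(0.0, 1 - abs(s - t))

pairs = [(0.2, 0.5), (1.0, 2.5), (3.0, 3.0), (4.0, 3.6), (0.0, 1.0)]
max_err = max(abs(cov_Y(s, t) - closed_form(s, t)) for s, t in pairs)
```

The pairs cover all three regimes: overlapping intervals, disjoint intervals, and s = t.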

Example 5.4 Let X1 and X2 be independent random variables, both normally distributed of mean 0
and variance σ 2 . We define a stochastic process {X(t), t ∈ R} by

X(t) = X1 sin t + X2 cos t.

1) Find the mean value function m(t) and the autocorrelation R(s, t).

2) Prove that the process {X(t), t ∈ R} is weakly stationary.

3) Find the values of s − t, for which the random variables X(s) and X(t) are non-correlated.
4) Given the random variables X(s) and X(t), where s − t is fixed as above. Are X(s) and X(t)
independent?

1) The mean value function is

   m(t) = E{X(t)} = sin t · E{X1} + cos t · E{X2} = 0.

   The autocorrelation is

   R(s, t) = E{X(s) X(t)} = E{(X1 sin s + X2 cos s)(X1 sin t + X2 cos t)}
           = sin s · sin t · E{X1^2} + cos s · cos t · E{X2^2} + (sin s · cos t + cos s · sin t) E{X1 X2}
           = sin s · sin t ( V{X1} + (E{X1})^2 ) + cos s · cos t ( V{X2} + (E{X2})^2 ) + 0
           = (cos s · cos t + sin s · sin t) σ^2 = cos(s - t) · σ^2 .

2) A stochastic process is weakly stationary, if m(t) = m is constant, and C(s, t) = C(s − t). In the
specific case,

m(t) = 0 = m,

and

   C(s, t) = Cov{X(s), X(t)} = E{X(s) X(t)} - E{X(s)} · E{X(t)} = R(s, t) - m(s) m(t) = σ^2 cos(s - t),

and we have proved that the process is weakly stationary.


3) It follows from

   Cov{X(s), X(t)} = C(s, t) = σ^2 cos(s - t),

that X(s) and X(t) are non-correlated, if

   s = t + π/2 + pπ,   p ∈ Z,

i.e. if

   s - t = π/2 + pπ,   p ∈ Z.


4) Since (X(s), X(t)) with s - t = π/2 + pπ, p ∈ Z, follows a two-dimensional normal distribution, and
   X(s) and X(t) are non-correlated, we conclude that they are independent.

Example 5.5 Let {X(t), t ∈ R} be a stationary process of mean 0, autocorrelation R(τ ) and effect
spectrum S(ω).
Let {Y (t), t ∈ R} be defined by

Y (t) = X(t + a) − X(t − a), where a > 0.

Express the autocorrelation and the effect spectrum of {Y (t)} by the corresponding expressions of
{X(t)} (and a).

The assumptions are

   m(t) = 0,    R(τ) = E{X(t + τ) X(t)}    and    S(ω) = ∫_{-∞}^{∞} e^{iωτ} R(τ) dτ.


Hence for Y(t) = X(t + a) - X(t - a), a > 0,

   RY(τ) = E{Y(t + τ) Y(t)} = E{[X(t + τ + a) - X(t + τ - a)] · [X(t + a) - X(t - a)]}
         = E{X(t + τ + a) X(t + a)} - E{X(t + τ + a) X(t - a)}
           - E{X(t + τ - a) X(t + a)} + E{X(t + τ - a) X(t - a)}
         = RX(τ) - RX(τ + 2a) - RX(τ - 2a) + RX(τ)
         = 2 RX(τ) - RX(τ + 2a) - RX(τ - 2a),

so

   SY(ω) = ∫_{-∞}^{∞} e^{iωτ} RY(τ) dτ
         = 2 ∫_{-∞}^{∞} e^{iωτ} RX(τ) dτ - ∫_{-∞}^{∞} e^{iωτ} RX(τ + 2a) dτ - ∫_{-∞}^{∞} e^{iωτ} RX(τ - 2a) dτ
         = 2 SX(ω) - e^{-2iaω} SX(ω) - e^{2iaω} SX(ω) = 2 {1 - cos 2aω} SX(ω)
         = 4 sin^2(aω) SX(ω).

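The relation SY(ω) = 4 sin^2(aω) SX(ω) can be checked numerically for a concrete autocorrelation. Below RX(τ) = e^{-τ^2} is an assumed example (any even, integrable RX would do), and the spectra are approximated by a midpoint Riemann sum:

```python
from math import exp, cos, sin

a = 0.7                                     # assumed shift parameter

def R_X(tau):
    return exp(-tau * tau)                  # assumed autocorrelation

def R_Y(tau):
    return 2 * R_X(tau) - R_X(tau + 2 * a) - R_X(tau - 2 * a)

def spectrum(R, omega, L=12.0, n=40000):
    """Midpoint-rule approximation of S(w) = int e^{i w tau} R(tau) d tau;
    for an even R the integral is real, so only the cosine part remains."""
    h = 2 * L / n
    return h * sum(cos(omega * (-L + (k + 0.5) * h)) * R(-L + (k + 0.5) * h)
                   for k in range(n))

omega = 1.3
lhs = spectrum(R_Y, omega)
rhs = 4 * sin(a * omega) ** 2 * spectrum(R_X, omega)
```

Since the Gaussian decays so fast, the truncated midpoint rule is accurate to essentially machine precision here.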

Example 5.6 Let {X(t), t ∈ R} be a stationary process of mean 0 and effect spectrum S(ω), and let

   Y = (1/n) Σ_{k=1}^n X(kT),    where T > 0.

Prove that

   E{Y^2} = 1/(2πn^2) ∫_{-∞}^{∞} S(ω) · sin^2(nωT/2) / sin^2(ωT/2) dω.

Hint:

   sin^2(nωT/2) / sin^2(ωT/2) = Σ_{m=-(n-1)}^{n-1} (n - |m|) e^{-iωmT} .

First compute

   E{Y^2} = (1/n^2) E{ Σ_{k=1}^n Σ_{m=1}^n X(kT) X(mT) }
          = (1/n^2) { Σ_{k=1}^n E{X(kT) X(kT)} + 2 Σ_{k=1}^{n-1} Σ_{m=k+1}^n E{X(kT) X(mT)} }
          = (n/n^2) R(0) + (2/n^2) Σ_{k=1}^{n-1} Σ_{m=1}^{n-k} E{X(kT) X((k + m)T)}
          = (n/n^2) R(0) + (2/n^2) Σ_{k=1}^{n-1} Σ_{m=1}^{n-k} R(mT) = (n/n^2) R(0) + (2/n^2) Σ_{m=1}^{n-1} Σ_{k=1}^{n-m} R(mT)
          = (n/n^2) R(0) + (2/n^2) Σ_{m=1}^{n-1} (n - m) R(mT) = (1/n^2) Σ_{m=-(n-1)}^{n-1} (n - |m|) R(|m|T).

Using

   R(-mT) = E{X(kT) X((k - m)T)} = E{X(kT) X((k + m)T)} = R(mT),

and the hint and the inversion formula we get

   E{Y^2} = (1/n^2) Σ_{m=-(n-1)}^{n-1} (n - |m|) R(mT) = (1/n^2) Σ_{m=-(n-1)}^{n-1} (n - |m|) · (1/2π) ∫_{-∞}^{∞} e^{-imωT} S(ω) dω
          = 1/(2πn^2) ∫_{-∞}^{∞} S(ω) Σ_{m=-(n-1)}^{n-1} (n - |m|) e^{-iωmT} dω
          = 1/(2πn^2) ∫_{-∞}^{∞} S(ω) · sin^2(nωT/2) / sin^2(ωT/2) dω,

and the formula is proved.

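The hint is the Fejér-kernel identity |Σ_{k=0}^{n-1} e^{-iωkT}|^2 = Σ_m (n - |m|) e^{-iωmT}, which can be confirmed numerically for a few n and ω (a sketch; the chosen ω avoid the zeros of sin(ωT/2)):

```python
from cmath import exp as cexp
from math import sin

def lhs(omega, n, T):
    return (sin(n * omega * T / 2) / sin(omega * T / 2)) ** 2

def rhs(omega, n, T):
    s = sum((n - abs(m)) * cexp(-1j * omega * m * T)
            for m in range(-(n - 1), n))
    return s.real                           # imaginary parts cancel pairwise

max_err = max(abs(lhs(w, n, 1.0) - rhs(w, n, 1.0))
              for n in (2, 3, 5, 8) for w in (0.3, 0.9, 2.1))
```

Both sides agree to rounding error for every tested pair (n, ω).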

Example 5.7 Let {W(t), t ≥ 0} be a Wiener process.

1) Find the autocorrelation R(s, t) and the autocovariance C(s, t), s, t ∈ R + .

2) Let 0 < s < t. Find the simultaneous frequency of the two-dimensional random variable {W (s), W (t)}.

The Wiener process is a normal process {W (t), t ≥ 0} with

W (0) = 0, m(t) = 0, V {W (t)} = α t (α > 0),

and of independent increments. It follows from m(t) = 0 that

C(s, t) = Cov{W (s), W (t)} = R(s, t) − m(s)m(t) = R(s, t).

1) If 0 < s < t, then

R(s, t) = C(s, t) = Cov{W (s), W (t)} = Cov{W (s), W (s) + [W (t) − W (s)]}
= Cov{W (s), W (s)} + Cov{W (s), W (t) − W (s)}
= V {W (s)} + 0 (independent increments)
= α · s.

Analogously, R(s, t) = C(s, t) = α · t, if 0 < t < s, thus



   R(s, t) = C(s, t) = α · min{s, t} = αs if 0 < s < t,    αt if 0 < t < s.

2) If 0 < s < t, then (W(s), W(t) - W(s)) has the simultaneous frequency

   f(x, y) = 1/√(2παs) exp( - x^2/(2αs) ) · 1/√(2πα(t - s)) exp( - y^2/(2α(t - s)) )

for (x, y) ∈ R^2 . Finally, it follows that

   (W(s), W(t)) = (W(s), {W(t) - W(s)} + W(s))

has the frequency

   g(x, y) = f(x, y - x) = 1/(2πα √(s(t - s))) exp( - (1/2) { x^2/(αs) + (y - x)^2/(α(t - s)) } ),   (x, y) ∈ R^2 .


Index
absorbing state, 13, 25
Arcus sinus law, 10
closed subset of states, 13
convergence in probability, 28
cycle, 22
discrete Arcus sinus distribution, 10
distribution function of a stochastic process, 4
double stochastic matrix, 22, 39
drunkard’s walk, 5
Ehrenfest’s model, 32
geometric distribution, 124, 133
initial distribution, 11
invariant probability vector, 11, 22, 23, 25, 26, 28, 30, 32, 36, 39
irreducible Markov chain, 12, 18–23, 32, 36, 39, 41, 43, 45, 47, 50, 53, 62, 65, 67, 70, 73, 75, 78, 80, 86, 88, 91, 93, 98, 103, 106, 108, 114, 116, 122, 125, 128, 131
irreducible stochastic matrix, 83, 120
limit matrix, 13
Markov chain, 10, 18
Markov chain of countably many states, 101
Markov process, 5
outcome, 5
periodic Markov chain, 14
probability of state, 11
probability vector, 11
random walk, 5, 14, 15
random walk of absorbing barriers, 14
random walk of reflecting barriers, 14
regular Markov chain, 12, 18–23, 36, 39, 43, 47, 50, 53, 56, 62, 65, 67, 70, 73, 75, 78, 80, 83, 86, 88, 91, 100, 101, 103, 106, 108, 114, 116, 122, 125, 128, 131
regular stochastic matrix, 26, 30, 120
ruin problem, 7
sample function, 4
state of a process, 4
stationary distribution, 11, 43, 50
stationary Markov chain, 10
stochastic limit matrix, 13
stochastic matrix, 10
stochastic process, 4
symmetric random walk, 5, 9
transition probability, 10, 11
vector of state, 11
