Stochastic Processes 2
Probability Examples c-9
Leif Mejlbro
© 2009 Leif Mejlbro & Ventus Publishing ApS
ISBN 978-87-7681-525-7
Contents
Introduction
1 Theoretical background
1.1 The Poisson process
1.2 Birth and death processes
1.3 Queueing theory in general
1.4 Queueing system of infinitely many shop assistants
1.5 Queueing system of a finite number of shop assistants, and with forming of queues
1.6 Queueing systems with a finite number of shop assistants and without queues
1.7 Some general types of stochastic processes
2 The Poisson process
3 Birth and death processes
4 Queueing theory
5 Other types of stochastic processes
Index
Introduction
This is the ninth book of examples from Probability Theory. The topic Stochastic Processes is so big that I have chosen to split it into two books. The previous (eighth) book treated examples of Random Walks and Markov chains, where the latter topic is dealt with in a fairly large chapter. In this book we give examples of Poisson processes, birth and death processes, queueing theory and other types of stochastic processes.
The prerequisites for the topics can e.g. be found in the Ventus: Calculus 2 series and the Ventus:
Complex Function Theory series, and all the previous Ventus: Probability c1-c7.
Unfortunately errors cannot be avoided in a first edition of a work of this type. However, the author has tried to keep them to a minimum, hoping that the reader will meet with sympathy the errors which do occur in the text.
Leif Mejlbro
27th October 2009
1 Theoretical background
1.1 The Poisson process
Given a sequence of independent events, occurring at random times. We assume:

1. The probability that an event occurs in a time interval I ⊆ [0, +∞[ depends only on the length of the interval and not on where the interval lies on the time axis.

2. The probability that there is at least one event in a time interval of length t is λt + t ε(t).

3. The probability that there is more than one event in a time interval of length t is t ε(t).

It follows that

4. The probability that there is no event in a time interval of length t is 1 − λt + t ε(t).

5. The probability that there is precisely one event in a time interval of length t is λt + t ε(t).

Here ε(t) denotes some unspecified function, which tends towards 0 as t → 0.
Given the assumptions above, we let X(t) denote the number of events in the interval ]0, t], and we put P_k(t) = P{X(t) = k}. Then

P_k(t) = ((λt)^k / k!) e^{−λt}, for k ∈ N_0,

proving that X(t) is a Poisson distributed random variable of parameter λt. The process {X(t) | t ∈ [0, +∞[} is called a Poisson process, and the parameter λ is called the intensity of the Poisson process.
Remark 1.1 Even if Poisson processes are very common, they are mostly applied in the theory of teletraffic. ♦
If X(t) is a Poisson process as described above, then X(s + t) − X(s) has the same distribution as
X(t), thus
P{X(s + t) − X(s) = k} = ((λt)^k / k!) e^{−λt}, for k ∈ N_0.
If 0 ≤ t1 < t2 ≤ t3 < t4 , then the two random variables X (t4 ) − X (t3 ) and X (t2 ) − X (t1 ) are
independent. We say that the Poisson process has independent and stationary increments.
The sample function of a Poisson process is a step function with values in N_0, each step of size +1. We introduce the sequence of random variables T_1, T_2, . . . , which indicate the time between two succeeding events of the Poisson process. Thus

Y_n = T_1 + T_2 + · · · + T_n

indicates the time of the n-th event. The random variables T_1, T_2, . . . , T_n are mutually independent and exponentially distributed of parameter λ, hence Y_n is Gamma distributed, Y_n ∈ Γ(n, 1/λ).
Connection with Erlang's B-formula. Since Y_{n+1} > t, if and only if X(t) ≤ n, we have

P{Y_{n+1} > t} = P{X(t) ≤ n} = Σ_{k=0}^{n} ((λt)^k / k!) e^{−λt}.
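The equivalence of the two events can be checked numerically. The following sketch (the parameter values are arbitrary) compares the Gamma tail probability P{Y_{n+1} > t}, obtained by direct numerical integration of the Γ(n+1, 1/λ) density, with the Poisson partial sum P{X(t) ≤ n}:

```python
import math

def poisson_cdf(n, x):
    """P{X(t) <= n} for a Poisson distributed count, x = lam * t."""
    return math.exp(-x) * sum(x ** k / math.factorial(k) for k in range(n + 1))

def gamma_tail_numeric(m, lam, t, steps=200_000, upper=40.0):
    """P{Y_m > t} for Y_m in Gamma(m, 1/lam): midpoint-rule integration of
    the density lam^m s^(m-1) e^(-lam s)/(m-1)! over ]t, upper]."""
    h = (upper - t) / steps
    total = 0.0
    for i in range(steps):
        s = t + (i + 0.5) * h
        total += lam ** m * s ** (m - 1) * math.exp(-lam * s) / math.factorial(m - 1)
    return total * h

lam, t, n = 1.7, 2.5, 4
print(gamma_tail_numeric(n + 1, lam, t))   # P{Y_{n+1} > t}
print(poisson_cdf(n, lam * t))             # P{X(t) <= n}, the same number
```

The two printed values agree to the accuracy of the numerical integration.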
1.2 Birth and death processes

We assume that there are non-negative constants λk and μk , such that for k ∈ N,
1) P {one positive signal in ] t, t + h [| X(t) = k} = λk h + h ε(h).
2) P {one negative signal in ] t, t + h [| X(t) = k} = μk h + h ε(h).
A simple analysis shows for k ∈ N and h > 0 that the event {X(t + h) = k} can be realized in one of the following ways:
• X(t) = k, and no signal in ] t, t + h [.
• X(t) = k − 1, and one positive signal in ] t, t + h [.
• X(t) = k + 1, and one negative signal in ] t, t + h [.
• More signals in ] t, t + h [.
We put P_k(t) = P{X(t) = k}. By a rearrangement and taking the limit h → 0 we easily derive the differential equations of the process,

P_0′(t) = −λ_0 P_0(t) + μ_1 P_1(t), for k = 0,
P_k′(t) = −(λ_k + μ_k) P_k(t) + λ_{k−1} P_{k−1}(t) + μ_{k+1} P_{k+1}(t), for k ∈ N.
In the special case of a pure birth process, where all μ_k = 0, this system is reduced to

P_0′(t) = −λ_0 P_0(t), for k = 0,
P_k′(t) = −λ_k P_k(t) + λ_{k−1} P_{k−1}(t), for k ∈ N.
If all λ_k > 0, we get the following iteration formula of the complete solution,

P_0(t) = c_0 e^{−λ_0 t}, for k = 0,
P_k(t) = λ_{k−1} e^{−λ_k t} ∫_0^t e^{λ_k τ} P_{k−1}(τ) dτ + c_k e^{−λ_k t}, for k ∈ N.
From P_0(t) we derive P_1(t), etc. Finally, if we know the initial distribution, e.g. that the process at time t = 0 is in state E_m, then we can find the values of the arbitrary constants c_k.
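The iteration formula can be carried out numerically. The sketch below (a hypothetical setup, not from the text) takes the pure birth process with constant intensities λ_k = λ, i.e. a Poisson process started in E_0 (so c_0 = 1 and c_k = 0 for k ≥ 1), and recovers the Poisson probabilities (λt)^k/k! · e^{−λt}:

```python
import math

lam, t_max, h, K = 1.0, 4.0, 0.0005, 6
steps = int(t_max / h)
ts = [i * h for i in range(steps + 1)]

# P_0(t) = c_0 e^{-lam t} with c_0 = 1 (start in E_0), all other c_k = 0
P = [[math.exp(-lam * s) for s in ts]]
for k in range(1, K + 1):
    # P_k(t) = lam e^{-lam t} * int_0^t e^{lam tau} P_{k-1}(tau) dtau
    integral, Pk = 0.0, [0.0]
    for i in range(1, steps + 1):
        f_prev = math.exp(lam * ts[i - 1]) * P[k - 1][i - 1]
        f_curr = math.exp(lam * ts[i]) * P[k - 1][i]
        integral += 0.5 * (f_prev + f_curr) * h      # trapezoidal rule
        Pk.append(lam * math.exp(-lam * ts[i]) * integral)
    P.append(Pk)

# compare with the Poisson probabilities at t = t_max
for k in range(K + 1):
    theo = (lam * t_max) ** k / math.factorial(k) * math.exp(-lam * t_max)
    print(k, round(P[k][-1], 5), round(theo, 5))
```

Each printed pair agrees, illustrating that the recursion reproduces the closed-form solution.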
Let {X(t) | t ∈ [0, +∞[} be a birth and death process, where all λk and μk are positive, with the
exception of μ0 = 0, and λN = 0, if there is a final state EN . The process can be in any of the states,
therefore, in analogy with the Markov chains, such a birth and death process is called irreducible.
Processes like this often occur in queueing theory.
If there exists a state E_k, in which λ_k = μ_k = 0, then E_k is an absorbing state, because it is not possible to move away from E_k.
For the most common birth and death processes (including all irreducible processes) there exist non-negative constants p_k, such that

P_k(t) → p_k for t → +∞, k ∈ N_0.

If Σ_{k=0}^{∞} p_k = 1, we say that the solution (p_k) is a stationary distribution, and the p_k are called the stationary probabilities. In this case we have
p_k = (λ_{k−1} λ_{k−2} · · · λ_1 λ_0) / (μ_k μ_{k−1} · · · μ_2 μ_1) · p_0 := a_k p_0, for k ∈ N_0,
The condition for the existence of a stationary distribution then reduces to the series Σ_k a_k being convergent with finite sum a > 0. In this case we have p_0 = 1/a.
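Numerically, the stationary probabilities follow directly from the products a_k. The sketch below (with illustrative intensity functions) truncates the series Σ_k a_k at a large index:

```python
def stationary(lams, mus, kmax):
    """Stationary probabilities p_k = a_k * p_0 with
    a_k = (lam_{k-1} ... lam_1 lam_0) / (mu_k mu_{k-1} ... mu_1),
    the series truncated at kmax (valid when sum a_k converges)."""
    a = [1.0]
    for k in range(1, kmax + 1):
        a.append(a[-1] * lams(k - 1) / mus(k))
    total = sum(a)          # approximates a = sum_k a_k, so p_0 = 1/a
    return [x / total for x in a]

# example with lam_k = 1, mu_k = 2: a_k = (1/2)^k, a = 2, hence p_0 = 1/2
p = stationary(lambda k: 1.0, lambda k: 2.0, 200)
print(p[0])   # ~0.5
print(p[1])   # ~0.25
```

The truncation error is negligible here because a_k decays geometrically.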
1.3 Queueing theory in general

1) By the arrival distribution (the arrival process) we shall understand the distribution of the arrivals of the customers at the service (the shop). This distribution is often of Poisson type.

2) If the arrivals follow a Poisson process of intensity λ, then the random variable which indicates the time difference between two succeeding arrivals is exponentially distributed of parameter λ. We say that the arrivals follow an exponential distribution, and λ is called the arrival intensity.
3) The queueing system is described by the number of shop assistants or serving places, if there is
the possibility of forming queues or not, and the way a queue is handled. The serving places are
also called channels.
4) Concerning the service times we assume that if a service starts at time t, then the probability that
it is ended at some time in the interval ]t, t + h[ is equal to
k μ h + h ε(h).
We shall in the following sections consider the three most common types of queueing systems. Concerning other types, cf. e.g. Villy Bæk Iversen: Teletraffic Engineering and Network Planning, Technical University of Denmark.
1.4 Queueing system of infinitely many shop assistants

The process is irreducible, and the differential equations of the system are given by
P_0′(t) = −λ P_0(t) + μ P_1(t), for k = 0,
P_k′(t) = −(λ + k μ) P_k(t) + λ P_{k−1}(t) + (k + 1)μ P_{k+1}(t), for k ∈ N.
The stationary probabilities satisfy the balance equations

(k + 1)μ p_{k+1} = λ p_k, k ∈ N_0,
with the solution

p_k = (1/k!) (λ/μ)^k exp(−λ/μ), k ∈ N_0.

These are the probabilities that there are k customers in the system, when we have obtained equilibrium.
The system of differential equations above is usually difficult to solve. One has, however, some partial results, e.g. for the expected number of customers at time t,

m(t) := Σ_{k=1}^{+∞} k P_k(t),

which satisfies the differential equation

m′(t) + μ m(t) = λ.
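With the initial condition m(0) = 0 (an assumption: the system starts empty), this linear differential equation has the solution m(t) = (λ/μ)(1 − e^{−μt}), which can be confirmed by a simple Euler integration:

```python
import math

lam, mu, T = 3.0, 1.5, 5.0
m, h = 0.0, 1e-4              # m(0) = 0: assume the system starts empty
for _ in range(int(T / h)):
    m += h * (lam - mu * m)   # Euler step for m'(t) = lam - mu * m(t)

exact = (lam / mu) * (1.0 - math.exp(-mu * T))   # closed-form solution
print(m, exact)               # both approach lam/mu = 2 as T grows
```

In equilibrium m(t) → λ/μ, consistent with the Poisson stationary distribution above.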
1.5 Queueing system of a finite number of shop assistants, and with forming of queues
We consider the case where
Spelled out, we have N shop assistants, and a customer arrives in state E_k. If k < N, then the customer goes to a free shop assistant and is immediately served. If however k = N, i.e. all shop assistants are busy, then he joins a queue and waits until a shop assistant becomes free. We assume here that the customers respect the order of the queue.
With a slight change of the notation it follows that if there are N shop assistants and k customers
(and not k states as above), where k > N , then there is a common queue for all shop assistants
consisting of k − N customers.
This process is described by the following birth and death process {X(t) | t ∈ [0, +∞[} of the parameters

λ_k = λ, and μ_k = k μ for k < N, μ_k = N μ for k ≥ N.
Remark 1.2 Together with the traffic intensity one also introduces in teletraffic the offered traffic. By this we mean the average number of customers who arrive at the system in a time interval of length equal to the mean service time. In the situation above the offered traffic is λ/μ. Both the traffic intensity and the offered traffic are dimensionless. They are both measured in the unit Erlang. ♦
The condition that (pk ) become stationare probabilities is that the traffic intensity < 1, where
+∞
NN k ( N )N
= .
N! (1 − ) · N !
k=N
If, however, ≥ 1, it is easily seen that the queue is increasing towards infinity, and there does not
exist a stationary distribution.
1) If N = 1, then

p_k = ρ^k (1 − ρ), for k ∈ N_0.

2) If N = 2, then

p_0 = (1 − ρ)/(1 + ρ), for k = 0,
p_k = 2 ρ^k · (1 − ρ)/(1 + ρ), for k ∈ N.
3) If N > 2, the formulæ become somewhat complicated, so they are not given here.
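For general N the stationary probabilities are easily computed numerically from p_k = a_k p_0. The sketch below (illustrative parameters) checks the N = 2 formulas above:

```python
from math import factorial

def mmn_stationary(lam, mu, N, kmax=2000):
    """Stationary probabilities of the queue with N servers:
    a_k = (lam/mu)^k / k!                    for k <= N,
    a_k = a_N * rho^(k-N), rho = lam/(N*mu), for k > N (needs rho < 1)."""
    a = [(lam / mu) ** k / factorial(k) for k in range(N + 1)]
    rho = lam / (N * mu)
    for _ in range(N + 1, kmax + 1):
        a.append(a[-1] * rho)
    s = sum(a)
    return [x / s for x in a]

lam, mu = 1.0, 0.75            # N = 2 gives rho = lam/(2*mu) = 2/3 < 1
p = mmn_stationary(lam, mu, 2)
rho = lam / (2 * mu)
print(p[0], (1 - rho) / (1 + rho))                 # p_0 = (1-rho)/(1+rho)
print(p[3], 2 * rho ** 3 * (1 - rho) / (1 + rho))  # p_k = 2 rho^k (1-rho)/(1+rho)
```

The truncation at kmax is harmless because a_k decays like ρ^k.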
The average number of customers in the system is, under the given assumptions,

ρ/(1 − ρ), for N = 1,
Σ_{k=1}^{+∞} k p_k, in general (of course).
The waiting time of a customer is defined as the time elapsed from his arrival until his service starts. The staying time is the time from his arrival until he leaves the system after being served. Hence we have the splitting

staying time = waiting time + service time.
The mean waiting time V is found by a computation to be

V = ρ / (μ(1 − ρ)), for N = 1,
V = (ρ^N · N^{N−1}) / (μ · N! · (1 − ρ)²) · p_0, in general.
The average length of the queue (i.e. the mean number of customers in the queue) is

λV = Σ_{k=N+1}^{+∞} (k − N) p_k = (ρ^{N+1} · N^N) / (N! · (1 − ρ)²) · p_0.
1.6 Queueing systems with a finite number of shop assistants and without queues
We consider here the case where
In this case the process is described by the following birth and death process {X(t) | t ∈ [0, +∞[}
with a finite number of states E0 , E1 , . . . , EN , where the intensities are given by
λ_k = λ for k < N, λ_k = 0 for k ≥ N, and μ_k = k μ.
In general, this system is too complicated for a reasonable solution, so instead we use the stationary
probabilities, which are here given by Erlang’s B-formula:
p_k = ( (1/k!) (λ/μ)^k ) / ( Σ_{j=0}^{N} (1/j!) (λ/μ)^j ), for k = 0, 1, 2, . . . , N.
The average number of customers who are served is of course equal to the average number of busy shop assistants, or channels. The common value is

Σ_{k=1}^{N} k p_k = (1 − p_N) · λ/μ.
We notice that p_N can be interpreted as the probability of rejection. This probability p_N is large when λ ≫ μ. We get from

Σ_{j=0}^{N} (1/j!) (λ/μ)^j = (exp(λ/μ) / N!) ∫_{λ/μ}^{+∞} y^N e^{−y} dy,
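The rejection probability p_N is Erlang's B-formula. A small sketch (illustrative values: N = 5 channels, offered traffic λ/μ = 3 Erlang) computes it both directly and by the standard numerically stable recursion:

```python
from math import factorial

def erlang_b(N, offered):
    """Erlang's B-formula: p_N = (offered^N/N!) / sum_{j=0}^N offered^j/j!,
    with offered = lam/mu."""
    return (offered ** N / factorial(N)) / sum(offered ** j / factorial(j)
                                               for j in range(N + 1))

def erlang_b_recursive(N, offered):
    # B(0) = 1, B(n) = offered*B(n-1) / (n + offered*B(n-1))
    b = 1.0
    for n in range(1, N + 1):
        b = offered * b / (n + offered * b)
    return b

print(erlang_b(5, 3.0))             # ~0.1101
print(erlang_b_recursive(5, 3.0))   # the same value
```

The recursion avoids the large factorials of the direct formula, which matters for big N.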
1.7 Some general types of stochastic processes

2) the auto-correlation,
3) the auto-covariance,
4) the cross-correlation,
5) the cross-covariance,
The process is called stationary, if for every h and every t_1, . . . , t_n,

P{X(t_1 + h) ≤ x_1 ∧ · · · ∧ X(t_n + h) ≤ x_n} = P{X(t_1) ≤ x_1 ∧ · · · ∧ X(t_n) ≤ x_n}.

For a stationary process the mean value function is a constant,

m(t) = m,

and the auto-covariance C(s, t) becomes a function of the real variable s − t. We therefore write in this case C(s, t) = C(s − t).

Conversely, if m(t) = m and C(s, t) = C(s − t), then we call the stochastic process {X(t) | t ∈ R} weakly stationary.
i.e. as the Fourier transform of R(τ). Furthermore, if we also assume that S(ω) is absolutely integrable, then we can apply the Fourier inversion formula to reconstruct R(τ) from the effect spectrum,

R(τ) = (1/2π) ∫_{−∞}^{+∞} e^{−iωτ} S(ω) dω.

In particular,

E{|X(t)|²} = R(0) = (1/2π) ∫_{−∞}^{+∞} S(ω) dω.
A stochastic process {X(t) | t ∈ T} is called a normal process, or a Gaussian process, if for every n ∈ N and every t_1, . . . , t_n ∈ T the distribution of {X(t_1), . . . , X(t_n)} is an n-dimensional normal distribution. A normal process is always completely specified by its mean value function m(t) and its auto-covariance function C(s, t).
The most important normal process is the Wiener process, or Brownian motion, {W(t) | t ≥ 0}. This is characterized by

1) W(0) = 0,

2) m(t) = 0,
2 The Poisson process
P {T ≤ t | X (t0 ) = 1} .
Example 2.2 Let {X1 (t), t ≥ 0} and {X2 (t), t ≥ 0} denote two independent Poisson processes of
intensity λ1 and λ2 , resp., and let the process {Y (t), t ≥ 0} be defined by
Y (t) = X1 (t) + X2 (t).
Prove that {Y (t), t ≥ 0} is a Poisson process.
We first identify

P_n(t) = P{X_1(t) = n} = ((λ_1 t)^n / n!) e^{−λ_1 t},

and

Q_n(t) = P{X_2(t) = n} = ((λ_2 t)^n / n!) e^{−λ_2 t}.
We get from X_1(t) and X_2(t) being independent that

P{Y(t) = n} = P{X_1(t) + X_2(t) = n} = Σ_{j=0}^{n} P{X_1(t) = j} · P{X_2(t) = n − j}

= Σ_{j=0}^{n} ((λ_1 t)^j / j!) e^{−λ_1 t} · ((λ_2 t)^{n−j} / (n − j)!) e^{−λ_2 t}

= (t^n / n!) e^{−(λ_1+λ_2)t} Σ_{j=0}^{n} (n! / (j! (n − j)!)) λ_1^j λ_2^{n−j}

= (λ_1 + λ_2)^n · (t^n / n!) · exp(−(λ_1 + λ_2) t).

It follows that {Y(t), t ≥ 0} is also a Poisson process (of intensity λ_1 + λ_2).
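A quick simulation (a sketch with arbitrary parameters) supports the conclusion: the superposed counts have mean and variance near (λ_1 + λ_2)t, as a Poisson variable should:

```python
import random

rng = random.Random(7)

def poisson_count(lam, t):
    """Number of events in ]0, t] of a Poisson process of intensity lam,
    generated from exponential interarrival times."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

lam1, lam2, t, runs = 1.0, 2.5, 2.0, 20000
totals = [poisson_count(lam1, t) + poisson_count(lam2, t) for _ in range(runs)]
mean = sum(totals) / runs
var = sum((x - mean) ** 2 for x in totals) / runs
print(mean, var)   # both near (lam1 + lam2) * t = 7
```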
Example 2.3 A Geiger counter records only every second particle which arrives at the counter. Assume that the particles arrive according to a Poisson process of intensity λ. Denote by N(t) the number of particles recorded in ]0, t], where we assume that the first recorded particle is the second one to arrive.
1. Find P {N (t) = n}, n ∈ N0 .
2. Find E{N (t)}.
Let T denote the time difference between two succeeding recorded arrivals.
3. Find the frequency of T .
4. Find the mean E{T }.
1. It follows from
P_n(t) = ((λt)^n / n!) e^{−λt}, n ∈ N_0,

that

P{N(t) = n} = P_{2n}(t) + P_{2n+1}(t) = ( (λt)^{2n}/(2n)! + (λt)^{2n+1}/(2n+1)! ) e^{−λt}

= ((λt)^{2n} / (2n+1)!) (2n + 1 + λt) e^{−λt}, n ∈ N_0.
2. The mean is

E{N(t)} = Σ_{n=0}^{∞} n P{N(t) = n} = e^{−λt} { Σ_{n=1}^{∞} n (λt)^{2n}/(2n)! + Σ_{n=1}^{∞} n (λt)^{2n+1}/(2n+1)! }

= e^{−λt} { (λt/2) sinh(λt) + (λt/2)(cosh(λt) − 1) − (1/2)(sinh(λt) − λt) }

= e^{−λt} { (λt/2) e^{λt} − (1/4) e^{λt} + (1/4) e^{−λt} } = λt/2 − 1/4 + (1/4) e^{−2λt}.
3. & 4. It follows from T = T_j + T_{j+1} that T ∈ Γ(2, 1/λ), thus the frequency is

f(x) = λ² x e^{−λx} for x > 0, and f(x) = 0 for x ≤ 0,

and the mean is E{T} = 2/λ.
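The result of 2. can be checked by simulating the counter: only every second arrival is recorded, so the recorded number is the integer part of half the arrivals (a sketch with illustrative parameters):

```python
import math
import random

rng = random.Random(3)
lam, t, runs = 2.0, 1.5, 40000

def recorded(lam, t):
    """Recorded particles in ]0, t]: only every second arrival is counted."""
    arrivals, s = 0, rng.expovariate(lam)
    while s <= t:
        arrivals += 1
        s += rng.expovariate(lam)
    return arrivals // 2

sim = sum(recorded(lam, t) for _ in range(runs)) / runs
theo = lam * t / 2 - 0.25 + 0.25 * math.exp(-2 * lam * t)
print(sim, theo)   # simulated versus exact E{N(t)}
```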
Example 2.4 From a ferry port a ferry sails every quarter of an hour. Each ferry can carry N cars. The cars arrive at the ferry port according to a Poisson process of intensity λ (measured in quarter⁻¹).

Assuming that there is no car in the ferry port immediately after a ferry has sailed at 9:00, one shall

1) find the probability that there is no car waiting at 9:15 (immediately after the departure of the next ferry),

2) find the probability that no car is waiting at 9:30 (immediately after the departure of the next ferry).

3) A motorist arrives at 9:07½. What is the probability that he will not catch the ferry at 9:15, but instead the ferry at 9:30?
P{X(t) = n} = ((λt)^n / n!) e^{−λt}, n ∈ N_0.
1) From t = 1 it follows that the wanted probability is

P{X(1) ≤ N} = Σ_{n=0}^{N} (λ^n/n!) e^{−λ}.
a) Either at most N cars have arrived during the first quarter of an hour, which are all carried over, so at most N cars may arrive during the next quarter,

b) or during the first quarter N + j cars have arrived, 1 ≤ j ≤ N, and at most N − j cars in the second quarter.

Hence the wanted probability is

P{X(1) ≤ N}² + Σ_{j=1}^{N} P{X(1) = N + j} · P{X(1) ≤ N − j}

= ( Σ_{n=0}^{N} (λ^n/n!) e^{−λ} )² + Σ_{j=1}^{N} (λ^{N+j}/(N+j)!) e^{−λ} · Σ_{n=0}^{N−j} (λ^n/n!) e^{−λ}

= e^{−2λ} { ( Σ_{n=0}^{N} λ^n/n! )² + Σ_{n=0}^{N−1} Σ_{j=1}^{N−n} λ^{N+j+n} / (n! (N+j)!) }.
3) The time 9:07½ corresponds to t = 1/2, so the wanted probability is

Σ_{j=0}^{N} P{X(1/2) = N + j} = Σ_{n=N}^{2N} (1/n!) (λ/2)^n exp(−λ/2).
1) The time distance between two succeeding buses is exponentially distributed of mean 20 minutes, and since the exponential distribution is "forgetful", the average waiting time must be 20 minutes.

2) He arrives at a random time between two succeeding buses, so by "symmetry" the average waiting time is instead (1/2) · 20 minutes = 10 minutes.

At this time Mr. Smith's bus arrives, and he forgets to think about this contradiction.
Can you decide which of the two arguments is correct and explain the mistake in the wrong argument?
The argument of (1) is correct. The mistake of (2) is that the length of the time interval, in which
Mr. Smith arrives, is not exponentially distributed. In fact, there will be a tendency of Mr. Smith to
arrive in one of the longer intervals.
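The point can be illustrated by simulation (a sketch, not part of the original text): sampling Mr. Smith's arrival uniformly in time over one long Poisson stream of buses gives an average wait near 20 minutes, not 10, precisely because long gaps are hit more often:

```python
import bisect
import random

rng = random.Random(11)
mean_gap, horizon = 20.0, 2_000_000.0

# one long stream of bus arrival times (Poisson process, intensity 1/20)
times, s = [], 0.0
while s < horizon:
    s += rng.expovariate(1.0 / mean_gap)
    times.append(s)

# Mr. Smith arrives at uniformly random instants; his wait is the time
# until the next bus
waits = []
for _ in range(100_000):
    u = rng.uniform(0.0, horizon - 1000.0)
    nxt = times[bisect.bisect_right(times, u)]
    waits.append(nxt - u)

print(sum(waits) / len(waits))   # near 20 (minutes), not 10
```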
This is more precisely described in the following way. Let t denote Mr. Smith’s arrival time. Then
1)
We shall now prove (1). First notice that the frequencies of the S_n are given by

g_n(x) = (λ^n / (n − 1)!) x^{n−1} e^{−λx}, x > 0.
(a) First assume that x < t. Then the event occurs that the interval has length ≤ x, if
By a differentiation,
(b) Then let x > t. The event occurs that the interval has length ≤ x, if either
Example 2.6 Denote by {X(t), t ≥ 0} a Poisson process of intensity a, and let ξ be a fixed positive
number. We define a random variable V by
V = inf{v ≥ ξ | there is no event from the Poisson process in the interval ]v − ξ, v]}.
(On the figure, the τ_i indicate the time of the i-th event of the Poisson process, and V the first time when there has been an interval of length ξ without any event.)
1) Prove that the distribution function F(v) of V fulfils

(2) F(v) = e^{−aξ} + ∫_0^ξ F(v − x) a e^{−ax} dx, for v ≥ ξ,
    F(v) = 0, for v < ξ.
2) Prove that the Laplace transform L(λ) of the frequency of V is given by

L(λ) = ((a + λ) e^{−(a+λ)ξ}) / (λ + a e^{−(a+λ)ξ}).
Hint: Use that

∫_0^{∞} F(v) e^{−λv} dv = (1/λ) L(λ) for λ > 0.
If v > ξ, then v − x ∈ ]v − ξ, v] for x ∈ [0, ξ[, and we are led to the following computation.
Here (3) is a generalized sum (i.e. an integral), where V = x and T > v − x, which of course will
contribute to F (v).
2) If L(λ) = ∫_0^{∞} f(v) e^{−λv} dv denotes the Laplace transform of the frequency f of V, then

∫_0^{∞} F(v) e^{−λv} dv = (1/λ) ∫_0^{∞} f(v) e^{−λv} dv = (1/λ) L(λ) for λ > 0,

thus

e^{−(a+λ)ξ} = L(λ) · ( 1 − a/(a + λ) + (a/(a + λ)) e^{−(a+λ)ξ} ) = L(λ) · (λ + a e^{−(a+λ)ξ})/(a + λ),

and hence

L(λ) = ((a + λ) e^{−(a+λ)ξ}) / (λ + a e^{−(a+λ)ξ}).
3) The mean is E{V} = −L′(0) = (e^{aξ} − 1)/a.
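The mean waiting time for a gap of length ξ in a Poisson stream of intensity a is the standard result (e^{aξ} − 1)/a, which a direct simulation of V can confirm (a sketch with illustrative values):

```python
import math
import random

rng = random.Random(13)
a, xi, runs = 1.0, 1.5, 50000

def first_gap(a, xi):
    """First time v >= xi such that ]v - xi, v] contains no event."""
    v = xi                        # candidate end of the gap
    gap = rng.expovariate(a)      # time until the next event
    while gap < xi:               # event interrupts the current gap
        v += gap + xi             # restart the clock after that event
        gap = rng.expovariate(a)
    return v - xi + gap - gap + (0.0 if gap >= xi else 0.0) + 0.0 if False else v - xi

# simpler, explicit version: accumulate event times until a gap >= xi occurs
def V_sample(a, xi):
    t = 0.0
    while True:
        gap = rng.expovariate(a)
        if gap >= xi:
            return t + xi         # gap of length xi completed
        t += gap                  # an event arrived before the gap closed

sim = sum(V_sample(a, xi) for _ in range(runs)) / runs
theo = (math.exp(a * xi) - 1.0) / a
print(sim, theo)
```

Only `V_sample` is used; the renewal argument is: each interarrival shorter than ξ adds its length and the search restarts by memorylessness.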
Example 2.7 To a taxi rank, taxis arrive from the south according to a Poisson process of intensity a, and independently taxis also arrive from the north according to a Poisson process of intensity b.

We denote by X the random variable which indicates the number of taxis arriving from the south in the time interval between two succeeding taxi arrivals from the north.

Find P{X = k}, k ∈ N_0, as well as the mean and variance of X.
The length of the time interval between two succeeding arrivals from the north has the frequency g(t) = b e^{−bt}, t > 0. When this length is a (fixed) t, the number of taxis arriving from the south is Poisson distributed of parameter a t. By the law of total probability,
P{X = k} = ∫_0^{∞} ((a t)^k / k!) e^{−at} · b e^{−bt} dt = (b a^k / k!) ∫_0^{∞} t^k e^{−(a+b)t} dt

= (b a^k / k!) · k!/(a + b)^{k+1} = (b/(a + b)) · (a/(a + b))^k, k ∈ N_0,
so X ∈ NB(1, b/(a + b)) is negative binomially distributed.
It follows by some formula in any textbook that

E{X} = a/b and V{X} = a(a + b)/b² = (a/b)(1 + a/b).
Example 2.8 The number of car accidents in a given region is assumed to follow a Poisson process {X(t), t ∈ [0, ∞[} of intensity λ, and the number of persons involved in the i-th accident is a random variable Y_i, which is geometrically distributed,

P{Y_i = k} = p q^{k−1}, k ∈ N,

where p > 0, q > 0 and p + q = 1. We assume that the Y_i are mutually independent, and independent of {X(t), t ≥ 0}.
1. Find the generating function of X(t).

2. Find the generating function of Y_i.

Denote by Z(t) the total number of persons involved in accidents in the time interval ]0, t].

3. Describe the generating function of Z(t) expressed by the generating function of Y_i and the generating function of X(t).
Hint: Use that

P{Z(t) = k} = Σ_{i=0}^{∞} P{X(t) = i ∧ Y_1 + Y_2 + · · · + Y_i = k}.
4) It follows from

P′_{Z(t)}(s) = λt · (p / (1 − qs)²) · P_{Z(t)}(s), with P′_{Z(t)}(1) = λt/p,

and

P″_{Z(t)}(s) = ( λt · p / (1 − qs)² )² P_{Z(t)}(s) + λt · (2pq / (1 − qs)³) P_{Z(t)}(s),

where

P″_{Z(t)}(1) = (λt/p)² + λt · 2q/p²,

that

E{Z(t)} = P′_{Z(t)}(1) = λt/p

and

V{Z(t)} = P″_{Z(t)}(1) + P′_{Z(t)}(1) − (P′_{Z(t)}(1))² = (λt/p)² + λt · 2q/p² + λt/p − (λt/p)²

= λt · (2q + p)/p² = λt · (1 + q)/p².
In the specific case the intensity is λ = 2, and the time span is t = 7 days. Furthermore, p = q = 1/2, thus

E{Z(7)} = 2 · 7 / (1/2) = 28

and

V{Z(7)} = 2 · 7 · (1 + 1/2)/(1/2)² = 2 · 7 · 6 = 84.
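The compound Poisson structure of Z(t) is easy to simulate; the sketch below (using the example's values λ = 2, t = 7, p = 1/2) should reproduce the mean 28 and variance 84 approximately:

```python
import random

rng = random.Random(9)
lam, t, p, runs = 2.0, 7.0, 0.5, 40000

def total_involved():
    """Z(t): persons involved in all accidents in ]0, t]."""
    n, s = 0, rng.expovariate(lam)      # number of accidents
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    total = 0
    for _ in range(n):                  # geometric number of persons each
        k = 1
        while rng.random() >= p:        # P{Y = k} = p q^{k-1}
            k += 1
        total += k
    return total

zs = [total_involved() for _ in range(runs)]
mean = sum(zs) / runs
var = sum((z - mean) ** 2 for z in zs) / runs
print(mean, var)   # near E{Z(7)} = 28 and V{Z(7)} = 84
```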
Example 2.10 Given a service to which customers arrive according to a Poisson process of intensity λ (measured in the unit minute⁻¹).

Denote by I_1, I_2 and I_3 three succeeding time intervals, each of length 1 minute.
1. Find the probability that there is no customer in any of the three intervals.
2. Find the probability that there is precisely one arrival of a customer in one of these intervals and
none in the other two.
3. Find the probability that there are in total three arrivals in the time intervals I_1, I_2 and I_3, where precisely two of them occur in one of these intervals.
4. Find the value of λ, for which the probability found in 3. is largest.
Then consider 12 succeeding time intervals, each of length 1 minute. Let the random variable Z denote
the number of intervals, in which we have no arrival.
5. Find the distribution of Z.
6. For λ = 1 find the probability P {Z = 4} (2 dec.).
1) The probability of no customer arriving in one interval of length 1 is e^{−λ}. Then

P{no event in I_1 ∪ I_2 ∪ I_3 = ]0, 3]} = (e^{−λ})³ = e^{−3λ}.
2) By a rearrangement,

P{one event in one interval, none in the other two} = P{one event in ]0, 3]} = 3λ e^{−3λ}.
3) We have

P{two events in one interval, one in another one, and none in the remaining one}
= P{two events in one interval, one in the remaining two intervals}
= 3 · (λ²/2) e^{−λ} · 2λ e^{−2λ} = 3λ³ e^{−3λ}.
4) We conclude from 3. that g(λ) = 3λ³ e^{−3λ} > 0 for λ > 0 with g(λ) → 0 for λ → 0+ and for λ → ∞. By a differentiation,

g′(λ) = (9λ² − 9λ³) e^{−3λ} = 9λ²(1 − λ) e^{−3λ} = 0 for λ = 1 > 0,

so the probability found in 3. is largest for λ = 1.
5) Since the probability of no arrival in an interval of length 1 is e^{−λ}, we get

P{Z = k} = (12 choose k) (e^{−λ})^k (1 − e^{−λ})^{12−k}, k = 0, 1, 2, . . . , 12,

thus Z ∈ B(12, e^{−λ}).
6) By insertion of λ = 1 and k = 4 into the result of 5. we get

P{Z = 4} = (12 choose 4) (e^{−1})^4 (1 − e^{−1})^8 = 495 · 0.3679^4 · 0.6321^8 = 0.2311 ≈ 0.23.
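The computation in 6. is one line in code (a sketch assuming a Python environment; `math.comb` requires Python 3.8+):

```python
import math

lam, n, k = 1.0, 12, 4
q = math.exp(-lam)            # probability that a 1-minute interval is empty
prob = math.comb(n, k) * q ** k * (1 - q) ** (n - k)
print(round(prob, 4))         # → 0.2311
```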
1) We get from

P{X = k} = (a^k / k!) e^{−a}, k ∈ N_0,

the characteristic function

k_X(ω) = Σ_{k=0}^{∞} e^{iωk} · (a^k/k!) e^{−a} = e^{−a} Σ_{k=0}^{∞} (1/k!) (a e^{iω})^k = e^{−a} · exp(a e^{iω}) = exp(a(e^{iω} − 1)).
2) Put

X_a = (X − a)/√a.
Remark 2.1 For comparison a long and tedious computation on a pocket calculator gives
P {80 < X < 120} ≈ 0.9491. ♦
Example 2.12 In a shop there are two shop assistants A and B. Customers may freely choose if they
will queue up at A or at B, but they cannot change their decision afterwards. For all customers at A
their serving times are mutually independent random variables of the frequency
f(x) = λ e^{−λx} for x > 0, and f(x) = 0 for x ≤ 0 (λ is a positive constant),
and for the customers at B the serving times are mutually independent random variables of frequency
g(y) = 2λ e^{−2λy} for y > 0, and g(y) = 0 for y ≤ 0.
At a given time Andrew arrives and is queueing up at A, where there in front of him is only one
customer, and where the service of this customer has just begun. We call the serving time of this
customer X1 , while Andrew’s serving time is called X2 .
At the same time Basil arrives and joins the queue at B, where there are two waiting customers in front of him, and where the service of the first customer has just begun. The serving times of these two customers are denoted Y_1 and Y_2, resp.
1. Find the frequencies of the random variables X1 + X2 and Y1 + Y2 .
2. Express by means of the random variables Y1 , Y2 and X1 the event that the service of Basil starts
after the time when the service of Andrew has started, and find the probability of this event.
3. Find the probability that the service of Basil starts after the end of the service of Andrew.
Assume that the customers arrive to the shop according to a Poisson process of intensity α.
4. Find the expected number of customers, who arrive to the shop in a time interval of length t.
5. Let N denote the random variable, which indicates the number of customers who arrive to the shop
during the time when Andrew is in the shop (thus X1 + X2 ). Find the mean of N .
1) Since X_i ∈ Γ(1, 1/λ) is exponentially distributed, we have X_1 + X_2 ∈ Γ(2, 1/λ), thus

f_{X_1+X_2}(x) = λ² x e^{−λx} for x ≥ 0, and f_{X_1+X_2}(x) = 0 for x < 0.

Since Y_i ∈ Γ(1, 1/(2λ)), we have Y_1 + Y_2 ∈ Γ(2, 1/(2λ)) with the frequency

g_{Y_1+Y_2}(y) = 4λ² y e^{−2λy} for y ≥ 0, and g_{Y_1+Y_2}(y) = 0 for y < 0.
The number of arrivals in a time interval of length t is Poisson distributed,

P{X(t) = n} = ((αt)^n / n!) e^{−αt}, n ∈ N_0,

and

m(t) = E{X(t)} = Σ_{n=0}^{∞} n ((αt)^n / n!) e^{−αt} = α t.
3 Birth and death processes
and we assume that the process at t = 0 is in state E_0. It can be proved that the differential equations have a uniquely determined solution (P_k(t)) satisfying

P_k(t) ≥ 0, Σ_{k=0}^{∞} P_k(t) ≤ 1.

One can also prove that either Σ_{k=0}^{∞} P_k(t) = 1 for all t > 0, or Σ_{k=0}^{∞} P_k(t) < 1 for all t > 0.
Prove that

Σ_{k=0}^{∞} P_k(t) = 1 for all t > 0, if and only if Σ_{k=0}^{∞} 1/λ_k is divergent.

Hint: First prove that

(1/λ_k) a(t) ≤ ∫_0^t P_k(s) ds ≤ 1/λ_k, k ∈ N_0, t > 0,

where a(t) = 1 − Σ_{k=0}^{∞} P_k(t).
hence by integration,

λ_k ∫_0^t P_k(s) ds = [ − Σ_{j=0}^{k} P_j(s) ]_{s=0}^{s=t} = Σ_{j=0}^{k} P_j(0) − Σ_{j=0}^{k} P_j(t) = 1 − Σ_{j=0}^{k} P_j(t),

from which

(1/λ_k) a(t) ≤ ∫_0^t P_k(s) ds ≤ 1/λ_k.
Assume that Σ_{k=0}^{∞} P_k(t) = 1. Applying the theorem of monotone convergence (NB: the Lebesgue integral!) it follows from the right hand inequality that

Σ_{k=0}^{∞} 1/λ_k ≥ Σ_{k=0}^{∞} ∫_0^t P_k(s) ds = ∫_0^t Σ_{k=0}^{∞} P_k(s) ds = ∫_0^t 1 ds = t for all t ∈ R_+,

proving that the series Σ_{k=0}^{∞} 1/λ_k is divergent.
Then assume that Σ_{k=0}^{∞} P_k(t) < 1, thus

a(t) = 1 − Σ_{k=0}^{∞} P_k(t) > 0.

Using the theorem of monotone convergence and the left hand inequality we get

a(t) Σ_{k=0}^{∞} 1/λ_k ≤ Σ_{k=0}^{∞} ∫_0^t P_k(s) ds ≤ t for all t ∈ R_+,

and the series Σ_{k=0}^{∞} 1/λ_k is convergent.
Example 3.2 To a carpark, cars arrive from 9:00 (t = 0) following a Poisson process of intensity λ. There are in total N parking bays, and we assume that no car leaves the carpark. Let E_n, n = 0, 1, . . . , N, denote the state that n of the parking bays are occupied.

Put λ = 1 minute⁻¹ and N = 5. Find the probability that a car driver who arrives at 9:03 cannot find a vacant parking bay.
thus

p_n = 0, for n < N,
p_n = 1, for n = N.
4) First identify

λ = 1 minute⁻¹, t = 3 and N = 5.

Then by insertion,

P{no vacant parking bay at 9:03} = P_5(3) = 1 − Σ_{n=0}^{4} P_n(3) = 1 − Σ_{n=0}^{4} (3^n/n!) e^{−3} = 0.1847 ≈ 0.185.
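The numerical value can be confirmed directly (a small sketch):

```python
import math

lam, t, N = 1.0, 3.0, 5   # 1 car per minute, t = 3 at 9:03, N = 5 bays
p_full = 1.0 - sum((lam * t) ** n / math.factorial(n) * math.exp(-lam * t)
                   for n in range(N))
print(round(p_full, 4))   # → 0.1847
```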
Example 3.3 Given a stochastic birth and death process {X(t), t ∈ [0, ∞[}, which can be in the states E_4, E_5, E_6 and E_7. Assume that the birth intensity λ_k in state E_k is given by

λ_k = α k(7 − k),

and the death intensity by

μ_k = β k(k − 4),
Thus

p_5 = (λ_4/μ_5) p_4 = (12α/5β) p_4 = (12/5)(α/β) p_4,

p_6 = (λ_5/μ_6) p_5 = (10α/12β) · (12/5)(α/β) p_4 = 2 (α/β)² p_4,

p_7 = (λ_6/μ_7) p_6 = (6α/21β) · 2 (α/β)² p_4 = (4/7)(α/β)³ p_4.

Furthermore,

p_4 + p_5 + p_6 + p_7 = 1.
However, the exact values can only be found once we know the relationship between α and β.
1) If β = α, then

1 = p_4 (1 + 12/5 + 2 + 4/7) = p_4 · (35 + 84 + 70 + 20)/35 = p_4 · 209/35,
hence

p_4 = 35/209, p_5 = (12/5) · (35/209) = 84/209,

p_6 = 70/209, p_7 = (4/7) · (35/209) = 20/209,

so

p = (p_4, p_5, p_6, p_7) = (1/209)(35, 84, 70, 20).
2) If β = 2α, then α/β = 1/2, hence

p_5 = (6/5) p_4, p_6 = (1/2) p_4, p_7 = (1/14) p_4,

and

1 = p_4 + p_5 + p_6 + p_7 = p_4 (1 + 6/5 + 1/2 + 1/14) = p_4 · (70 + 84 + 35 + 5)/70 = p_4 · 97/35,

from which

p_4 = 35/97, p_5 = 42/97, p_6 = 35/194, p_7 = 5/194,

i.e.

p = (p_4, p_5, p_6, p_7) = (1/194)(70, 84, 35, 5).
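Both cases can be verified with exact rational arithmetic (a sketch; the chain and its intensities are exactly those of the example):

```python
from fractions import Fraction

def stationary(alpha, beta):
    """Stationary probabilities of the chain on E_4..E_7 with
    lam_k = alpha*k*(7-k) and mu_k = beta*k*(k-4)."""
    lam = {k: alpha * k * (7 - k) for k in range(4, 8)}
    mu = {k: beta * k * (k - 4) for k in range(5, 8)}
    a = {4: Fraction(1)}
    for k in range(5, 8):
        a[k] = a[k - 1] * lam[k - 1] / mu[k]
    s = sum(a.values())
    return {k: a[k] / s for k in a}

print(stationary(Fraction(1), Fraction(1)))   # (35, 84, 70, 20)/209
print(stationary(Fraction(1), Fraction(2)))   # (70, 84, 35, 5)/194
```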
Example 3.4 Given a birth and death process of the states E_0, E_1, E_2, . . . , birth intensities λ_k and death intensities μ_k. Assume furthermore that

a. λ_k = μ_k = k α, k ∈ N_0 (where α is a positive constant),

b. P_1(0) = 1.

One may now without proof use that under the assumptions above,

P_1(t) = 1/(1 + αt)².
1) We have

P_0′(t) = −λ_0 P_0(t) + μ_1 P_1(t) = α P_1(t) = α/(1 + αt)², since λ_0 = 0 and μ_1 = α,

and

P_k′(t) = −(λ_k + μ_k) P_k(t) + λ_{k−1} P_{k−1}(t) + μ_{k+1} P_{k+1}(t) = (k − 1)α P_{k−1}(t) − 2kα P_k(t) + (k + 1)α P_{k+1}(t) for k ∈ N.
We get by an integration that

P_0(t) = ∫_0^t α/(1 + ατ)² dτ = [ −1/(1 + ατ) ]_0^t = 1 − 1/(1 + αt) = αt/(1 + αt).
If k = 1, we get by a rearrangement,

P_2(t) = (1/2α) { P_1′(t) − 0 · P_0(t) + 2α P_1(t) } = (1/2α) { −2α/(1 + αt)³ + 2α/(1 + αt)² }

= 1/(1 + αt)² − 1/(1 + αt)³ = αt/(1 + αt)³.
If k = 2, we get by a rearrangement,

P_3(t) = (1/3α) { P_2′(t) − α P_1(t) + 4α P_2(t) }

= (1/3α) { 3α/(1 + αt)⁴ − 2α/(1 + αt)³ − α/(1 + αt)² + 4α/(1 + αt)² − 4α/(1 + αt)³ }

= (1/3α) { 3α/(1 + αt)⁴ − 6α/(1 + αt)³ + 3α/(1 + αt)² }

= ((1 + αt)² − 2(1 + αt) + 1) / (1 + αt)⁴ = α²t²/(1 + αt)⁴.
Summing up,

P0(t) = αt/(1 + αt),   P1(t) = 1/(1 + αt)²,
P2(t) = αt/(1 + αt)³,  P3(t) = α²t²/(1 + αt)⁴.
Figure 1: The graph of 1 − x/(1 + x)², with x = αt.
3) It follows that

P0(t) + P1(t) = αt/(1 + αt) + 1/(1 + αt)² = (1 + αt + α²t²)/(1 + αt)² = 1 − αt/(1 + αt)².

If we put x = αt, we see that we shall only sketch

1 − x/(1 + x)² = 1 − 1/(1 + x) + 1/(1 + x)²,

which has a minimum for x = 1, and has y = 1 as an asymptote.
Figure 2: The graph of x/(1 + x)³, with x = αt.
5) Clearly,

lim_{t→∞} P0(t) = lim_{t→∞} αt/(1 + αt) = 1.
We conclude from

Σ_{n=0}^∞ Pn(t) = 1 and Pn(t) ≥ 0

that

lim_{t→∞} Σ_{n=1}^∞ Pn(t) = 1 − lim_{t→∞} P0(t) = 0,

hence

lim_{t→∞} Pn(t) = 0 for all n ∈ N.
Example 3.5 A power station delivers electricity to N customers. If a customer uses electricity at time t, then there is the probability μh + hε(h) that he does not use electricity at time t + h, and probability 1 − μh + hε(h) that he is still using electricity at time t + h.
However, if he does not use electricity at time t, then there is the probability λh + hε(h) that he uses electricity at time t + h, and probability 1 − λh + hε(h) that he does not.
The customers are using electricity mutually independently.
Denote by Ek the state that k consumers use electricity, k = 0, 1, . . . , N .
Find the differential equations of the system.
Find the stationary probabilities.
We put Xk (t) = 1, if the k-th customer uses electricity at time t, and Xk (t) = 0, if he does not do it.
Let n and j ∈ {0, 1, . . . , N}, and assume that the system is in state Ej, i.e.

Σ_{k=1}^N Xk(t) = j at time t.

There must be an m ∈ {0, 1, . . . , j}, such that j − m of the customers who were using electricity at time t are still using electricity at time t + h.
Furthermore, n − j + m of the customers who did not use electricity at time t must use electricity at time t + h, if we are to be in state En at time t + h. Thus we get the condition m ≥ j − n, so
It follows immediately that if j ≠ n + 1, then all these terms are of the type hε(h).
For j = n + 1 we get the contribution

C(n+1, 1) μh (1 − μh)^n · C(N−n−1, 0) (1 − λh)^{N−n−1} + hε(h) = (n + 1)μh + hε(h).
There are some modifications for n = 0 and n = N, in which cases only one of the neighbouring states is available.
Now,

p1 = C(N, 1) (λ/μ) p0 = N (λ/μ) p0,

so we guess that we in general have

pn = C(N, n) (λ/μ)^n p0,

where C(N, n) = N!/(n!(N − n)!) is the binomial coefficient. This is true for n = 0, 1, 2.
Assume that the claim holds for all indices up to n. If n ≤ N − 1, then

pn+1 = { n/(n+1) + (N−n)/(n+1) · (λ/μ) } pn − (N−n+1)/(n+1) · (λ/μ) pn−1

     = { n/(n+1) · N!/(n!(N−n)!) · (λ/μ)^n + (N−n)/(n+1) · N!/(n!(N−n)!) · (λ/μ)^{n+1} } p0
       − (N−n+1)/(n+1) · N!/((n−1)!(N−n+1)!) · (λ/μ)^n p0

     = N!/((n+1)(n−1)!(N−n)!) · (λ/μ)^n p0 − N!/((n+1)(n−1)!(N−n)!) · (λ/μ)^n p0
       + N!/((n+1)!(N−n−1)!) · (λ/μ)^{n+1} p0

     = C(N, n+1) (λ/μ)^{n+1} p0,
and the claim follows by induction. Then

1 = Σ_{n=0}^N pn = Σ_{n=0}^N C(N, n) (λ/μ)^n p0 = p0 (1 + λ/μ)^N = p0 ((λ + μ)/μ)^N,

hence p0 = (μ/(λ+μ))^N, and

pn = C(N, n) (λ/μ)^n (μ/(λ+μ))^N = C(N, n) (λ/(λ+μ))^n (μ/(λ+μ))^{N−n}.
The solution above is somewhat clumsy, though it follows the ordinary way one would solve problems
of this type without too much training.
Alternatively, we see that we have a birth and death process of states E0, E1, . . . , EN, and intensities

λk = (N − k)λ  and  μk = kμ,  k = 0, 1, . . . , N.
The stationary probabilities then satisfy

μk pk = λk−1 pk−1,  k = 1, 2, . . . , N,

thus

pk = ((N − k + 1)/k) · (λ/μ) · pk−1.
Then by recursion,

pk = {(N − k + 1)(N − k + 2) · · · N}/{k(k − 1) · · · 1} · (λ/μ)^k p0 = (N!/(k!(N − k)!)) (λ/μ)^k p0 = C(N, k) (λ/μ)^k p0.

It follows as before from Σ_{k=0}^N pk = 1 that

pk = C(N, k) (λ/(λ+μ))^k (μ/(λ+μ))^{N−k},

for k = 0, 1, 2, . . . , N, so we get a binomial distribution B(N, λ/(λ+μ)) of mean N · λ/(λ+μ).
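The binomial result can be verified independently of the normalization: it satisfies the detailed-balance equations exactly. A sketch of the check (arbitrary test values N = 8, λ = 3, μ = 5):

```python
from fractions import Fraction
from math import comb

# Check that B(N, lambda/(lambda+mu)) satisfies the detailed-balance
# equations k*mu*p_k = (N-k+1)*lambda*p_{k-1} of Example 3.5 exactly.
N, lam, mu = 8, Fraction(3), Fraction(5)  # arbitrary test values
p = lam / (lam + mu)
pk = [comb(N, k) * p**k * (1 - p)**(N - k) for k in range(N + 1)]

assert sum(pk) == 1
for k in range(1, N + 1):
    assert k * mu * pk[k] == (N - k + 1) * lam * pk[k - 1]
print("detailed balance holds exactly")
```

The check is exact because the ratio pk/pk−1 = C(N,k)/C(N,k−1) · λ/μ · (μ/λ) · (N−k+1)/k reduces term by term.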
Example 3.6 Given a stochastic process {X(t), t ∈ [0, ∞[} by the following: At time t = 0 there are
N cars in a carpark. No car arrives, and the cars leave the carpark mutually independently. If a car
is staying at its parking bay at time t, then there is the probability μh + hε(h) [where μ is a positive
constant] that it leaves the carpark in the time interval ]t, t + h].
Put X(t) = k, k = 0, 1, . . . , N, if there are k cars in the carpark at time t, and put Pk(t) = P{X(t) = k}. This is a pure death process with death intensities μk = kμ and all birth intensities zero.
1) This follows e.g. from the fact that the probability that one of the k cars leaves the carpark in the time interval ]t, t + h] is kμh + hε(h).
The stationary probabilities therefore satisfy

kμ pk = 0,  k = 0, 1, . . . , N.

Since Σ_{k=0}^N pk = 1, we get

pk = 0 for k = 1, 2, . . . , N, and p0 = 1.
2) It follows from the differential equations Pk′(t) = −kμ Pk(t) + (k + 1)μ Pk+1(t) that

d/dt Σ_{k=1}^N k Pk(t) = −μ Σ_{k=1}^N k² Pk(t) + μ Σ_{k=1}^{N−1} k(k + 1) Pk+1(t)
                       = −μ Σ_{k=1}^N k² Pk(t) + μ Σ_{k=1}^N (k − 1)k Pk(t) = −μ Σ_{k=1}^N k Pk(t),

and

Pk(t) = C(N, k) e^{−kμt} (1 − e^{−μt})^{N−k},  k = 0, 1, . . . , N.
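The formula for Pk(t) is a binomial distribution: each car is still parked at time t with probability e^{−μt}, independently of the others. A small deterministic check of this reading (the parameter values are arbitrary):

```python
from math import comb, exp

# Each car leaves after an independent Exp(mu) time, so a car is still
# parked at time t with probability s = e^(-mu*t); X(t) is then binomial.
# Check that the Pk(t) above sum to 1 and give mean N*s.
N, mu, t = 10, 0.5, 2.0
s = exp(-mu * t)                      # P{a given car is still parked at t}
P = [comb(N, k) * s**k * (1 - s)**(N - k) for k in range(N + 1)]

assert abs(sum(P) - 1) < 1e-12
mean = sum(k * p for k, p in enumerate(P))
assert abs(mean - N * s) < 1e-12
print("mean number of parked cars at t:", mean)
```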
4 Queueing theory
Example 4.1 Customers arrive at a shop according to a Poisson process of intensity λ. There are 2 shop assistants and the possibility of forming a queue. We assume that the service times are exponentially distributed of parameter μ.
It is given that there are no customers in the shop on average 10 % of the time, and that 1/λ = 11.
Find 1/μ.
Then find the probability that both shop assistants are busy.
Here N = 2, p0 = 1/10 and 1/λ = 11. In fact, it was given that P0(t) → p0 = 10 % for t → ∞.
The traffic intensity ϱ satisfies for N = 2,

p0 = (1 − ϱ)/(1 + ϱ) = 1/10,  hence ϱ = 9/11.

On the other hand, the traffic intensity is defined by

ϱ = λ/(Nμ) = λ/(2μ) = 9/11,  i.e. 1/μ = 2ϱ · (1/λ) = 2 · (9/11) · 11 = 18.
Hence

p1 = 2ϱ · (1 − ϱ)/(1 + ϱ) = 2 · (9/11) · (1/10) = 18/110,

and therefore,

P{both shop assistants busy} = 1 − p0 − p1 = 1 − 1/10 − 18/110 = 81/110.
Example 4.2 Customers arrive at a shop following a Poisson process of intensity λ. We have 1 shop assistant and it is possible to form a queue. We assume that the service times are exponentially distributed of parameter μ. It is assumed that the traffic intensity is ϱ = 6/5, where it is well-known that this implies that the system does not work properly (the queue increases indefinitely). Compare the advantages of the following two possibilities:
1) Another shop assistant is hired (with the same service time distribution as the first one).
2) Improvement of the service, such that the average service time is halved.
We have a queueing system with possibility of forming a queue. The parameters are

N = 1, ϱ = 6/5, and λ, μ.

Since ϱ = 6/5 > 1, this system does not work properly.
1) If another shop assistant is hired, then the parameters are changed to

N = 2, ϱ = 3/5, and λ, μ unchanged.

Then

p0 = (1 − ϱ)/(1 + ϱ) = 1/4.

The average waiting time is

V1 = p0 (Nϱ)^N / (N · N! · μ (1 − ϱ)²) = (1/4)(6/5)² / (2 · 2! · μ (2/5)²) = (9/16) · (1/μ),
Remark 4.1 It should here be added that one can also find:

the average number of customers = 15/8,
the average number of busy shop assistants = 6/5,
the average length of the queue = 27/40. ♦
The customer will prefer that the sum of waiting time and staying time is as small as possible. Since

V1 + O1 = (34/16) · (1/μ)  and  V2 + O2 = (32/16) · (1/μ),
it follows that the customer will prefer the latter system, while it is far more uncertain what the shop
would prefer, because we do not know the costs of each of the two possible improvements.
Example 4.3 We consider an intersection which is not controlled by traffic lights. One has noticed that cars making a left-hand turn are stopped and therefore delay the cars which are going straight on. Therefore, one plans to build a left-hand turn lane. Assuming that arrivals and departures of the cars making the left-hand turn follow exponential distributions with the parameters λ and μ, where λ/μ = 1/2, one shall compute the smallest number of cars which the planned left-hand turn lane must be able to contain, if the probability of the event that there are more cars than the new lane can contain is to be less than 5 %.

The probability that more than n cars are waiting is Σ_{k>n} (1 − ϱ) ϱ^k = ϱ^{n+1} = (1/2)^{n+1}, thus we require (1/2)^{n+1} < 1/20, which is fulfilled for n ≥ 4.
Example 4.4 Given a queueing system with exponential distribution of arrivals and exponential distribution of service times (the means are called 1/λ and 1/μ, resp.). The number of service places is 2. We furthermore assume that it is possible to form a queue. Assuming that 1/λ = 1 (minute) and 1/μ = 1 (minute),
1. find the average waiting time,
2. find the average staying time.
For economic reasons the number of service places is cut down from 2 to 1, while the service at the
same time is simplified (so the service time is decreased), such that the customer’s average staying
time is not prolonged. Assuming that the constant λ is unchanged,
3. find the average service time 1/μ1, such that the average staying time in the new system is equal to the average staying time in the previously mentioned system,
4. find in which of the two systems the probability is largest for a customer to wait.
1) Here N = 2, 1/λ = 1 and 1/μ = 1. This gives the traffic intensity

ϱ = λ/(Nμ) = 1/2,  and  p0 = (1 − ϱ)/(1 + ϱ) = 1/3.
The average waiting time is

V = p0 (Nϱ)^N / (N · N! · μ (1 − ϱ)²) = (1/3) · 1² / (2 · 2! · 1 · (1/2)²) = 1/3 minute.
2) The staying time is the waiting time plus the service time, so the average staying time is

O = V + 1/μ = 1/3 + 1 = 4/3 minute.

3) In the new system the traffic intensity is

ϱ1 = λ/(N1 μ1) = 1/μ1,  since N1 = 1 and λ = 1.

The average waiting time is for N1 = 1 given by

V1 = ϱ1/(μ1 (1 − ϱ1)) = 1/(μ1 (μ1 − 1)),
and the average staying time is

O1 = V1 + 1/μ1 = 1/(μ1(μ1 − 1)) + 1/μ1 = 1/(μ1 − 1).

We want that O1 = O = 4/3. Hence μ1 − 1 = 3/4, i.e. μ1 = 7/4, and

1/μ1 = 4/7.
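Questions 1–3 can be sketched in a few lines of Python, using the M/M/N formulas from the solution (p0 from ϱ for N = 2, and the Erlang-C mean waiting time V = p0(Nϱ)^N/(N·N!·μ(1−ϱ)²)):

```python
from fractions import Fraction

def mm2_waiting_time(lam, mu):
    """Mean waiting time in an M/M/2 queue (Erlang-C form)."""
    rho = Fraction(lam, 2 * mu)
    p0 = (1 - rho) / (1 + rho)
    return p0 * (2 * rho) ** 2 / (2 * 2 * mu * (1 - rho) ** 2)

lam = mu = 1                       # 1/lambda = 1/mu = 1 minute
V = mm2_waiting_time(lam, mu)
O = V + Fraction(1, mu)            # staying time = waiting + service
print(V, O)  # 1/3 4/3

# single faster server with the same staying time: O1 = 1/(mu1 - 1) = 4/3
mu1 = Fraction(3, 4) + 1
print(Fraction(1, 1) / mu1)  # average service time 4/7
```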
Assuming that the probability of rejection is too large, we change the system, such that there are two
shop assistants A and B, and the service is changed, such that a customer at his arrival goes to A
and is served by him, if A is vacant at the arrival of the customer. If on the other hand A is busy,
then the customer will turn to B in order to be serviced. If also B is busy, the customer is rejected.
The assumptions of the arrivals and service times are the same as before. We want to compute in this
system:
4. The stationary probabilities and the probability of rejection.
and hence

c1 = μ/(λ + μ),  c2 = λ/(λ + μ),

and the solution becomes

P0(t) = μ/(λ + μ) + λ/(λ + μ) · e^{−(λ+μ)t},

P1(t) = λ/(λ + μ) − λ/(λ + μ) · e^{−(λ+μ)t}.
3) If λ/μ = 6, then

λ/(λ + μ) = (λ/μ)/((λ/μ) + 1) = 6/7  and  μ/(λ + μ) = 1/7,
(i) E0, and no customer arrives,

P0(t) · {1 − λh + hε(h)}.

(ii) E0, some customers arrive, and they are all served before time t + h,

hε(h).

(iii) E1, and there is no customer coming, and A’s customer is served to the end,

P1(t) · {1 − λh + hε(h)} · {μh + hε(h)}.
(iv) E1, and more than one event occurs in ]t, t + h],

hε(h).

(v) E2, and no new customer is coming, and B’s customer is served to the end,

P2(t) · {1 − λh + hε(h)} · {μh + hε(h)}.

(vii) E3 in general,

hε(h).
Then compute the derivative in the usual way by taking the limit. This gives

P0′(t) = lim_{h→0} (1/h) {P0(t + h) − P0(t)} = −λ P0(t) + μ {P1(t) + P2(t)},

hence in the stationary state (divide by μ and use λ/μ = 6),

6p0 = p1 + p2.
We are still missing one equation, when we want to find the stationary probabilities. We choose to realize P3(t + h). This can be done, if the system at time t is in state
(i) E0, and at least two customers arrive,

hε(h).

(ii) E1, and at least one customer arrives, and neither A nor B finish their customers,
(iii) E2, and at least one customer arrives, and neither A nor B finish their customers,
(iv) E3, and neither A nor B finish their customers,

P3(t) · {1 − μh + hε(h)}².

All remaining possibilities only contribute terms of the type hε(h).
Then divide by h and let h → 0. This will give us the differential equation

P3′(t) = λ {P1(t) + P2(t)} − 2μ P3(t),

so in the stationary state, using λ/μ = 6,

p3 = 3 (p1 + p2) = 18p0.

By addition of the former three equations, we get 25p0 = 1, thus p0 = 1/25. Since A is busy precisely with the probability λ/(λ + μ) = 6/7 found above, we have p1 + p3 = 6/7 and p0 + p2 = 1/7, hence

p1 = 6/7 − 18/25 = (6/175)(25 − 21) = 24/175,

and

p2 = 1/7 − 1/25 = 18/175,  and  p3 = 18p0 = 18/25,

so

(p0, p1, p2, p3) = (1/25, 24/175, 18/175, 18/25),
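The vector (1/25, 24/175, 18/175, 18/25) can be double-checked against the balance equations of the four-state chain. The transition structure below is our reading of the description (arrivals go to A if A is vacant, otherwise to B, otherwise they are rejected), with λ = 6, μ = 1 after scaling:

```python
from fractions import Fraction as F

lam, mu = 6, 1
# states: 0 empty, 1 only A busy, 2 only B busy, 3 both busy
p = {0: F(1, 25), 1: F(24, 175), 2: F(18, 175), 3: F(18, 25)}

assert sum(p.values()) == 1
assert lam * p[0] == mu * (p[1] + p[2])             # balance of E0
assert (lam + mu) * p[1] == lam * p[0] + mu * p[3]  # balance of E1
assert (lam + mu) * p[2] == mu * p[3]               # balance of E2
assert 2 * mu * p[3] == lam * (p[1] + p[2])         # balance of E3
print("balance equations hold")
```

All four balance equations hold exactly, confirming the stationary vector.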
6) We have in the general case of N shop assistants, where Ej denotes that j customers are being served, the system of differential equations

P0′(t) = −λ P0(t) + μ P1(t),
Pk′(t) = −(λ + kμ) Pk(t) + λ Pk−1(t) + (k + 1)μ Pk+1(t),  1 ≤ k ≤ N − 1,
PN′(t) = −Nμ PN(t) + λ PN−1(t).
Since λ/μ = 6, we get for the stationary probabilities, by a division by μ followed by a rearrangement,

0 = 6p0 − p1,
6pk − (k + 1)pk+1 = 6pk−1 − kpk,  1 ≤ k ≤ N − 1,
0 = 6pN−1 − N pN,

hence by recursion,

k pk = 6 pk−1,  1 ≤ k ≤ N.
Multiply this equation by (k − 1)!/6^k,

(k!/6^k) pk = ((k − 1)!/6^{k−1}) pk−1,

and then do the recursion,

(k!/6^k) pk = ((k − 1)!/6^{k−1}) pk−1 = · · · = (0!/6^0) p0 = p0,  k = 0, 1, . . . , N,

thus

pk = (6^k/k!) p0,  k = 0, 1, . . . , N.
k:                      0    1    2    3    4
6^k/k!:                 1    6    18   36   54
Σ_{j=0}^{k−1} 6^j/j!:   —    1    7    25   61

Since pN = (6^N/N!) / Σ_{j=0}^N 6^j/j!, it follows that N ≥ 4 gives pN ≤ 1/2, so we shall apply at least 4 service places.
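The table above amounts to scanning a truncated Poisson(6) distribution for the first N with pN ≤ 1/2; a sketch in Python with exact fractions:

```python
from fractions import Fraction
from math import factorial

def p_full(a, N):
    """Probability that all N assistants are busy: the last weight of a
    Poisson(a) distribution truncated at N."""
    weights = [Fraction(a**k, factorial(k)) for k in range(N + 1)]
    return weights[-1] / sum(weights)

N = 1
while p_full(6, N) > Fraction(1, 2):
    N += 1
print(N, p_full(6, N))  # 4 54/115
```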
2
Example 4.6 At a university there are two super computers A and B. Computer A is used for university tasks, while computer B is restricted to external tasks. Both systems allow forming queues, and the service times (i.e. the times used for computation of each task) are approximately exponentially distributed of mean 1/μ = 3 minutes. The university tasks arrive to computer A approximately as a Poisson process of intensity λA = (1/5) min⁻¹, while the tasks of computer B arrive as a Poisson process of intensity λB = (3/10) min⁻¹. Apply the stationary probabilities for the two computers A and B to compute
It is suggested to join the two systems into one, such that each computer can be used for university tasks as well as external tasks. This means that we have a queueing system with two “shop assistants”. Use again the stationary probabilities of this system to compute
1) In both cases, N = 1.
For A we have the traffic intensity

ϱA = λA/(N μA) = 3/5,  thus  p0,A = 1 − ϱA = 2/5.

For B we have the traffic intensity

ϱB = λB/(N μB) = 9/10,  thus  p0,B = 1 − ϱB = 1/10.

These probabilities indicate the fraction of time in which the given computer is vacant.
2) Since N = 1, the respective average waiting times are

VA = ϱA/(μ (1 − ϱA)) = 3 · (3/5)/(1 − 3/5) = 9/2 minutes,

and

VB = ϱB/(μ (1 − ϱB)) = 3 · (9/10)/(1 − 9/10) = 27 minutes.
3) The sum of two Poisson processes is again a Poisson process, here with the parameter

λ = λA + λB = 1/5 + 3/10 = 1/2.
Example 4.7 Given a birth and death process of the states E0, E1, E2, . . . , where the birth intensity λk in state Ek decreases with increasing k as follows,

λk = α/(k + 1),

where α is a positive constant, while the death intensities μk are given by

μ0 = 0  and  μk = μ for k ∈ N, where μ > 0.
1) The system of differential equations for λk = α/(k + 1) and μ > 0 is given by

P0′(t) = −α P0(t) + μ P1(t),
Pk′(t) = −(α/(k + 1) + μ) Pk(t) + (α/k) Pk−1(t) + μ Pk+1(t),  k ∈ N.

For the stationary probabilities we get

−(α/(k + 1)) pk + μ pk+1 = −(α/k) pk−1 + μ pk = · · · = 0,  k ∈ N,

and hence

μ pk = (α/k) pk−1,  k ∈ N.
When this equation is multiplied by k! μ^{k−1}/α^k, it follows by a recursion that

(μ/α)^k k! pk = (μ/α)^{k−1} (k − 1)! pk−1 = · · · = (μ/α)^0 0! p0 = p0,

hence

pk = (1/k!) (α/μ)^k p0,  k ∈ N0.
It follows from

1 = Σ_{k=0}^∞ pk = p0 Σ_{k=0}^∞ (1/k!) (α/μ)^k = p0 exp(α/μ),

that

p0 = exp(−α/μ),

thus

pk = (1/k!) (α/μ)^k exp(−α/μ),  k ∈ N0,

i.e. a Poisson distribution of parameter α/μ.
2) Put α = μ. The probability that there are at most 3 customers in the system is

p0 + p1 + p2 + p3 = (1/e) (1 + 1/1! + 1/2! + 1/3!) = 16/(6e) ≈ 0.9810.

3) In this case we get by the recursion that

3^k pk = 3^{k−1} pk−1 = · · · = 3^0 p0 = p0,
hence

pk = (1/3^k) p0,  k ∈ N0.

It follows from

1 = Σ_{k=0}^∞ pk = p0 Σ_{k=0}^∞ (1/3)^k = p0 · 1/(1 − 1/3) = (3/2) p0,

that p0 = 2/3, and the probability that there are at most three customers in this system is

p0 + p1 + p2 + p3 = p0 (1 + 1/3 + 1/3² + 1/3³) = (2/3) · (27 + 9 + 3 + 1)/27 = 80/81 ≈ 0.9877.
There is a slightly higher probability in this case that there are at most three customers in this
system than in the system which was considered in 2..
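The two tail probabilities compared above are quick to reproduce numerically, as a sketch (Poisson with α/μ = 1 versus the geometric distribution pk = (2/3)(1/3)^k):

```python
from math import exp, factorial

# P{at most 3 customers} in the two systems of Example 4.7
poisson = sum(exp(-1) / factorial(k) for k in range(4))      # 16/(6e)
geometric = sum((2 / 3) * (1 / 3) ** k for k in range(4))    # 80/81

print(round(poisson, 4), round(geometric, 4))  # 0.981 0.9877
```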
Example 4.8 Given the following queueing model: M machines are working mutually independently of each other and they need no operation by men, except in the case when they break down. There are in total N service mechanics (where N < M) for making repairs. If a machine is working at time t, there is the probability λh + hε(h) that it breaks down before time t + h, and probability 1 − λh + hε(h) that it is still working. Analogously, if it is being repaired at time t, then there is the probability μh + hε(h) that it is working again before t + h, and probability 1 − μh + hε(h) that it is not working. When a machine breaks down, it is immediately repaired by a service mechanic, if one is vacant. Otherwise, the machine is waiting in a queue, until a service mechanic becomes vacant. We define the coefficient of loss of a machine as

(1/M) · average number of machines in the queue,

and the coefficient of loss of a service mechanic as

(1/N) · average number of vacant service mechanics.
Denote by Ek the state that k machines do not work, k = 0, 1, . . . , M.
1) Prove that the birth and death intensities are given by

λk = (M − k)λ, μk = kμ, for 0 ≤ k ≤ N,
λk = (M − k)λ, μk = Nμ, for N ≤ k ≤ M.

4) Find the probability that there are precisely 0, 1, 2, . . . , N vacant service mechanics.
5) Find the coefficients of loss of a machine and of a service mechanic in the case of

λ/μ = 0.1;  M = 6;  N = 1.
It should be mentioned for comparison that in the case when

λ/μ = 0.1;  M = 20;  N = 3,

the coefficient of loss of a machine is 0.0169 and the coefficient of loss of a service mechanic is 0.4042. Which one of the two systems is best?
1) Let 0 ≤ k ≤ M, and assume that we are in state Ek, thus k machines are being repaired or are waiting for repair, and M − k machines are working. The latter machines each have the probability

λh + hε(h)

of breaking down in the time interval ]t, t + h] of length h. Since M − k machines are working, we get

λk = (M − k)λ for 0 ≤ k ≤ M.
If we are in state Ek, where 0 ≤ k ≤ N, then all k machines are being repaired. Each of these has the probability

μh + hε(h)

of being repaired before time t + h, thus

μk = kμ, for 0 ≤ k ≤ N.
If instead N < k ≤ M , then all service mechanics are working, so
μk = N μ, for N < k ≤ M.
2) By a known formula,

μk+1 pk+1 = λk pk,

thus

pk+1 = (λk/μk+1) pk, for k = 0, 1, . . . , M − 1.
When we insert the results of 1., we get

pk+1 = ((M − k)λ/((k + 1)μ)) pk for k = 0, 1, . . . , N − 1,

pk+1 = ((M − k)λ/(Nμ)) pk for k = N, . . . , M − 1.

When the first equation is multiplied by (1/C(M, k+1)) (μ/λ)^{k+1}, we get

pk+1/(C(M, k+1) (λ/μ)^{k+1}) = pk/(C(M, k) (λ/μ)^k) = · · · = p0/(C(M, 0) (λ/μ)^0) = p0,

hence

pk = C(M, k) (λ/μ)^k p0 for k = 0, 1, . . . , N.
We put k = N + m, m = 0, 1, . . . , M − N − 1, into the second equation. Then

pN+m+1 = ((M − N − m)/N) · (λ/μ) · pN+m = (1/N^{m+1}) (λ/μ)^{m+1} (M − N − m) · · · (M − N) pN
       = (1/N^{m+1}) (λ/μ)^{m+1} ((M − N)!/(M − N − m − 1)!) pN,

hence

pN+m = (1/N^m) (λ/μ)^m ((M − N)!/(M − N − m)!) pN = (M!/(N!(M − N − m)!)) (1/N^m) (λ/μ)^{N+m} p0,

for m = 0, 1, . . . , M − N.
3) The average number of machines in the queue is

Σ_{k=N+1}^M (k − N) pk = Σ_{k=N}^M (k − N) pk.
4) If there are n ∈ {1, 2, . . . , N} vacant service mechanics, the system is in state EN−n, so the probability is

pN−n = C(M, N−n) (λ/μ)^{N−n} p0,  n = 1, 2, . . . , N.
5) If λ/μ = 1/10, M = 6 and N = 1, then the coefficient of loss of the machine is by 3. given by

(1/M) {M − (1 + μ/λ)(1 − p0)} = 1 − (1/6)(1 + 10)(1 − p0) = 1 − (11/6)(1 − p0).
It only remains to find p0. We get by the recursion formulae

p1 = (6/10) p0,  p2 = (5/10) p1,  p3 = (4/10) p2,
p4 = (3/10) p3,  p5 = (2/10) p4,  p6 = (1/10) p5,

hence

1 = Σ_{k=0}^6 pk = p0 (1 + (6/10)(1 + (5/10)(1 + (4/10)(1 + (3/10)(1 + (2/10)(1 + 1/10)))))) ≈ p0 · 2.0639,

so

p0 ≈ 0.4845.
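The whole machine-repair computation can be sketched generically; the function below builds the unnormalized weights from the recursion in 2. and returns both coefficients of loss, reproducing the values quoted above for λ/μ = 0.1, M = 20, N = 3:

```python
def coefficients(r, M, N):
    """Coefficients of loss of a machine and of a service mechanic
    for the machine-repair model with r = lambda/mu."""
    w = [1.0]
    for k in range(1, M + 1):
        # w_k = w_{k-1} * (M-k+1)*r / (k or N), depending on whether k <= N
        w.append(w[-1] * r * (M - k + 1) / (k if k <= N else N))
    S = sum(w)
    p = [x / S for x in w]
    machine = sum((k - N) * p[k] for k in range(N + 1, M + 1)) / M
    mechanic = sum((N - k) * p[k] for k in range(N)) / N
    return machine, mechanic

print(coefficients(0.1, 20, 3))  # approximately (0.0169, 0.4042), as quoted
print(coefficients(0.1, 6, 1))   # the case of question 5
```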
Example 4.9 In a shop the service time is exponentially distributed of mean 1/μ, thus the frequency is given by

f(x) = μ e^{−μx} for x > 0, and f(x) = 0 for x ≤ 0.
Let X1 , X2 , . . . denote the service times of customer number 1, 2, . . . . We assume that the X i are
mutually independent and that they all have the frequency f (x) above.
In total there arrive to the shop N customers, where N is a random variable, which is independent of
all the Xi , and N can have the values 1, 2, . . . , of the probabilities
P {N = k} = p q k−1 , k ∈ N,
2) Find the frequency and the distribution function of Y = Σ_{i=1}^N Xi by using that

P{Y ≤ x} = Σ_{k=1}^∞ P{N = k ∧ Yk ≤ x}.
1) Since Xi ∈ Γ(1, 1/μ), it follows that

Yn = Σ_{k=1}^n Xk ∈ Γ(n, 1/μ),
E{Y} = 1/(pμ)  and  V{Y} = 1/(p²μ²).
pμ p2 μ2
Example 4.10 An old-fashioned shop with one shop assistant to serve the customers can be considered as a queueing system of one channel with the possibility of forming a queue. The customers arrive according to a Poisson process of intensity λ, and the service time is exponentially distributed of parameter μ. It has been noticed that when the system is in its equilibrium, then the shop assistant is on average busy 3/4 of the time, and the average staying time of customers is 10 minutes.
1. Prove that 1/λ = 1/18 hour and 1/μ = 1/24 hour.
2. Find the probability that a customer is served immediately.
3. Find the average queue length.
The shop is closed at 17:30, and only the customers who are already in the shop are served by the shop assistant before he leaves for his home.
4. Find the probability that there are 0, 1, 2, . . . customers in the shop at 17:30.
5. Let the random variable T denote the time from 17:30 until the shop assistant has served all customers. Find the distribution of T.
The stationary probabilities satisfy

μ pk+1 = λ pk,  k ∈ N0.

2) A customer is served immediately if the system is in state E0. The probability of this event is

p0 = 1 − ϱ = 1 − 3/4 = 1/4.
5) Assume that there are k customers in the shop. Then the service time is Erlang distributed, Γ(k, 1/μ), of frequency

μ (μx)^{k−1}/(k − 1)! · e^{−μx},  x > 0, k ∈ N.

Then by an integration,

P{T ≤ x} = 1 − (3/4) exp(−(μ/4) x) for x ≥ 0, and P{T ≤ x} = 0 for x < 0.
Alternatively, the Laplace transform of T is

LT(λ) = P(L(λ)),

where

L(λ) = μ/(λ + μ)
and

P(s) = Σ_{k=0}^∞ pk s^k = Σ_{k=0}^∞ (1/4) ((3/4) s)^k = (1/4) · 1/(1 − (3/4)s) = 1/(4 − 3s).

Hence by insertion,

LT(λ) = 1/(4 − 3μ/(λ + μ)) = (λ + μ)/(4λ + μ) = (1/4) · 1 + (3/4) · (μ/4)/(λ + μ/4).
We recognize this Laplace transform as corresponding to

FT(x) = 1 − (3/4) exp(−(μ/4) x) for x ≥ 0, and FT(x) = 0 for x < 0.
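The Laplace-transform identity behind this conclusion can be checked numerically; the following sketch sums the series Σ_k pk (μ/(s+μ))^k directly and compares it with the closed form 1/4 + (3/4)·(μ/4)/(s + μ/4):

```python
mu = 24.0  # per hour, as found in question 1

def lt_series(s, terms=2000):
    # direct summation of the geometric mixture of Erlang transforms
    L = mu / (s + mu)
    return sum(0.25 * (0.75 * L) ** k for k in range(terms))

def lt_closed(s):
    # atom of weight 1/4 at T = 0, plus 3/4 of an Exp(mu/4) distribution
    return 0.25 + 0.75 * (mu / 4) / (s + mu / 4)

for s in (0.1, 1.0, 10.0):
    assert abs(lt_series(s) - lt_closed(s)) < 1e-12
print("Laplace transforms agree")
```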
λk = 2, μ1 = 1, μ2 = 2 and μk = 3 for k ≥ 3,

and

p1 = 2p0,  p2 = 2p0,  and  pk = (2/3)^k · (3³/3!) p0 = 2 (2/3)^{k−2} p0 for k ≥ 3.

The sum is

1 = Σ_{k=0}^∞ pk = p0 (1 + 2 + 2 + 2 Σ_{k=3}^∞ (2/3)^{k−2}) = p0 (3 + 2 Σ_{k=2}^∞ (2/3)^{k−2}) = 9 p0,

from which p0 = 1/9. The waiting time is obtained by insertion,

V = p0 (Nϱ)^N / (N · N! · μ (1 − ϱ)²) = (1/9) · 2³ / (3 · 3! · 1 · (1/3)²) = (8/9)/2 = 4/9.
2. Find the fraction of time, in which all three channels are busy.
3. Compute the average length of the queue.
Decrease the number of channels to two while the other assumptions are unchanged. Compute in this
system,
4. the stationary probabilities,
5. the fraction of time, in which both channels are busy,
6. the average length of the queue.
Finally, decrease the number of channels to one, while the other assumptions are unchanged.
7. How will this system function?
hence

1 = Σ_{k=0}^∞ pk = p0 {1 + 1 + 1/2 + (1/6) Σ_{k=3}^∞ (1/3)^{k−3}} = p0 {5/2 + (1/6) · 1/(1 − 1/3)} = p0 {5/2 + 1/4} = (11/4) p0,

from which p0 = 4/11, thus

pk = (4/11) · (1/k!) for k = 0, 1, 2,  and  pk = (2/33) (1/3)^{k−3} for k ≥ 3.
2) The fraction of time in which all three channels are busy is given by

Σ_{k=3}^∞ pk = (2/33) Σ_{k=3}^∞ (1/3)^{k−3} = (2/33) · 1/(1 − 1/3) = (2/33) · (3/2) = 1/11.

Alternatively, it is given by

1 − p0 − p1 − p2 = 1 − 4/11 − (4/11) · (1/1!) − (4/11) · (1/2!) = 1/11.
4) If N = 2, then ϱ = 1/2. The stationary probabilities are

p0 = (1 − ϱ)/(1 + ϱ) = 1/3,

and

pk = 2ϱ^k · (1 − ϱ)/(1 + ϱ) = (1/3) · (1/2)^{k−1},  k ∈ N.
7) If there is only one channel, the traffic intensity becomes ϱ = 1, and the queue increases indefinitely.
Example 4.13 A shop serves M customers, and there is one shop assistant in the shop. It is possible to form a queue. We assume that the service time is exponentially distributed of mean 1/μ. Assume also that if a customer is not in the shop at time t, then there is the probability λh + hε(h) [where λ is a positive constant] that this customer arrives to the shop before the time t + h. Finally, assume that the customers arrive to the shop mutually independently of each other. Thus we have a birth and death process {X(t), t ∈ [0, ∞[} of the states E0, E1, . . . , EM, where Ek denotes the state that there are k customers in the shop, k = 0, 1, 2, . . . , M.
1) Prove that the birth intensities λk and death intensities μk, k = 0, 1, 2, . . . , M, are given by

λk = (M − k)λ,  and  μ0 = 0, μk = μ for k = 1, 2, . . . , M.
1) If we are in state Ek, then M − k of the customers are not in the shop. Each of them arrives to the shop before time t + h with probability λh + hε(h), so the probability of precisely one arrival in a time interval of length h is

(M − k)λh + hε(h)

(we divide by h before we go to the limit h → 0). Hence, the birth intensity is

λk = (M − k)λ, k = 0, 1, . . . , M.
If we are in state E0 , then no customer is served, so μ0 = 0.
In any other state precisely one customer is served with the intensity μ, so
μk = μ, k = 1, 2, . . . , M.
μk+1 pk+1 = λk pk .
3) We get successively

p0 = p0,  p1 = M (λ/μ) p0,  p2 = M(M − 1) (λ/μ)² p0,

and in general

pk = (M!/(M − k)!) (λ/μ)^k p0,  k = 0, 1, 2, . . . , M.
4) It follows from Σ_{k=0}^M pk = 1 that

p0 = 1 / Σ_{k=0}^M (M!/(M − k)!) (λ/μ)^k = (μ/λ)^M / (M! Σ_{k=0}^M (1/k!) (μ/λ)^k),

and hence

p = p0 (1, M(λ/μ), M(M − 1)(λ/μ)², . . . , (M!/(M − k)!)(λ/μ)^k, . . . , M!(λ/μ)^M)

  = (1 / Σ_{k=0}^M (1/k!)(μ/λ)^k) ((μ/λ)^M/M!, (μ/λ)^{M−1}/(M − 1)!, . . . , (μ/λ)^{M−k}/(M − k)!, . . . , 1).
5) The average number of customers who are not in the shop is by e.g. 3.,

Σ_{k=0}^{M−1} (M − k) pk = Σ_{k=0}^{M−1} (M!/(M − k − 1)!) (λ/μ)^k p0 = (μ/λ) Σ_{k=1}^M (M!/(M − k)!) (λ/μ)^k p0
                        = (μ/λ) Σ_{k=1}^M pk = (μ/λ)(1 − p0).
6) If λ/μ = 1 and M = 5, then

1 = Σ_{k=0}^5 (5!/(5 − k)!) p0 = {1 + 5 + 20 + 60 + 120 + 120} p0 = 326 p0,

and

p = (1/326)(1, 5, 20, 60, 120, 120).
7) When λ/μ = 1/2 and M = 5, we get

1 = Σ_{k=0}^5 (5!/(5 − k)!) (1/2)^k p0 = {1 + 5/2 + 5 + 15/2 + 15/2 + 15/4} p0 = (109/4) p0,

and

p = (4/109)(1, 5/2, 5, 15/2, 15/2, 15/4) = (1/109)(4, 10, 20, 30, 30, 15).
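Both vectors in questions 6 and 7 follow from the single recursion pk+1 = (M − k)(λ/μ) pk; a sketch of this computation with exact fractions:

```python
from fractions import Fraction

def stationary(M, r):
    """Stationary distribution of the finite-source single-server queue
    of Example 4.13, with r = lambda/mu: p_k proportional to M!/(M-k)! * r^k."""
    w, cur = [], Fraction(1)
    for k in range(M + 1):
        w.append(cur)
        cur *= (M - k) * r     # w_{k+1} = (M-k) * r * w_k
    S = sum(w)
    return [x / S for x in w]

p = stationary(5, Fraction(1))
assert p[0] == Fraction(1, 326) and p[3] == Fraction(60, 326)
q = stationary(5, Fraction(1, 2))
assert q == [Fraction(n, 109) for n in (4, 10, 20, 30, 30, 15)]
print("matches the vectors found above")
```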
Stochastic Processes 2 4. Queueing theory
Example 4.14 Given two queueing systems, A and B, which are mutually independent. We assume for each of the two systems:
a. there is one channel,
b. it is possible to form a queue,
c. the customers arrive according to a Poisson process of intensity λ,
d. the service times are exponentially distributed of parameter μ,
e. the traffic intensity is ϱ = λ/μ = 1/2.
Denote by X1 the random variable which indicates the number of customers in system A, and by X2 the random variable which indicates the number of customers in system B.
1. Compute by using the stationary probabilities,
1) The two queueing systems follow the same distribution, and N = 1 and ϱ = 1/2, so we get by a known formula,

P{X1 = k} = P{X2 = k} = pk = ϱ^k (1 − ϱ) = (1/2)^{k+1},  k ∈ N0.

2) If Z = X1 + X2, then

P{Z = k} = Σ_{j=0}^k P{X1 = j} · P{X2 = k − j} = Σ_{j=0}^k (1/2)^{j+1} · (1/2)^{k−j+1} = (k + 1) · (1/2)^{k+2},  k ∈ N0.
2
3) It follows from

E{X1} = E{X2} = Σ_{k=1}^∞ k (1/2)^{k+1} = (1/4) Σ_{k=1}^∞ k (1/2)^{k−1} = (1/4) · 1/(1 − 1/2)² = 1,

that

E{Z} = Σ_{k=1}^∞ k(k + 1) (1/2)^{k+2} = Σ_{k=2}^∞ k(k − 1) (1/2)^{k+1} = (1/8) Σ_{k=2}^∞ k(k − 1) (1/2)^{k−2}
     = (1/8) · 2!/(1 − 1/2)³ = 2.
1
4) Roughly speaking, A and B are joined to get C, so we have N = 2 and = . Then it follows that
2
1− 1
P {Y = 0} = p0 = = ,
1+ 3
and
k−1
1− 1 1
P {Y = k} = 2k · = , k ∈ N.
1+ 3 2
Thus
∞
j−1 k k−1
1 1 1 1 1 1 1
P {Y > k} = = · · = · , k ∈ N0 .
3 2 3 2 1 3 2
j=k+1 1−
2
5) The mean is
∞
k−1
1 1 1 1 4
E{Y } = k = · 2 = .
3 2 3 1 3
k=1 1−
2
We notice that P {Y = k} = P {Y > k} for k ∈ N, and that this is not true for k = 0.
Example 4.15 Given two mutually independent queueing systems A and B. We assume for each of the two systems,
a. there is one channel,
b. it is possible to form a queue,
c. customers arrive to A according to a Poisson process of intensity λA = 1/3 minute⁻¹, and they arrive to B according to a Poisson process of intensity λB = 2/3 minute⁻¹,
d. the service times of both A and B are exponentially distributed of the parameter μ = 1 minute⁻¹.
Let the random variable XA denote the number of customers in system A, and let the random variable XB denote the number of customers in system B. Furthermore, we let YA and YB, resp., denote the number of customers in the queue at A and B, resp.
1. Find by using the stationary probabilities,

P{XA = k} and P{XB = k}, k ∈ N0.
1A. Since λA = 1/3 minute⁻¹ and μ = 1 minute⁻¹, and N = 1, we get the traffic intensity ϱA = 1/3. The stationary probabilities are

P{XA = k} = pA,k = 2 · (1/3)^{k+1},  k ∈ N0.

1B. Analogously, λB = 2/3 minute⁻¹ and μ = 1 minute⁻¹, and N = 1, so ϱB = 2/3, and

P{XB = k} = pB,k = (1/3) (2/3)^k = (1/2) (2/3)^{k+1},  k ∈ N0.
3A. Assume that there is no queue at A. Then either there is no customer at all in the system, or there is precisely one customer, who is being served,

P{YA = 0} = P{XA = 0} + P{XA = 1} = 2/3 + 2/9 = 8/9.

If k ∈ N, then

P{YA = k} = P{XA = k + 1} = 2 · (1/3)^{k+2}.

3B. Analogously,

P{YB = 0} = P{XB = 0} + P{XB = 1} = (1/3)(1 + 2/3) = 5/9

and

P{YB = k} = P{XB = k + 1} = (1/2) (2/3)^{k+2},  k ∈ N.
4. It follows from
$$E\{X_A\} = 2 \sum_{k=1}^{\infty} k \left(\frac{1}{3}\right)^{k+1} = \frac{2}{9} \sum_{k=1}^{\infty} k \left(\frac{1}{3}\right)^{k-1} = \frac{2}{9} \cdot \frac{1}{\left(1-\frac{1}{3}\right)^{2}} = \frac{1}{2}$$
and
$$E\{X_B\} = \sum_{k=1}^{\infty} k \cdot \frac{2}{9} \left(\frac{2}{3}\right)^{k-1} = \frac{2}{9} \cdot \frac{1}{\left(1-\frac{2}{3}\right)^{2}} = 2,$$
that
$$E\{X_A + X_B\} = \frac{1}{2} + 2 = \frac{5}{2}.$$
It follows from
$$E\{Y_A\} = 2 \sum_{k=1}^{\infty} k \left(\frac{1}{3}\right)^{k+2} = \frac{2}{27} \sum_{k=1}^{\infty} k \left(\frac{1}{3}\right)^{k-1} = \frac{2}{27} \cdot \frac{1}{\left(1-\frac{1}{3}\right)^{2}} = \frac{1}{6}$$
and
$$E\{Y_B\} = \sum_{k=1}^{\infty} k \cdot \frac{1}{2} \left(\frac{2}{3}\right)^{k+2} = \frac{4}{27} \sum_{k=1}^{\infty} k \left(\frac{2}{3}\right)^{k-1} = \frac{4}{27} \cdot \frac{1}{\left(1-\frac{2}{3}\right)^{2}} = \frac{4}{3},$$
then
$$E\{Y_A + Y_B\} = \frac{1}{6} + \frac{4}{3} = \frac{3}{2}.$$
5. If $k \in \mathbb{N}_0$, then
$$P\{X_A + X_B = k\} = \sum_{j=0}^{k} P\{X_A = j\} \cdot P\{X_B = k-j\} = \sum_{j=0}^{k} 2 \left(\frac{1}{3}\right)^{j+1} \cdot \frac{1}{2} \left(\frac{2}{3}\right)^{k-j+1} = \left(\frac{1}{3}\right)^{k+2} \sum_{j=0}^{k} 2^{k-j+1}$$
$$= \left(\frac{1}{3}\right)^{k+2} \sum_{n=1}^{k+1} 2^{n} = \left(2^{k+2} - 2\right) \left(\frac{1}{3}\right)^{k+2} = \left(\frac{2}{3}\right)^{k+2} - 2 \left(\frac{1}{3}\right)^{k+2} = \frac{2}{3^{k+2}} \left(2^{k+1} - 1\right).$$
7. By a straightforward computation,
$$E\{X\} = \sum_{k=1}^{\infty} k \cdot \frac{2}{3} \left(\frac{1}{2}\right)^{k} = \frac{1}{3} \sum_{k=1}^{\infty} k \left(\frac{1}{2}\right)^{k-1} = \frac{1}{3} \cdot \frac{1}{\left(1-\frac{1}{2}\right)^{2}} = \frac{4}{3}$$
and
$$E\{Y\} = \sum_{k=1}^{\infty} k \cdot \frac{1}{6} \left(\frac{1}{2}\right)^{k} = \frac{1}{4} E\{X\} = \frac{1}{3}.$$
k=1
Example 4.16 Consider a birth and death process $E_0, E_1, E_2, \ldots$, where the birth intensities $\lambda_k$ are given by
$$\lambda_k = \frac{\alpha}{k+1}, \qquad k \in \mathbb{N}_0,$$
where $\alpha$ is a positive constant, while the death intensities $\mu_k$ are given by
$$\mu_k = \begin{cases} 0, & k = 0, \\ \mu, & k = 1, \\ 2\mu, & k \geq 2, \end{cases}$$
where $\mu > 0$. We assume that $\frac{\alpha}{\mu} = 8$.
1. Find the equations of the stationary probabilities pk , k ∈ N0 .
2. Prove that
$$p_k = 2 \cdot \frac{4^{k}}{k!} \, p_0, \qquad k \in \mathbb{N},$$
and find $p_0$.
The above can be viewed as a model of the forming of a queue in a shop, where
a. there are two shop assistants,
b. the service time is exponentially distributed of mean $\frac{1}{\mu}$,
c. the frequency of the arrivals is decreasing with increasing number of customers according to the
indicated formula.
3. Compute by means of the stationary probabilities the average number of customers in the shop (3 dec.).
4. Compute by means of the stationary probabilities the average number of busy shop assistants (3 dec.).
5. Compute by means of the stationary probabilities the probability that there are more than two customers in the shop (3 dec.).
1) We have
$$\mu_{k+1} p_{k+1} = \lambda_k p_k, \qquad k \in \mathbb{N}_0,$$
thus
$$p_1 = \frac{\lambda_0}{\mu_1} p_0 = \frac{\alpha}{\mu} p_0 = 8 p_0$$
and
$$p_k = \frac{\lambda_{k-1}}{\mu_k} p_{k-1} = \frac{\alpha}{k} \cdot \frac{1}{2\mu} \, p_{k-1} = \frac{4}{k} \, p_{k-1} \qquad \text{for } k \geq 2.$$
2) If k = 1, then
$$p_1 = 8 p_0 = 2 \cdot \frac{4^{1}}{1!} \, p_0,$$
and the formula is true for k = 1. Then assume that
$$p_{k-1} = 2 \cdot \frac{4^{k-1}}{(k-1)!} \, p_0.$$
Then
$$p_k = \frac{4}{k} \, p_{k-1} = 2 \cdot \frac{4^{k}}{k!} \, p_0,$$
and the formula follows by induction.
It follows from
$$1 = \sum_{k=0}^{\infty} p_k = p_0 \left(1 + 2 \sum_{k=1}^{\infty} \frac{4^{k}}{k!}\right) = p_0 \left(2 e^{4} - 1\right)$$
that
$$p_0 = \frac{1}{2 e^{4} - 1}.$$
3) The task is now changed to queueing theory. Since $p_k$ is the probability that there are k customers in the shop, the mean of the number of customers in the shop is
$$\sum_{k=1}^{\infty} k p_k = 2 \cdot 4 \, p_0 \sum_{k=1}^{\infty} \frac{4^{k-1}}{(k-1)!} = \frac{8 e^{4}}{2 e^{4} - 1} \approx 4.037.$$
5) The probability that there are more than two customers in the shop is
$$\sum_{k=3}^{\infty} p_k = 1 - p_0 - p_1 - p_2 = 1 - p_0 \left(1 + 8 + \frac{32}{2}\right) = 1 - \frac{25}{2 e^{4} - 1} \approx 0.769.$$
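The numbers of Example 4.16 are easily verified; a short Python sketch (our own truncation of the series at 80 terms):

```python
import math

# Example 4.16: p_k = 2 * 4^k / k! * p0 with p0 = 1/(2 e^4 - 1).
p0 = 1 / (2 * math.exp(4) - 1)
p = [p0] + [2 * 4**k / math.factorial(k) * p0 for k in range(1, 80)]
assert abs(sum(p) - 1) < 1e-12

# detailed balance mu_k p_k = lambda_{k-1} p_{k-1} with alpha = 8, mu = 1
for k in range(1, 60):
    mu_k = 1 if k == 1 else 2
    assert abs(mu_k * p[k] - (8 / k) * p[k - 1]) < 1e-12

mean = sum(k * pk for k, pk in enumerate(p))
assert round(mean, 3) == 4.037                 # average number of customers

more_than_two = 1 - p[0] - p[1] - p[2]
assert round(more_than_two, 3) == 0.769        # P{more than two customers}
```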
Example 4.17 Consider a birth and death process of the states $E_0, E_1, E_2, \ldots$, where the birth intensities $\lambda_k$ are given by
$$\lambda_k = \begin{cases} 2\lambda, & k = 0, \\ \lambda, & k \in \mathbb{N}, \end{cases}$$
and the death intensities by $\mu_0 = 0$ and $\mu_k = \mu$ for $k \in \mathbb{N}$. Here, $\lambda$ and $\mu$ are positive constants, and we assume everywhere that $\frac{\lambda}{\mu} = \frac{3}{4}$.
1. Find the equations of the stationary probabilities, and prove that the stationary probabilities are given by
$$p_k = 2 \left(\frac{3}{4}\right)^{k} p_0, \qquad k = 1, 2, 3, \ldots,$$
and finally, find $p_0$.
The above can be considered as a model of forming queues in a shop, where
a. there is one shop assistant,
b. the service time is exponentially distributed of mean $\frac{1}{\mu}$,
c. the customers arrive according to a Poisson process of intensity 2λ. However, if there already are
customers in the shop, then half of the arriving customers will immediately leave the shop without
being served.
2. Compute by means of the stationary probabilities the average number of customers in the shop.
3. Compute by means of the stationary probabilities the average number of customers in the queue.
We now assume that instead of one shop assistant there are two shop assistants and that all arriving
customers are served (thus we have the birth intensities λk = 2λ, k ∈ N0 ).
4. Compute in this queueing system the stationary probabilities and then find the average number of
customers in the queue.
1) We have
$$\mu_{k+1} p_{k+1} = \lambda_k p_k, \qquad k \in \mathbb{N}_0,$$
thus
$$p_1 = \frac{2\lambda}{\mu} p_0 = \frac{3}{2} p_0 = 2 \cdot \frac{3}{4} p_0,$$
and
$$p_k = \frac{\lambda}{\mu} p_{k-1} = \frac{3}{4} p_{k-1}, \qquad k \geq 2,$$
hence by recursion,
$$p_k = \left(\frac{3}{4}\right)^{k-1} p_1 = 2 \left(\frac{3}{4}\right)^{k} p_0, \qquad k \geq 2.$$
We get
$$1 = \sum_{k=0}^{\infty} p_k = p_0 + 2 p_0 \sum_{k=1}^{\infty} \left(\frac{3}{4}\right)^{k} = p_0 \left(1 + 2 \cdot \frac{3}{4} \cdot \frac{1}{1 - \frac{3}{4}}\right) = p_0 \left(1 + \frac{3}{2} \cdot 4\right) = 7 p_0,$$
so
$$p_0 = \frac{1}{7} \qquad \text{and} \qquad p_k = \frac{2}{7} \left(\frac{3}{4}\right)^{k}, \qquad k \in \mathbb{N}.$$
2) Since $p_k$ is the probability that there are k customers in the shop, the average number of customers in the shop is
$$\sum_{k=1}^{\infty} k p_k = \frac{2}{7} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k} = \frac{3}{14} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k-1} = \frac{3}{14} \cdot \frac{1}{\left(1-\frac{3}{4}\right)^{2}} = \frac{3}{14} \cdot 16 = \frac{24}{7}.$$
3) If there are k customers in the queue, there must also be 1 customer, who is being served, so the average is
$$\sum_{k=1}^{\infty} k p_{k+1} = \frac{2}{7} \cdot \frac{3}{4} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k} = \frac{3}{4} \cdot \frac{24}{7} = \frac{18}{7}.$$
4) In the new system it follows from $\mu_{k+1} p_{k+1} = \lambda_k p_k$ that $p_1 = \frac{2\lambda}{\mu} p_0 = \frac{3}{2} p_0$ and $p_k = \frac{2\lambda}{2\mu} p_{k-1} = \frac{3}{4} p_{k-1}$ for $k \geq 2$. We see that these are identical with the stationary probabilities found in 1). The average length of the queue is given by (and here we get the divergence from the previous case)
$$\sum_{k=3}^{\infty} (k-2) p_k = \frac{2}{7} \sum_{k=3}^{\infty} (k-2) \left(\frac{3}{4}\right)^{k} = \frac{2}{7} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k+2} = \frac{2}{7} \left(\frac{3}{4}\right)^{3} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k-1}$$
$$= \frac{2}{7} \cdot \left(\frac{3}{4}\right)^{3} \cdot \frac{1}{\left(1-\frac{3}{4}\right)^{2}} = \frac{2}{7} \cdot \frac{27}{4} = \frac{27}{14}.$$
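All three averages of Example 4.17 can be checked with a small Python sketch (truncation at K = 400 terms is our own choice):

```python
# Example 4.17: p0 = 1/7, p_k = (2/7)(3/4)^k.
K = 400
p = [1/7] + [(2/7) * (3/4)**k for k in range(1, K)]
assert abs(sum(p) - 1) < 1e-12

mean = sum(k * pk for k, pk in enumerate(p))
assert abs(mean - 24/7) < 1e-10                # customers in the shop

queue_one = sum(k * p[k + 1] for k in range(1, K - 1))
assert abs(queue_one - 18/7) < 1e-10           # queue, one assistant

queue_two = sum((k - 2) * p[k] for k in range(3, K))
assert abs(queue_two - 27/14) < 1e-10          # queue, two assistants
```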
Example 4.18 Consider a birth and death process of states $E_0, E_1, E_2, \ldots$, and with birth intensities $\lambda_k$ given by
$$\lambda_k = \begin{cases} \alpha, & k = 0, 1, \\[1ex] \dfrac{\alpha}{k}, & k \geq 2, \end{cases}$$
where $\alpha$ is a positive constant, and where the death intensities are given by
$$\mu_k = \begin{cases} 0, & k = 0, \\ \mu, & k \in \mathbb{N}, \end{cases}$$
where $\mu > 0$. We assume in the following that $\frac{\alpha}{\mu} = 2$.
1. Find the equations of the stationary probabilities $p_k$, $k \in \mathbb{N}_0$.
2. Prove that
$$p_k = \frac{2^{k}}{(k-1)!} \, p_0, \qquad k \in \mathbb{N},$$
and find $p_0$.
The above can be considered as a model of forming a queue in a shop where
a. there is one shop assistant,
b. the service time is exponentially distributed of mean $\frac{1}{\mu}$,
c. the frequency of arrivals decreases with increasing number of customers according to the formula for $\lambda_k$ above.
3. Compute by means of the stationary probabilities the average length of the queue (3 dec.).
4. Compute by means of the stationary probabilities the average number of customers in the shop (3 dec.).
1) We have
$$\mu_{k+1} p_{k+1} = \lambda_k p_k, \quad k \in \mathbb{N}_0, \qquad \text{and} \qquad \sum_{k=0}^{\infty} p_k = 1.$$
Hence, successively,
$$\mu p_1 = \alpha p_0, \qquad \mu p_2 = \alpha p_1, \qquad \text{and} \qquad \mu p_k = \frac{\alpha}{k-1} \, p_{k-1} \quad \text{for } k \geq 3.$$
It follows from $\frac{\alpha}{\mu} = 2$ that
$$(6) \qquad p_1 = 2 p_0, \qquad p_2 = 2 p_1, \qquad p_k = \frac{2}{k-1} \, p_{k-1}, \quad k \geq 3, \qquad \text{and} \qquad \sum_{k=0}^{\infty} p_k = 1.$$
2) We infer from (6) that $p_1 = 2 p_0$ and $p_2 = 2 p_1 = 4 p_0$, and for $k \geq 3$,
$$p_k = \frac{2}{k-1} \, p_{k-1} = \frac{2^{2}}{(k-1)(k-2)} \, p_{k-2} = \cdots = \frac{2^{k-2}}{(k-1)!} \, p_2 = \frac{2^{k-2}}{(k-1)!} \cdot 2^{2} \, p_0.$$
A check shows that the latter formula is also true for k = 1 and k = 2, thus
$$p_k = \frac{2^{k}}{(k-1)!} \, p_0, \qquad k \in \mathbb{N}.$$
It follows from
$$1 = p_0 \left(1 + \sum_{k=1}^{\infty} \frac{2^{k}}{(k-1)!}\right) = p_0 \left(1 + 2 \sum_{k=1}^{\infty} \frac{2^{k-1}}{(k-1)!}\right) = p_0 \left(1 + 2 e^{2}\right)$$
that
$$p_0 = \frac{1}{1 + 2 e^{2}} \; (\approx 0.0634), \qquad \text{and} \qquad p_k = \frac{2^{k}}{(k-1)!} \cdot \frac{1}{1 + 2 e^{2}}, \qquad k \in \mathbb{N}.$$
3) The average length of the queue is (notice that since 1 customer is being served, we here have $k-1$ instead of $k$)
$$\sum_{k=2}^{\infty} (k-1) p_k = \sum_{k=2}^{\infty} \frac{2^{k}}{(k-2)!} \, p_0 = 4 p_0 \sum_{k=2}^{\infty} \frac{2^{k-2}}{(k-2)!} = 4 e^{2} p_0 = \frac{4 e^{2}}{1 + 2 e^{2}} \approx 1.873.$$
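A numerical confirmation of $p_0$ and the queue length, as a small Python sketch (truncation at 80 terms is ours):

```python
import math

# Example 4.18: p_k = 2^k/(k-1)! * p0 with p0 = 1/(1 + 2 e^2).
p0 = 1 / (1 + 2 * math.exp(2))
p = [p0] + [2**k / math.factorial(k - 1) * p0 for k in range(1, 80)]

assert abs(sum(p) - 1) < 1e-12
assert round(p0, 4) == 0.0634

queue = sum((k - 1) * p[k] for k in range(2, 80))
assert round(queue, 3) == 1.873                # average queue length
```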
P {X = k} and P {Y = k}, k ∈ N0 .
and
$$E\{Y\} = \sum_{k=1}^{\infty} k \cdot \frac{1}{3} \left(\frac{2}{3}\right)^{k+1} = \frac{2}{3} E\{X\} = \frac{4}{3}.$$
μk+1 pk+1 = λk pk .
It follows from
$$1 = \sum_{k=0}^{\infty} p_k = p_0 \left\{1 + \frac{2}{3} + \frac{4}{9} \sum_{j=0}^{\infty} \left(\frac{1}{3}\right)^{j}\right\} = p_0 \left\{\frac{5}{3} + \frac{4}{9} \cdot \frac{1}{1 - \frac{1}{3}}\right\} = p_0 \left\{\frac{5}{3} + \frac{4}{9} \cdot \frac{3}{2}\right\} = p_0 \left\{\frac{5}{3} + \frac{2}{3}\right\} = \frac{7}{3} p_0,$$
that
$$p_0 = \frac{3}{7}, \qquad p_1 = \frac{2}{7}, \qquad p_2 = \frac{4}{21},$$
and
$$p_k = \frac{4}{7} \left(\frac{1}{3}\right)^{k-1}, \qquad k \geq 3.$$
The average number of customers is then
$$\sum_{k=1}^{\infty} k p_k = \frac{2}{7} + \frac{8}{21} + \frac{4}{7} \sum_{k=3}^{\infty} k \left(\frac{1}{3}\right)^{k-1} = \frac{6+8}{21} + \frac{4}{7} \left\{\sum_{k=1}^{\infty} k \left(\frac{1}{3}\right)^{k-1} - 1 - \frac{2}{3}\right\}$$
$$= \frac{2}{3} + \frac{4}{7} \left\{\frac{1}{\left(1-\frac{1}{3}\right)^{2}} - \frac{5}{3}\right\} = \frac{2}{3} + \frac{4}{7} \left\{\frac{9}{4} - \frac{5}{3}\right\} = \frac{2}{3} + \frac{4}{7} \cdot \frac{27 - 20}{4 \cdot 3} = \frac{2}{3} + \frac{1}{3} = 1.$$
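The distribution and the mean found above can be confirmed numerically with a short Python sketch (geometric tail truncated at K terms):

```python
# Stationary distribution p0 = 3/7, p1 = 2/7, p2 = 4/21,
# p_k = (4/7)(1/3)^(k-1) for k >= 3, and mean number of customers 1.
K = 200
p = [3/7, 2/7, 4/21] + [(4/7) * (1/3)**(k - 1) for k in range(3, K)]

assert abs(sum(p) - 1) < 1e-12
mean = sum(k * pk for k, pk in enumerate(p))
assert abs(mean - 1) < 1e-12
```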
P {Y = k}, k = 0, 1, 2, 3, 4, (3 dec.).
Now the intensity of arrivals λ is doubled, while the other assumptions are the same as above. This will
imply that the probability of rejection becomes too big, so one decides to hire another shop assistant.
Then the system can be described by a birth and death process with states $E_0, E_1, E_2, E_3, E_4, E_5$ (where $E_5$ corresponds to 2 customers being served and 3 waiting).
5. Find the equations of this system of the stationary probabilities p0 , p1 , p2 , p3 , p4 , p5 .
1) We have
$$P\{X = k\} = p_k = (1 - \varrho) \varrho^{k} = \frac{1}{5} \left(\frac{4}{5}\right)^{k}, \qquad k \in \mathbb{N}_0,$$
hence
$$P\{X > k\} = \sum_{j=k+1}^{\infty} \frac{1}{5} \left(\frac{4}{5}\right)^{j} = \frac{1}{5} \left(\frac{4}{5}\right)^{k+1} \cdot \frac{1}{1 - \frac{4}{5}} = \left(\frac{4}{5}\right)^{k+1}, \qquad k \in \mathbb{N}_0.$$
2) The mean is
$$E\{X\} = \frac{1}{5} \sum_{k=1}^{\infty} k \left(\frac{4}{5}\right)^{k} = \frac{4}{25} \sum_{k=1}^{\infty} k \left(\frac{4}{5}\right)^{k-1} = \frac{4}{25} \cdot \frac{1}{\left(1 - \frac{4}{5}\right)^{2}} = 4.$$
3) It follows from
$$\mu_{k+1} p_{k+1} = \lambda_k p_k$$
that
$$p_1 = \frac{\lambda}{\mu} p_0 = \frac{4}{5} p_0, \qquad p_2 = \left(\frac{4}{5}\right)^{2} p_0, \qquad p_3 = \left(\frac{4}{5}\right)^{3} p_0, \qquad p_4 = \left(\frac{4}{5}\right)^{4} p_0,$$
hence
$$1 = p_0 \left(1 + \frac{4}{5} + \left(\frac{4}{5}\right)^{2} + \left(\frac{4}{5}\right)^{3} + \left(\frac{4}{5}\right)^{4}\right) = p_0 \cdot \frac{1 - \left(\frac{4}{5}\right)^{5}}{1 - \frac{4}{5}} = p_0 \left(5 - 4 \left(\frac{4}{5}\right)^{4}\right),$$
and
$$P\{Y = 0\} = p_0 = \frac{1}{5 - 4 \left(\frac{4}{5}\right)^{4}} \approx 0.297,$$
$$P\{Y = 1\} = p_1 = \frac{4}{5} p_0 \approx 0.238, \qquad P\{Y = 2\} = p_2 = \frac{4}{5} p_1 \approx 0.190,$$
$$P\{Y = 3\} = p_3 = \frac{4}{5} p_2 \approx 0.152, \qquad P\{Y = 4\} = p_4 = \frac{4}{5} p_3 \approx 0.122.$$
4) The mean is
$$E\{Y\} = 1 \cdot p_1 + 2 p_2 + 3 p_3 + 4 p_4 = \left\{\frac{4}{5} + 2 \left(\frac{4}{5}\right)^{2} + 3 \left(\frac{4}{5}\right)^{3} + 4 \left(\frac{4}{5}\right)^{4}\right\} p_0 \approx 1.563.$$
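The finite-capacity distribution and its mean are quickly verified in Python (a sketch with our own variable names):

```python
# Finite system with capacity 4 and rho = 4/5:
# p_k = rho^k p0, p0 = 1/(5 - 4 rho^4).
rho = 4/5
p0 = 1 / (5 - 4 * rho**4)
p = [rho**k * p0 for k in range(5)]

assert abs(sum(p) - 1) < 1e-12
assert [round(x, 3) for x in p] == [0.297, 0.238, 0.190, 0.152, 0.122]

mean = sum(k * pk for k, pk in enumerate(p))
assert round(mean, 3) == 1.563
```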
5) It follows from
$$\mu_{k+1} p_{k+1} = \lambda_k p_k$$
that
$$p_1 = \frac{2\lambda}{\mu} p_0 = \frac{8}{5} p_0,$$
and
$$p_k = \frac{2\lambda}{2\mu} p_{k-1} = \frac{4}{5} p_{k-1} \qquad \text{for } k = 2, 3, 4, 5.$$
6) Now
$$p_k = 2 \left(\frac{4}{5}\right)^{k} p_0 \qquad \text{for } k = 1, 2, 3, 4, 5,$$
thus
$$1 = p_0 \left\{1 + \frac{8}{5} \left(1 + \frac{4}{5} + \left(\frac{4}{5}\right)^{2} + \left(\frac{4}{5}\right)^{3} + \left(\frac{4}{5}\right)^{4}\right)\right\} = p_0 \left\{1 + \frac{8}{5} \cdot \frac{1 - \left(\frac{4}{5}\right)^{5}}{1 - \frac{4}{5}}\right\} = p_0 \left(9 - 8 \left(\frac{4}{5}\right)^{5}\right),$$
and hence
$$p_0 = \frac{1}{9 - 8 \left(\frac{4}{5}\right)^{5}} \approx 0.157,$$
$$p_1 = 2 \cdot \frac{4}{5} p_0 \approx 0.251, \qquad p_2 = \frac{4}{5} p_1 \approx 0.201, \qquad p_3 = \frac{4}{5} p_2 \approx 0.161,$$
$$p_4 = \frac{4}{5} p_3 \approx 0.128, \qquad p_5 = \frac{4}{5} p_4 \approx 0.103.$$
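The six stationary probabilities of the two-assistant system can be checked with a few lines of Python (a sketch, names ours):

```python
# Two assistants, states E0..E5: p_k = 2 (4/5)^k p0 for k = 1..5,
# p0 = 1/(9 - 8 (4/5)^5).
rho = 4/5
p0 = 1 / (9 - 8 * rho**5)
p = [p0] + [2 * rho**k * p0 for k in range(1, 6)]

assert abs(sum(p) - 1) < 1e-12
assert [round(x, 3) for x in p] == [0.157, 0.251, 0.201, 0.161, 0.128, 0.103]
```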
Example 4.21 Given two queueing systems A and B, which are independent of each other. We assume for each of the systems,
a. there is one shop assistant,
b. it is possible to have a queue,
c. customers arrive at A according to a Poisson process of intensity $\lambda_A = \frac{3}{4}$ minute$^{-1}$, and at B according to a Poisson process of intensity $\lambda_B = \frac{1}{2}$ minute$^{-1}$,
d. the service times at both A and B are exponentially distributed of parameter $\mu = 1$ minute$^{-1}$.
Let the random variable $X_A$ denote the number of customers in system A, and let $X_B$ denote the number of customers in system B.
1. Find by means of the stationary probabilities,
$$P\{X_A < X_B\}.$$
The arrival intensity at A is now increased, such that the customers arrive according to a Poisson process of intensity 1 minute$^{-1}$. For that reason the two systems are joined into one queueing system with two shop assistants, so the customers now arrive according to a Poisson process of intensity
$$\lambda = \left(1 + \frac{1}{2}\right) \text{minute}^{-1} = \frac{3}{2} \text{ minute}^{-1},$$
and the service times are still exponentially distributed with the parameter $\mu = 1$ minute$^{-1}$.
P {Y = k}, k ∈ N0 .
5. Prove that the average number of customers in the new system, E{Y }, is smaller than E {X A + XB }.
1A. We get from $\varrho_A = \frac{\lambda_A}{\mu} = \frac{3}{4}$ and N = 1 that
$$P\{X_A = k\} = p_{A,k} = \frac{1}{4} \left(\frac{3}{4}\right)^{k}, \qquad k \in \mathbb{N}_0.$$
1B. Analogously, $\varrho_B = \frac{1}{2}$, so
$$P\{X_B = k\} = p_{B,k} = \left(\frac{1}{2}\right)^{k+1}, \qquad k \in \mathbb{N}_0.$$
Then
$$E\{Y\} = \frac{2}{7} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k} = \frac{3}{14} \cdot \frac{1}{\left(1-\frac{3}{4}\right)^{2}} = \frac{3 \cdot 16}{14} = \frac{24}{7},$$
and
$$E\{X_A\} = \frac{1}{4} \sum_{k=1}^{\infty} k \left(\frac{3}{4}\right)^{k} = \frac{3}{16} \cdot 16 = 3,$$
and
$$E\{X_B\} = \frac{1}{4} \sum_{k=1}^{\infty} k \left(\frac{1}{2}\right)^{k-1} = \frac{1}{4} \cdot \frac{1}{\left(1-\frac{1}{2}\right)^{2}} = 1,$$
hence
$$E\{X_A + X_B\} = 3 + 1 = 4 > \frac{24}{7} = E\{Y\}.$$
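The comparison of the separate and the joined systems in Example 4.21 can be reproduced numerically (a Python sketch with a truncated series):

```python
# Example 4.21: E{XA} = 3 and E{XB} = 1 for the separate M/M/1 systems,
# while the joined two-server system has E{Y} = 24/7 < 4.
K = 600
EA = sum(k * (1/4) * (3/4)**k for k in range(1, K))
EB = sum(k * (1/2)**(k + 1) for k in range(1, K))
assert abs(EA - 3) < 1e-9 and abs(EB - 1) < 1e-12

pY = [1/7] + [(2/7) * (3/4)**k for k in range(1, K)]
EY = sum(k * q for k, q in enumerate(pY))
assert abs(EY - 24/7) < 1e-9
assert EY < EA + EB                            # joining the systems helps
```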
Example 4.22 Given two independent queueing systems A and B, where we assume for each of them,
a. there is one shop assistant,
b. it is possible to create a queue,
c. the customers arrive according to a Poisson process of intensity $\lambda = \frac{3}{5}$ min$^{-1}$,
d. the service times are exponentially distributed of parameter $\mu = 1$ min$^{-1}$.
Let the random variable $X_A$ denote the number of customers in system A, and let $X_B$ denote the number of customers in system B, and put $Z = X_A + X_B$.
1. Compute by means of the stationary probabilities,
$$P\{X_A = k\} \quad \text{and} \quad P\{X_B = k\}, \qquad k \in \mathbb{N}_0.$$
thus
$$P\{Z = k\} = \sum_{j=0}^{k} P\{X_A = j\} \cdot P\{X_B = k-j\} = \sum_{j=0}^{k} \frac{2}{5} \left(\frac{3}{5}\right)^{j} \cdot \frac{2}{5} \left(\frac{3}{5}\right)^{k-j} = (k+1) \frac{4}{25} \left(\frac{3}{5}\right)^{k}, \qquad k \in \mathbb{N}_0.$$
Thus
$$P\{Y = 0\} = \frac{1}{9} \qquad \text{and} \qquad P\{Y = k\} = \frac{2}{9} \left(\frac{4}{5}\right)^{k}, \qquad k \in \mathbb{N},$$
and hence
$$P\{Y > k\} = \sum_{j=k+1}^{\infty} P\{Y = j\} = \frac{2}{9} \sum_{j=k+1}^{\infty} \left(\frac{4}{5}\right)^{j} = \frac{2}{9} \left(\frac{4}{5}\right)^{k+1} \cdot \frac{1}{1-\frac{4}{5}} = \frac{8}{9} \left(\frac{4}{5}\right)^{k}.$$
5) The mean is
$$E\{Y\} = \frac{2}{9} \sum_{k=1}^{\infty} k \left(\frac{4}{5}\right)^{k} = \frac{8}{45} \sum_{k=1}^{\infty} k \left(\frac{4}{5}\right)^{k-1} = \frac{8}{45} \cdot \frac{1}{\left(1-\frac{4}{5}\right)^{2}} = \frac{40}{9}.$$
1) We get from $\lambda = 3$, $\mu = 2$ and N = 2 that the traffic intensity is
$$\varrho = \frac{\lambda}{N \mu} = \frac{3}{2 \cdot 2} = \frac{3}{4}.$$
From N = 2 we find the $p_k$ by a known formula,
$$p_0 = \frac{1 - \varrho}{1 + \varrho} = \frac{1}{7} \qquad \text{and} \qquad p_k = 2 \varrho^{k} \cdot \frac{1 - \varrho}{1 + \varrho} = \frac{2}{7} \left(\frac{3}{4}\right)^{k}, \qquad k \in \mathbb{N}.$$
In particular,
$$p_1 = \frac{2}{7} \cdot \frac{3}{4} = \frac{3}{14} \qquad \text{and} \qquad p_2 = \frac{2}{7} \cdot \frac{9}{16} = \frac{9}{56}.$$
2) The probability that there are more than two customers in the shop is
$$\sum_{k=3}^{\infty} p_k = 1 - p_0 - p_1 - p_2 = 1 - \frac{8 + 12 + 9}{56} = 1 - \frac{29}{56} = \frac{27}{56}.$$
Alternatively,
$$\sum_{k=3}^{\infty} p_k = \frac{2}{7} \sum_{k=3}^{\infty} \left(\frac{3}{4}\right)^{k} = \frac{2}{7} \cdot \left(\frac{3}{4}\right)^{3} \cdot \frac{1}{1 - \frac{3}{4}} = \frac{2 \cdot 3 \cdot 3 \cdot 3 \cdot 4}{7 \cdot 4 \cdot 4 \cdot 4} = \frac{27}{56}.$$
and hence
$$p_k = \begin{cases} \dfrac{1}{11}, & k = 0, \\[2ex] \dfrac{2}{11} \left(\dfrac{5}{6}\right)^{k}, & k \in \mathbb{N}. \end{cases}$$
4) We have in the new system that N = 1, $\lambda = 5$, $\mu = 6$ and $\varrho = \frac{5}{6}$. Then the average waiting time is, because N = 1, given by a known formula,
$$V = \frac{\varrho}{\mu(1 - \varrho)} = \frac{\frac{5}{6}}{6 \cdot \frac{1}{6}} = \frac{5}{6} \text{ quarter}.$$
It is seen that the average waiting time is larger in the new system than in the old one.
2. Find by means of the stationary probabilities the average number of customers in the shop.
3. Find by means of the stationary probabilities the average waiting time.
4. Find by means of the stationary probabilities the probability that both shop assistants are busy.
5. Find the median in the stationary distribution.
Supplement. The average length of the queue is also easily found by a known formula,
$$\sum_{k=3}^{\infty} (k-2) p_k = \frac{2}{5} \sum_{k=3}^{\infty} (k-2) \left(\frac{2}{3}\right)^{k} = \frac{2}{5} \left(\frac{2}{3}\right)^{3} \sum_{\ell=1}^{\infty} \ell \left(\frac{2}{3}\right)^{\ell-1} = \frac{2}{5} \cdot \left(\frac{2}{3}\right)^{3} \cdot \frac{1}{\left(1 - \frac{2}{3}\right)^{2}} = \frac{2 \cdot 2^{3} \cdot 3^{2}}{5 \cdot 3^{3}} = \frac{16}{15} = \lambda V = 8 \cdot \frac{2}{15}.$$
4) The complementary event: both shop assistants are busy with the probability
$$1 - (p_0 + p_1) = 1 - \left(\frac{1}{5} + \frac{4}{15}\right) = 1 - \frac{7}{15} = \frac{8}{15},$$
and
$$P\{X \leq 2\} = p_0 + p_1 + p_2 = \frac{1}{5} + \frac{4}{15} + \frac{8}{45} = \frac{9 + 12 + 8}{45} = \frac{29}{45} > \frac{1}{2}.$$
Since both probabilities are $\geq \frac{1}{2}$, the median is 2.
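The median can also be found mechanically from the cumulative distribution; a small Python sketch (assuming the stationary distribution $p_0 = \frac15$, $p_k = \frac25 \left(\frac23\right)^k$ for $k \geq 1$ used above):

```python
# Median of the stationary distribution p0 = 1/5, p_k = (2/5)(2/3)^k, k >= 1.
K = 200
p = [1/5] + [(2/5) * (2/3)**k for k in range(1, K)]
assert abs(sum(p) - 1) < 1e-12

cdf, median = 0.0, None
for k, pk in enumerate(p):
    cdf += pk
    if cdf >= 0.5:
        median = k        # smallest k with P{X <= k} >= 1/2
        break
assert median == 2
```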
Stochastic Processes 2 5. Other types of stochastic processes
(Notice that this is not a birth and death process, because the probability of transition from E 3 to
E1 in a small time interval of length h is almost proportional to h.)
2. Find Pi (t), i = 1, 2, 3, 4.
$$E_3 \overset{\lambda}{\longrightarrow} E_1,$$
hence by a rearrangement and taking the limit $h \to 0$ we get the system of differential equations,
$$P_4'(t) = -4\lambda P_4(t), \qquad P_4(0) = 1,$$
$$P_3'(t) = -3\lambda P_3(t) + 4\lambda P_4(t), \qquad P_3(0) = 0,$$
$$P_2'(t) = -2\lambda P_2(t) + 2\lambda P_3(t), \qquad P_2(0) = 0,$$
$$P_1'(t) = 2\lambda P_2(t) + \lambda P_3(t), \qquad P_1(0) = 0.$$
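The system can be integrated numerically as a sanity check; the Python sketch below uses a classical Runge-Kutta step and a hypothetical value $\lambda = 1$ (the constant is not fixed in the text):

```python
# RK4 integration of the system above (with a hypothetical lam = 1.0).
lam = 1.0

def deriv(P):
    P1, P2, P3, P4 = P
    return (2*lam*P2 + lam*P3,        # P1' = 2 lam P2 + lam P3
            -2*lam*P2 + 2*lam*P3,     # P2' = -2 lam P2 + 2 lam P3
            -3*lam*P3 + 4*lam*P4,     # P3' = -3 lam P3 + 4 lam P4
            -4*lam*P4)                # P4' = -4 lam P4

P, h = (0.0, 0.0, 0.0, 1.0), 1e-3     # start in E4
for _ in range(20000):                # integrate to t = 20
    k1 = deriv(P)
    k2 = deriv(tuple(x + h/2*k for x, k in zip(P, k1)))
    k3 = deriv(tuple(x + h/2*k for x, k in zip(P, k2)))
    k4 = deriv(tuple(x + h*k for x, k in zip(P, k3)))
    P = tuple(x + h/6*(a + 2*b + 2*c + d)
              for x, a, b, c, d in zip(P, k1, k2, k3, k4))

assert abs(sum(P) - 1) < 1e-9         # total probability is conserved
assert abs(P[0] - 1) < 1e-6           # everything ends in E1
```

The sum of the four right-hand sides is zero, so the total probability is an invariant of the flow, which the first assertion confirms numerically.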
Example 5.2 Let Y and Z be independent N(0, 1) distributed random variables, and let the process $\{X(t), t \in \mathbb{R}\}$ be defined by
$$X(t) = Y \cos t + Z \sin t.$$
Find the mean value function m(t) and the autocorrelation R(s, t).
The mean value function is
$$m(t) = E\{X(t)\} = E\{Y \cos t\} + E\{Z \sin t\} = \cos t \cdot E\{Y\} + \sin t \cdot E\{Z\} = 0.$$
The autocorrelation is
$$R(s, t) = E\{X(s) X(t)\} = \cos s \cos t \cdot E\{Y^{2}\} + \sin s \sin t \cdot E\{Z^{2}\} = \cos(s - t),$$
where the cross terms vanish, because $E\{Y Z\} = E\{Y\} \cdot E\{Z\} = 0$.
Example 5.3 Let $\{X(t), t \geq 0\}$ denote a Poisson process of intensity a, and let $\{Y(t), t \geq 0\}$ be given by
$$Y(t) = X(t+1) - X(t).$$
Compute the mean value function and the autocovariance of $\{Y(t), t \geq 0\}$.
We have
$$P\{X(t) = n\} = \frac{(at)^{n}}{n!} e^{-at}, \qquad n \in \mathbb{N}_0.$$
The mean value function is obtained by first noticing that
$$P\{Y(t) = n\} = P\{X(t+1) - X(t) = n\} = P\{X(1) = n\} = \frac{a^{n}}{n!} e^{-a},$$
thus Y(t) is distributed as X(1) (the Poisson process is "forgetful"), and
$$m(t) = E\{Y(t)\} = \sum_{n=1}^{\infty} n \, \frac{a^{n}}{n!} e^{-a} = a.$$
If $s \leq t$, then by the independence of the increments,
$$\operatorname{Cov}(Y(s), Y(t)) = \operatorname{Cov}(X(s+1) - X(s), X(t+1) - X(t)) = a \left(s + 1 - \min\{s+1, t\}\right).$$
If therefore $s + 1 \leq t$, then $\operatorname{Cov}(Y(s), Y(t)) = 0$.
Summing up,
$$\operatorname{Cov}(Y(s), Y(t)) = \begin{cases} a \{1 - |s-t|\}, & \text{for } |s-t| < 1, \\ 0, & \text{for } |s-t| \geq 1. \end{cases}$$
Example 5.4 Let $X_1$ and $X_2$ be independent random variables, both normally distributed of mean 0 and variance $\sigma^{2}$. We define a stochastic process $\{X(t), t \in \mathbb{R}\}$ by
$$X(t) = X_1 \cos t + X_2 \sin t.$$
1) Find the mean value function m(t) and the autocorrelation R(s, t).
2) Prove that the process is weakly stationary.
3) Find the values of $s - t$, for which the random variables X(s) and X(t) are non-correlated.
4) Given the random variables X(s) and X(t), where $s - t$ is fixed as above. Are X(s) and X(t) independent?
1) The mean value function is $m(t) = \cos t \cdot E\{X_1\} + \sin t \cdot E\{X_2\} = 0$. The autocorrelation is
$$R(s, t) = E\{X(s) X(t)\} = \sigma^{2} \{\cos s \cos t + \sin s \sin t\} = \sigma^{2} \cos(s - t).$$
2) A stochastic process is weakly stationary, if m(t) = m is constant, and C(s, t) = C(s - t). In the specific case,
$$m(t) = 0 = m,$$
and
$$C(s, t) = R(s, t) = \sigma^{2} \cos(s - t),$$
which only depends on $s - t$, so the process is weakly stationary.
3) X(s) and X(t) are non-correlated precisely when $\cos(s - t) = 0$, i.e. when $s - t = \frac{\pi}{2} + p\pi$, $p \in \mathbb{Z}$.
4) Since (X(s), X(t)) with $s - t = \frac{\pi}{2} + p\pi$, $p \in \mathbb{Z}$, follows a two-dimensional normal distribution, and X(s) and X(t) are non-correlated, we conclude that they are independent.
Example 5.5 Let $\{X(t), t \in \mathbb{R}\}$ be a stationary process of mean 0, autocorrelation $R(\tau)$ and effect spectrum $S(\omega)$. Let $\{Y(t), t \in \mathbb{R}\}$ be defined by
$$Y(t) = X(t + a) - X(t - a).$$
Express the autocorrelation and the effect spectrum of $\{Y(t)\}$ by the corresponding expressions of $\{X(t)\}$ (and a).
The autocorrelation is
$$R_Y(\tau) = E\{Y(t+\tau) Y(t)\} = 2 R_X(\tau) - R_X(\tau + 2a) - R_X(\tau - 2a),$$
so
$$S_Y(\omega) = \int_{-\infty}^{\infty} e^{i\omega\tau} R_Y(\tau) \, d\tau = 2 \int_{-\infty}^{\infty} e^{i\omega\tau} R_X(\tau) \, d\tau - \int_{-\infty}^{\infty} e^{i\omega\tau} R_X(\tau + 2a) \, d\tau - \int_{-\infty}^{\infty} e^{i\omega\tau} R_X(\tau - 2a) \, d\tau$$
$$= 2 S_X(\omega) - e^{-2ia\omega} S_X(\omega) - e^{2ia\omega} S_X(\omega) = 2 \{1 - \cos 2a\omega\} S_X(\omega) = 4 \sin^{2}(a\omega) \, S_X(\omega).$$
Example 5.6 Let $\{X(t), t \in \mathbb{R}\}$ be a stationary process of mean 0 and effect spectrum $S(\omega)$, and let
$$Y = \frac{1}{n} \sum_{k=1}^{n} X(kT), \qquad \text{where } T > 0.$$
Prove that
$$E\left\{Y^{2}\right\} = \frac{1}{2\pi n^{2}} \int_{-\infty}^{\infty} S(\omega) \cdot \frac{\sin^{2}\left(\frac{1}{2} n\omega T\right)}{\sin^{2}\left(\frac{1}{2} \omega T\right)} \, d\omega.$$
Hint:
$$\frac{\sin^{2}\left(\frac{1}{2} n\omega T\right)}{\sin^{2}\left(\frac{1}{2} \omega T\right)} = \sum_{m=-(n-1)}^{n-1} (n - |m|) \, e^{-i\omega m T}.$$
First compute
$$E\left\{Y^{2}\right\} = \frac{1}{n^{2}} E\left\{\sum_{k=1}^{n} \sum_{m=1}^{n} X(kT) X(mT)\right\} = \frac{1}{n^{2}} E\left\{\sum_{k=1}^{n} X(kT)^{2} + 2 \sum_{k=1}^{n-1} \sum_{m=k+1}^{n} X(kT) X(mT)\right\}$$
$$= \frac{1}{n^{2}} \sum_{k=1}^{n} R(0) + \frac{2}{n^{2}} \sum_{k=1}^{n-1} \sum_{m=1}^{n-k} E\{X(kT) X((k+m)T)\} = \frac{n}{n^{2}} R(0) + \frac{2}{n^{2}} \sum_{k=1}^{n-1} \sum_{m=1}^{n-k} R(mT)$$
$$= \frac{n}{n^{2}} R(0) + \frac{2}{n^{2}} \sum_{m=1}^{n-1} (n - m) R(mT) = \frac{1}{n^{2}} \sum_{m=-(n-1)}^{n-1} (n - |m|) R(|m|T).$$
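The kernel identity of the hint is easy to verify numerically; a small Python sketch (n, T and the test frequencies are arbitrary choices of ours):

```python
import cmath

# The Fejer-type kernel identity used in the hint, checked numerically:
# sum_{m=-(n-1)}^{n-1} (n - |m|) e^{-i omega m T}
#   = sin^2(n omega T / 2) / sin^2(omega T / 2).
n, T = 5, 0.7
for omega in (0.3, 1.1, 2.4):
    s = sum((n - abs(m)) * cmath.exp(-1j * omega * m * T)
            for m in range(-(n - 1), n))
    closed = (cmath.sin(n * omega * T / 2) / cmath.sin(omega * T / 2))**2
    assert abs(s - closed) < 1e-12
```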
Using
$$R(mT) = \frac{1}{2\pi} \int_{-\infty}^{\infty} e^{-i\omega m T} S(\omega) \, d\omega,$$
we get by insertion and the hint that
$$E\left\{Y^{2}\right\} = \frac{1}{2\pi n^{2}} \int_{-\infty}^{\infty} S(\omega) \sum_{m=-(n-1)}^{n-1} (n - |m|) \, e^{-i\omega m T} \, d\omega = \frac{1}{2\pi n^{2}} \int_{-\infty}^{\infty} S(\omega) \cdot \frac{\sin^{2}\left(\frac{1}{2} n\omega T\right)}{\sin^{2}\left(\frac{1}{2} \omega T\right)} \, d\omega.$$
2) Let 0 < s < t. Find the simultaneous frequency of the two-dimensional random variable (W(s), W(t)).
For $s \leq t$,
$$R(s, t) = C(s, t) = \operatorname{Cov}\{W(s), W(t)\} = \operatorname{Cov}\{W(s), W(s) + [W(t) - W(s)]\}$$
$$= \operatorname{Cov}\{W(s), W(s)\} + \operatorname{Cov}\{W(s), W(t) - W(s)\} = V\{W(s)\} + 0 \;\; \text{(independent increments)} = \alpha s.$$
2) If 0 < s < t, then (W(s), W(t) − W(s)) has the simultaneous frequency
$$f(x, y) = \frac{1}{\sqrt{2\pi\alpha s}} \exp\left(-\frac{1}{2} \cdot \frac{x^{2}}{\alpha s}\right) \cdot \frac{1}{\sqrt{2\pi\alpha(t-s)}} \exp\left(-\frac{1}{2} \cdot \frac{y^{2}}{\alpha(t-s)}\right)$$
Stochastic Processes 2 Index
Index
absorbing state, 13, 25
Arcus sinus law, 10
closed subset of states, 13
convergence in probability, 28
cycle, 22
discrete Arcus sinus distribution, 10
distribution function of a stochastic process, 4
double stochastic matrix, 22, 39
drunkard's walk, 5
Ehrenfest's model, 32
initial distribution, 11
invariant probability vector, 11, 22, 23, 25, 26, 28, 30, 32, 36, 39
irreducible Markov chain, 12, 18–23, 32, 36, 39, 41, 43, 45, 47, 50, 53, 62, 65, 67, 70, 73, 75, 78, 80, 86, 88, 91, 93, 98, 103, 106, 108, 114, 116, 122, 125, 128, 131
irreducible stochastic matrix, 83, 120
limit matrix, 13
outcome, 5
sample function, 4
state of a process, 4
stationary distribution, 11, 43, 50
stationary Markov chain, 10
stochastic limit matrix, 13
stochastic matrix, 10
stochastic process, 4
symmetric random walk, 5, 9
transition probability, 10, 11
vector of state, 11