
Copyright © 2015 by Karl Sigman

Notes on the Poisson Process

We present here the essentials of the Poisson point process with its many interesting
properties. As preliminaries, we first define what a point process is, define the renewal
point process and state and prove the Elementary Renewal Theorem.

1.1 Point Processes

Definition 1.1 A simple point process ψ = {t_n : n ≥ 1} is a sequence of strictly increasing points

    0 < t_1 < t_2 < ···,                                        (1)

with t_n → ∞ as n → ∞. With N(0) := 0 we let N(t) denote the number of points that
fall in the interval (0, t]; N(t) = max{n : t_n ≤ t}. {N(t) : t ≥ 0} is called the counting
process for ψ. If the t_n are random variables then ψ is called a random point process.
We sometimes allow a point t_0 at the origin and define t_0 := 0. X_n := t_n − t_{n−1}, n ≥ 1,
is called the nth interarrival time.
We view t as time and view tn as the nth arrival time (although there are other kinds of
applications in which the points tn denote locations in space as opposed to time). The
word simple refers to the fact that we are not allowing more than one arrival to occur at
the same time (as is stated precisely in (1)). In many applications there is a system to
which customers are arriving over time (classroom, bank, hospital, supermarket, airport,
etc.), and {tn } denotes the arrival times of these customers to the system. But {tn } could
also represent the times at which phone calls are received by a given phone, the times
at which jobs are sent to a printer in a computer network, the times at which a claim
is made against an insurance company, the times at which one receives or sends email,
the times at which an earthquake occurs, the times at which one sells or buys stock, the
times at which a given web site receives hits, or the times at which subways arrive to a
station. Note that
    t_n = X_1 + ··· + X_n,   n ≥ 1:

the nth arrival time is the sum of the first n interarrival times.
Also note that the event {N(t) = 0} can be equivalently represented by the event
{t_1 > t}, and more generally

    {N(t) = n} = {t_n ≤ t, t_{n+1} > t},   n ≥ 1.

In particular, for a random point process, P(N(t) = 0) = P(t_1 > t).
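To make the notation concrete, here is a minimal Python sketch (not part of the original notes) of the counting process: it builds t_n = X_1 + ··· + X_n from a hypothetical list of interarrival times and evaluates N(t) = max{n : t_n ≤ t}.

import numpy as np

def arrival_times(interarrivals):
    # t_n = X_1 + ... + X_n
    return np.cumsum(interarrivals)

def count(t, tn):
    # N(t) = max{n : t_n <= t}, with N(t) = 0 if t < t_1
    return int(np.searchsorted(tn, t, side="right"))

X = np.array([1.2, 0.4, 2.1, 0.9])   # hypothetical interarrival times
tn = arrival_times(X)                # arrival times [1.2, 1.6, 3.7, 4.6]
print(count(3.0, tn))                # N(3.0) = 2
print(count(1.0, tn))                # N(1.0) = 0, i.e., the event {t_1 > 1.0}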

1.2 Renewal process

A random point process ψ = {t_n} for which the interarrival times {X_n} form an i.i.d.
sequence is called a renewal process. t_n is then called the nth renewal epoch and F(x) =
P(X ≤ x), x ≥ 0, denotes the common interarrival time distribution. To avoid trivialities
we always assume that F(0) < 1, hence ensuring that wp1, t_n → ∞. The rate of the
renewal process is defined as λ := 1/E(X), which is justified by
Theorem 1.1 (Elementary Renewal Theorem (ERT)) For a renewal process,

    lim_{t→∞} N(t)/t = λ   w.p.1,

and

    lim_{t→∞} E(N(t))/t = λ.

Proof: Observing that t_{N(t)} ≤ t < t_{N(t)+1} and that t_{N(t)} = X_1 + ··· + X_{N(t)}, yields after
division by N(t):

    (1/N(t)) Σ_{j=1}^{N(t)} X_j   ≤   t/N(t)   ≤   (1/N(t)) Σ_{j=1}^{N(t)+1} X_j.

By the Strong Law of Large Numbers (SLLN), both the left and the right pieces converge
to E(X) as t → ∞. Since t/N(t) is sandwiched between the two, it also converges to
E(X), yielding the first result after taking reciprocals.
For the second result, we must show that the collection of rvs {N(t)/t : t ≥ 1} is
uniformly integrable (UI)¹, so as to justify the interchange of limit and expected value,

    lim_{t→∞} E(N(t))/t = E(lim_{t→∞} N(t)/t).
We will show that P(N(t)/t > x) ≤ c/x^2, x > 0, for some c > 0, hence proving UI. To this
end, choose a > 0 such that 0 < F(a) < 1 (if no such a exists then the renewal process is
deterministic and the result is trivial). Define new interarrival times via truncation, X̂_n =
a I{X_n > a}. Thus X̂_n = 0 with probability F(a) and equals a with probability 1 − F(a).
Letting N̂(t) denote the counting process obtained by using these new interarrival times,
it follows that N(t) ≤ N̂(t), t ≥ 0. Moreover, arrivals (which now occur in batches) can
now only occur at the deterministic lattice of times {na : n ≥ 0}. Letting p = 1 − F(a),
and letting K_n denote the number of arrivals that occur at time na, we conclude that
{K_n} is iid with a geometric distribution with success probability p. Letting [x] denote
the smallest integer ≥ x, we have the inequality

    N(t) ≤ N̂(t) ≤ S(t) = Σ_{n=1}^{[t/a]} K_n,   t ≥ 0.

¹ A collection of rvs {X_t : t ∈ T} is said to be uniformly integrable (UI) if sup_{t∈T} E(|X_t| I{|X_t| > x}) → 0 as x → ∞.

Observing that E(S(t)) = [t/a]E(K) and Var(S(t)) = [t/a]Var(K), we obtain E(S(t)^2) =
Var(S(t)) + E(S(t))^2 = [t/a]Var(K) + [t/a]^2 E^2(K) ≤ c_1 t + c_2 t^2, for constants c_1 >
0, c_2 > 0. Finally, when t ≥ 1, Chebyshev's inequality implies that P(N(t)/t > x) ≤
E(N^2(t))/(t^2 x^2) ≤ E(S^2(t))/(t^2 x^2) ≤ c/x^2, where c = c_1 + c_2.
Remark 1.1 In the elementary renewal theorem, the case when λ = 0 (e.g., E(X) = ∞)
is allowed, in which case the renewal process is said to be null recurrent. In the case when
0 < λ < ∞ (e.g., 0 < E(X) < ∞) the renewal process is said to be positive recurrent.
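As a quick Monte Carlo illustration of the ERT (a sketch, not from the notes), the snippet below uses Uniform(1, 2) interarrival times, an arbitrary choice with E(X) = 1.5, and checks that N(t)/t is close to λ = 1/E(X) = 2/3 for large t.

import numpy as np

rng = np.random.default_rng(0)
t = 100_000.0
X = rng.uniform(1.0, 2.0, size=int(t))       # each X_j > 1, so these arrivals cover (0, t]
tn = np.cumsum(X)                            # renewal epochs t_n
N_t = np.searchsorted(tn, t, side="right")   # N(t) = number of t_n <= t
print(N_t / t)                               # should be close to 2/3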

1.3 Poisson point process

There are several equivalent definitions for a Poisson process; we present the simplest
one. Although this definition does not indicate why the word Poisson is used, that will
be made apparent soon. Recall that a renewal process is a point process ψ = {t_n : n ≥ 0}
in which the interarrival times X_n = t_n − t_{n−1} are i.i.d. r.v.s. with common distribution
F(x) = P(X ≤ x). The arrival rate is given by λ = {E(X)}^{−1}, which is justified by the
ERT (Theorem 1.1).
In what follows it helps to imagine that the arrival times t_n correspond to the consecutive
times that a subway arrives to your station, and that you are interested in catching
the next subway.

Definition 1.2 A Poisson process at rate 0 < λ < ∞ is a renewal point process in
which the interarrival time distribution is exponential with rate λ: interarrival times
{X_n : n ≥ 1} are i.i.d. with common distribution F(x) = P(X ≤ x) = 1 − e^{−λx}, x ≥ 0;
E(X) = 1/λ.

Since t_n = X_1 + ··· + X_n (the sum of n i.i.d. exponentially distributed r.v.s.), we
conclude that the distribution of t_n is the n-fold convolution of the exponential distribution
and thus is a gamma(n, λ) distribution (also called an Erlang(n, λ) distribution);
its density is given by

    f_n(t) = λe^{−λt} (λt)^{n−1} / (n−1)!,   t ≥ 0,                 (2)

where f_1(t) = f(t) = λe^{−λt} is the exponential density, and E(t_n) = E(X_1 + ··· + X_n) =
nE(X) = n/λ.
For example, f_2 is the convolution f_1 * f_1:

    f_2(t) = ∫_0^t f_1(t − s) f_1(s) ds
           = ∫_0^t λe^{−λ(t−s)} λe^{−λs} ds
           = λ^2 e^{−λt} ∫_0^t ds
           = λ^2 e^{−λt} t,

and in general f_{n+1} = f_n * f_1 = f_1 * f_n.
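A small simulation sketch (not from the notes) of the fact above: summing n i.i.d. exponential(λ) interarrival times gives a gamma/Erlang(n, λ) variable, so the sample mean is near n/λ and the empirical density near a point matches the formula in (2). The values λ = 2 and n = 3 are arbitrary.

import numpy as np
from math import exp, factorial

lam, n = 2.0, 3
rng = np.random.default_rng(1)
tn = rng.exponential(1.0 / lam, size=(200_000, n)).sum(axis=1)   # t_n = X_1 + ... + X_n

print(tn.mean())                              # ~ n/lambda = 1.5

def fn(t):
    # Erlang(n, lambda) density from (2)
    return lam * exp(-lam * t) * (lam * t) ** (n - 1) / factorial(n - 1)

h = 0.05                                      # empirical density near t = 1.0 vs. the formula
print(((tn > 1.0 - h) & (tn < 1.0 + h)).mean() / (2 * h), fn(1.0))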

1.4 The Poisson distribution: A Poisson process has Poisson increments

Later, in Section 1.5 (Proposition 1.1), we will prove the fundamental fact that: For each
fixed t > 0, the distribution of N(t) is Poisson with mean λt:

    P(N(t) = k) = e^{−λt} (λt)^k / k!,   k ≥ 0.

In particular, E(N(t)) = λt, Var(N(t)) = λt, t ≥ 0. In fact, the number of arrivals in
any arbitrary interval of length t, N(s + t) − N(s), is also Poisson with mean λt:

    P(N(s + t) − N(s) = k) = e^{−λt} (λt)^k / k!,   s > 0, k ≥ 0,

and E(N(s + t) − N(s)) = λt, Var(N(s + t) − N(s)) = λt, t ≥ 0.
N(s + t) − N(s) is called a length-t increment of the counting process {N(t) : t ≥ 0}; the
above tells us that the Poisson counting process has increments that have a distribution
that is Poisson and only depends on the length of the increment. Any increment of length
t is distributed as Poisson with mean λt.

1.5 Stationary and independent increments characterization of the Poisson process

Suppose that subway arrival times to a given station form a Poisson process at rate λ.
If you enter the subway station at time s > 0 it is natural to consider how long you
must wait until the next subway arrives. But t_{N(s)} ≤ s < t_{N(s)+1}; s lies somewhere
within a subway interarrival time. For example, if N(s) = 4 then t_4 ≤ s < t_5 and s
lies somewhere within the interarrival time X_5 = t_5 − t_4. But since the interarrival
times have an exponential distribution, they have the memoryless property and thus
your waiting time, A(s) = t_{N(s)+1} − s, until the next subway, being the remainder of
an originally exponential r.v., is itself an exponential r.v. and independent of the past:
P(A(s) > t) = e^{−λt}, t ≥ 0. Once the next subway arrives (at time t_{N(s)+1}), the future
interarrival times are i.i.d. exponentials and independent of A(s). But this means that
the Poisson process, from time s onward, is yet again another Poisson process with the
same rate λ; the Poisson process restarts itself from every time s and is independent of
its past.
In terms of the counting process this means that for fixed s > 0, N(s + t) − N(s)
(the number of arrivals during the first t time units after time s, the future) has the
same distribution as N(t) (the number of arrivals during the first t time units), and is
independent of {N(u) : 0 ≤ u ≤ s} (the counting process up to time s, the past).

The above discussion illustrates the stationary and independent increments properties,
to be discussed next. It also shows that {N(t) : t ≥ 0} is a continuous-time Markov
process: The future {N(s + t) : t > 0}, given the present state N(s), is independent of
the past {N(u) : 0 ≤ u < s}.
Definition 1.3 A random point process is said to have stationary increments if for all
t ≥ 0 and all s ≥ 0 it holds that N(t + s) − N(s) (the number of points in the time interval
(s, s + t]) has a distribution that only depends on t, the length of the time interval.

For any interval I = (a, b], let N(I) = N(b) − N(a) denote the number of points that
fall in the interval. More generally, for any subset A ⊂ R_+, let N(A) denote the number
of points that fall in the subset A.

Definition 1.4 ψ is said to have independent increments if for any two non-overlapping
intervals of time, I_1 and I_2, the random variables N(I_1) and N(I_2) are independent.

We conclude from the discussions above that
The Poisson process has both stationary and independent increments.
But what is this distribution of N(t + s) − N(s) that only depends on t, the length
of the interval? We now show that it is Poisson for the Poisson process:

Proposition 1.1 For a Poisson process at rate λ, the distribution of N(t), t > 0, is
Poisson with mean λt:

    P(N(t) = k) = e^{−λt} (λt)^k / k!,   k ≥ 0.

In particular, E(N(t)) = λt, Var(N(t)) = λt, t ≥ 0. Thus by stationary increments,
N(s + t) − N(s) is also Poisson with mean λt:

    P(N(s + t) − N(s) = k) = e^{−λt} (λt)^k / k!,   s > 0, k ≥ 0,

and E(N(s + t) − N(s)) = λt, Var(N(s + t) − N(s)) = λt, t ≥ 0.


Proof: Note that P(N(t) = n) = P(t_n ≤ t < t_{n+1}) = P(t_{n+1} > t) − P(t_n > t). We will
show that

    P(t_m > t) = e^{−λt} (1 + λt + ··· + (λt)^{m−1}/(m−1)!),   m ≥ 1,          (3)

so that substituting in m = n + 1 and m = n and subtracting yields the result.
To this end, observe that differentiating the tail Q_n(t) = P(t_n > t) (recall that t_n has
the gamma(n, λ) density in (2)) yields

    (d/dt) Q_n(t) = −f_n(t) = −λe^{−λt} (λt)^{n−1}/(n−1)!,   Q_n(0) = 1.

Differentiating the right-hand side of (3), we see that (3) is in fact the solution (antiderivative).

Because of the above result, we see that λ = E(N(1)); the arrival rate λ is the
expected number of arrivals in any length-one interval.

Examples

1. Suppose cars arrive to the GW Bridge according to a Poisson process at rate λ =
   1000 per hour. What is the expected value and variance of the number of cars to
   arrive during the time interval between 2 and 3 o'clock PM?
   E(N(3) − N(2)) = E(N(1)) by stationary increments. E(N(1)) = λ · 1 = 1000.
   Variance is the same, Var(N(1)) = λ · 1 = 1000.

2. (continuation)
   What is the expected number of cars to arrive during the time interval between 2
   and 3 o'clock PM, given that 700 cars already arrived between 9 and 10 o'clock
   this morning?
   E(N(3) − N(2) | N(10) − N(9) = 700) = E(N(3) − N(2)) = E(N(1)) = 1000
   by independent and stationary increments: the r.v.s. N(I_1) = N(3) − N(2) and
   N(I_2) = N(10) − N(9) are independent.

3. (continuation) What is the probability that no cars will arrive during a given 15
   minute interval?
   P(N(0.25) = 0) = e^{−λ(0.25)} = e^{−250}.
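The example numbers above can be reproduced directly; a short sketch (Python, values taken from the example):

from math import exp

lam = 1000.0                  # cars per hour
mean = var = lam * 1.0        # E and Var of N(3) - N(2): both 1000
p_no_car = exp(-lam * 0.25)   # P(N(0.25) = 0) = e^{-250}, astronomically small
print(mean, var, p_no_car)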

Remarkable as it may seem, it turns out that the Poisson process is completely characterized by stationary and independent increments:
Theorem 1.2 Suppose that ψ is a simple random point process that has both stationary
and independent increments. Then in fact, ψ is a Poisson process. Thus the Poisson
process is the only simple point process with stationary and independent increments.
Proof: [Sketch]
Note that

    P(X_1 > s + t) = P(N(s + t) = 0)
                   = P(N(s) = 0, N(s + t) − N(s) = 0)
                   = P(N(s) = 0) P(N(t) = 0)   (via independent and stationary increments)
                   = P(X_1 > s) P(X_1 > t).

But this implies that X_1 has the memoryless property, and thus it must be exponentially
distributed; P(X_1 ≤ t) = 1 − e^{−λt} for some λ > 0.
But by stationary and independent increments, right after X_1, the counting process
starts over again and is independent of its past; thus X_2 is independent of X_1 and has
the same distribution. Continuing in this fashion we conclude that the interarrival times
are iid with an exponential distribution at rate λ.

We now have two different ways of identifying a Poisson process: (1) checking if it is a renewal process with an exponential interarrival time distribution,
or (2) checking if it has both stationary and independent increments.

1.6 Constructing a Poisson process from independent Bernoulli trials, and the Poisson approximation to the binomial distribution

A Poisson process at rate λ can be viewed as the result of performing an independent
Bernoulli trial with success probability p = λ dt in each infinitesimal time interval of
length dt, and placing a point there if the corresponding trial is a success (no point
there otherwise). Intuitively, this would yield a point process with both stationary and
independent increments; a Poisson process: The number of Bernoulli trials that can be
fit in any interval only depends on the length of the interval and thus the distribution for
the number of successes in that interval would also only depend on the length; stationary
increments follows. For two non-overlapping intervals, the Bernoulli trials in each would
be independent of one another since all the trials are i.i.d., thus the number of successes
in one interval would be independent of the number of successes in the other interval;
independent increments follows. We proceed next to explain how this Bernoulli trials
idea can be made rigorous.
As explained in Remark 1.2, the exponential distribution can be obtained as a limit of
the geometric distribution: Fix n large, and perform, using success probability p_n = λ/n,
an independent Bernoulli trial at each time point i/n, i ≥ 1. Let Y_n denote the time
at which the first success occurred. Then Y_n = K_n/n, where K_n denotes the number of
trials until the first success, and has the geometric distribution with success probability
p_n. As n → ∞, Y_n converges to a r.v. Y having the exponential distribution with rate
λ. This Y thus can serve as the first arrival time t_1 for a Poisson process at rate λ. The
idea here is that the tiny intervals of length 1/n become (in the limit) the infinitesimal dt
intervals. Once we have our first success, at time t_1, we continue onwards in time (in the
interval (t_1, ∞)) with new Bernoulli trials until we get the second success at time t_2 and
so on until we get all the arrival times {t_n : n ≥ 1}. By construction, each interarrival
time, X_n = t_n − t_{n−1}, n ≥ 1, is an independent exponentially distributed r.v. with rate
λ; hence we have constructed a Poisson process at rate λ.
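A sketch of this limiting construction (not from the notes): with success probability λ/n at the grid points i/n, the time Y_n = K_n/n of the first success behaves like an exponential(λ) r.v. for large n. Here λ = 1.5 and n = 10,000 are arbitrary illustrative choices.

import numpy as np

lam, n = 1.5, 10_000
rng = np.random.default_rng(2)

K = rng.geometric(lam / n, size=100_000)   # trials until first success, one per replication
Y = K / n                                  # Y_n = K_n / n
print(Y.mean())                            # ~ 1/lambda = 0.667
print((Y > 1.0).mean())                    # ~ e^{-lambda} = 0.223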

Another key to understanding how the Poisson process can be constructed from
Bernoulli trials is the fact that the Poisson distribution can be used to approximate
the binomial distribution:
Proposition 1.2 For λ > 0 fixed, let X ~ binomial(n, p_n) with success probability p_n =
λ/n. Then, as n → ∞, X converges in distribution to a Poisson rv with mean λ. Thus,
a binomial distribution in which the number of trials n is large and the success probability
p is small can be approximated by a Poisson distribution with mean λ = np.
Proof: Since

    P(X = k) = (n choose k) p^k (1 − p)^{n−k},

where p = λ/n, we must show that for any k ≥ 0

    lim_{n→∞} (n choose k) (λ/n)^k (1 − λ/n)^{n−k} = e^{−λ} λ^k / k!.

Re-writing and expanding yields

    (n choose k) (λ/n)^k (1 − λ/n)^{n−k} = [n! / ((n − k)! n^k)] × [λ^k / k!] × [(1 − λ/n)^n / (1 − λ/n)^k],

the product of three pieces.
But lim_{n→∞} (1 − λ/n)^k = 1 since k is fixed, and from calculus, lim_{n→∞} (1 − λ/n)^n = e^{−λ}.
(Recall a proof via taking natural logarithms and then using L'Hospital's Rule.) Moreover,

    n! / ((n − k)! n^k) = (n/n) · ((n − 1)/n) ··· ((n − k + 1)/n),

(the product of k (fixed) pieces) and hence converges to 1 as n → ∞ since each of the k
ratios does so. Combining these facts yields the result.

We can use the above result to construct the counting process at time t, N(t), for
a Poisson process as follows: Fix t > 0. Divide the interval (0, t] into n subintervals,
((i − 1)t/n, it/n], 1 ≤ i ≤ n, of equal length t/n. At the right endpoint it/n of each
such subinterval, perform a Bernoulli trial with success probability p_n = λt/n, and place
a point there if successful (no point otherwise). Let N_n(t) denote the number of such
points placed (successes). Then N_n(t) ~ binomial(n, p_n) and converges in distribution to
N(t) ~ Poisson(λt), as n → ∞. Moreover, the points placed in (0, t] from the Bernoulli
trials converge (as n → ∞) to the points t_1, . . . , t_{N(t)} of the Poisson process during (0, t].
So we have actually obtained the Poisson process up to time t.
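To illustrate Proposition 1.2 in this setting (a sketch with arbitrary values λ = 2, t = 3, k = 4), the binomial probability for N_n(t) approaches the Poisson(λt) probability as n grows:

from math import comb, exp, factorial

lam, t, k = 2.0, 3.0, 4
for n in (10, 100, 10_000):
    p = lam * t / n                                   # p_n = lambda * t / n
    print(n, comb(n, k) * p**k * (1 - p) ** (n - k))  # P(N_n(t) = k)
print("Poisson:", exp(-lam * t) * (lam * t) ** k / factorial(k))   # P(N(t) = k)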

1.7 Little o(t) results for stationary point processes

Let o(t) denote any function of t that satisfies o(t)/t → 0 as t → 0. Examples include
o(t) = t^n, n > 1, but there are many others.
If ψ is any point process with stationary increments and λ = E(N(1)) < ∞, then
(see below for a discussion of proofs)

    P(N(t) > 0) = λt + o(t),                                        (4)
    P(N(t) > 1) = o(t).                                             (5)

Because of stationary increments we get the same results for any increment of length
t, N(s + t) − N(s), and in words (4) can be expressed as

    P(at least 1 arrival in any interval of length t) = λt + o(t),

whereas (5) can be expressed as

    P(more than 1 arrival in any interval of length t) = o(t).

Since P(N(t) = 1) = P(N(t) > 0) − P(N(t) > 1), (4) and (5) together yield

    P(N(t) = 1) = λt + o(t),                                        (6)

or in words

    P(an arrival in any interval of length t) = λt + o(t).

We thus get for any s ≥ 0:

    P(N(s + t) − N(s) = 1) ≈ λt, for t small,

which using infinitesimals can be written as

    P(N(s + dt) − N(s) = 1) = λ dt.                                 (7)

The above o(t) results hold for any (simple) point process with stationary increments,
not just a Poisson process. But note how (7) agrees with our Bernoulli trials interpretation
of the Poisson process, e.g., performing in each interval of length dt an independent
Bernoulli trial with success probability p = λ dt. But the crucial difference is that our
Bernoulli trials construction also yields the independent increments property, which is
not expressed in (7). This difference helps explain why the Poisson process is the unique
simple point process with both stationary and independent increments: There are numerous
examples of point processes with stationary increments (we shall offer some examples
later), but only one with both stationary and independent increments; the Poisson process.
Although a general proof of (4) and (5) is beyond the scope of this course, we will
be satisfied with proving it for the Poisson process at rate λ, for which it follows directly
from the Taylor expansion of e^{−λt}:

    P(N(t) > 0) = 1 − e^{−λt}
                = 1 − (1 − λt + (λt)^2/2 − ···)
                = λt − (λt)^2/2 + ···
                = λt + o(t).

    P(N(t) > 1) = P(N(t) = 2) + P(N(t) = 3) + ···
                = e^{−λt} ((λt)^2/2 + ···)
                ≤ (λt)^2/2 + ···
                = o(t).
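A numerical check of (4)-(6) for the Poisson case (a sketch; λ = 2 is an arbitrary choice): as t → 0, P(N(t) > 0)/(λt) tends to 1 while P(N(t) > 1)/t tends to 0.

from math import exp

lam = 2.0
for t in (0.1, 0.01, 0.001):
    p_pos = 1 - exp(-lam * t)                   # P(N(t) > 0)
    p_two = 1 - exp(-lam * t) * (1 + lam * t)   # P(N(t) > 1)
    print(t, p_pos / (lam * t), p_two / t)      # first ratio -> 1, second -> 0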

1.8 Partitioning Theorems for Poisson processes and random variables

Given X ~ Poiss(α) (a Poisson rv with mean α) suppose that we imagine that X denotes
some number of objects (arrivals during some fixed time interval for example), and that
independent of one another, each such object is of type 1 or type 2 with probability p
and q = 1 − p respectively. This means that if X = n then the number of those n that
are of type 1 has a binomial(n, p) distribution and the number of those n that are of
type 2 has a binomial(n, q) distribution. Let X_1 and X_2 denote the number of type 1
and type 2 objects respectively; X_1 + X_2 = X. The following shows that in fact if we
do this, then X_1 and X_2 are independent Poisson random variables with means pα and
qα respectively.

Theorem 1.3 (Partitioning a Poisson r.v.) If X ~ Poiss(α) and if each object of
X is, independently, type 1 or type 2 with probability p and q = 1 − p, then in fact
X_1 ~ Poiss(pα), X_2 ~ Poiss(qα) and they are independent.
Proof: We must show that

    P(X_1 = k, X_2 = m) = e^{−pα} (pα)^k / k! × e^{−qα} (qα)^m / m!.          (8)

P(X_1 = k, X_2 = m) = P(X_1 = k, X = k + m) = P(X_1 = k | X = k + m) P(X = k + m).
But given X = k + m, X_1 ~ Bin(k + m, p), yielding

    P(X_1 = k | X = k + m) P(X = k + m) = [(k + m)! / (k! m!)] p^k q^m × e^{−α} α^{k+m} / (k + m)!.

Using the fact that e^{−α} = e^{−pα} e^{−qα} and other similar algebraic identities, the above reduces
to (8), as was to be shown.
The above theorem generalizes to Poisson processes:

Theorem 1.4 (Partitioning a Poisson process) If ψ ~ PP(λ) and if each arrival
of ψ is, independently, type 1 or type 2 with probability p and q = 1 − p, then in fact,
letting ψ_i denote the point process of type i arrivals, i = 1, 2, ψ_1 ~ PP(pλ), ψ_2 ~ PP(qλ)
and they are independent.

Proof: It is immediate that each ψ_i is a Poisson process since each retains
stationary and independent increments. Let N(t) and N_i(t), i = 1, 2, denote the
corresponding counting processes, N(t) = N_1(t) + N_2(t), t ≥ 0. From Theorem 1.3,
N_1(1) and N_2(1) are independent Poisson rvs with means E(N_1(1)) = λ_1 = pλ and
E(N_2(1)) = λ_2 = qλ, since they are a partitioning of N(1); thus ψ_i indeed has rate
λ_i, i = 1, 2. What remains to show is that the two counting processes (as processes) are
independent. But this is immediate from Theorem 1.3 and independent increments of
ψ: If we take any collection of non-overlapping intervals (sets more generally) A_1, . . . , A_k
and non-overlapping intervals B_1, . . . , B_l, then we must show that (N_1(A_1), . . . , N_1(A_k))
is independent of (N_2(B_1), . . . , N_2(B_l)), argued as follows: Any part (say subset C) of
the A_i which intersects with the B_i will yield independence due to partitioning of the rv
N(C), and any parts of the A_i that are disjoint from the B_i will yield independence due
to the independent increments of ψ; thus independence follows.
The above is quite interesting, for it means that if Poisson arrivals at rate λ come to
our lecture room, and upon each arrival we flip a coin (having probability p of landing
heads), and route all those for which the coin lands tails (type 2) into a different room,
only allowing those for which the coin lands heads (type 1) to enter our room, then the
arrival processes to the two rooms are independent and Poisson.
For example, suppose that λ = 30 per hour, and p = 0.6. Letting N_1(t) and N_2(t)
denote the counting processes for type 1 and type 2 respectively, this means that N_1(t) ~
Poiss(α) where α = (0.6)(30)(t) = 18t. Now consider the two events

    A = {5 arrivals into room 1 during the hours 1 to 3}

and

    B = {1000 arrivals into room 2 during the hours 1 to 3}.

We thus conclude that the two events A and B are independent, yielding

    P(A|B) = P(A)
           = P(N_1(3) − N_1(1) = 5)
           = P(N_1(2) = 5)
           = e^{−36} 36^5 / 5!.

In the above computation, the third equality follows from stationary increments (of type
1 arrivals since they are Poisson at rate 18).
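The probability above is tiny; a sketch (values taken from the example) computes it exactly and checks by simulation that thinning with p = 0.6 leaves a type-1 count whose mean and variance are both near 18t = 36 over a 2-hour window:

import numpy as np
from math import exp, factorial

exact = exp(-36) * 36**5 / factorial(5)
print(exact)                              # about 1.2e-10

rng = np.random.default_rng(3)
N = rng.poisson(30.0 * 2, size=200_000)   # all arrivals in a 2-hour window
N1 = rng.binomial(N, 0.6)                 # keep each arrival independently w.p. 0.6
print(N1.mean(), N1.var())                # both close to 36, as for Poisson(36)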
1.8.1 Superposition of independent Poisson processes

In the previous section we saw that a Poisson process ψ can be partitioned into two
independent ones, ψ_1 and ψ_2 (type 1 and type 2 arrivals). But this means that they can
be put back together again to obtain ψ. Putting together means taking the superposition
of the two point processes, that is, combining all their points together, then placing them
in ascending order, to form one point process (regardless of type). We write this as
ψ = ψ_1 + ψ_2, and of course we in particular have N(t) = N_1(t) + N_2(t), t ≥ 0.
A little thought reveals that therefore we can in fact start with any two independent
Poisson processes, ψ_1 ~ PP(λ_1), ψ_2 ~ PP(λ_2) (call them type 1 and type 2), and
superpose them to obtain a Poisson process ψ = ψ_1 + ψ_2 at rate λ = λ_1 + λ_2. The
partitioning probability p is given by

    p = λ_1 / (λ_1 + λ_2),

because that is the required p which would allow us to partition a Poisson process with
rate λ = λ_1 + λ_2 into two independent Poisson processes at rates λ_1 and λ_2; pλ = λ_1 and
(1 − p)λ = λ_2, as is required. p is simply the probability that (starting from any time t)
the next arrival time of type 1 (call this Y_1) occurs before the next arrival time of type 2
(call this Y_2), which by the memoryless property is given by P(Y_1 < Y_2) = λ_1/(λ_1 + λ_2), because
Y_1 ~ exp(λ_1), Y_2 ~ exp(λ_2) and they are independent by assumption. Once an arrival
occurs, the memoryless property allows us to conclude that the next arrival will yet again
be of type 1 or 2 (independent of the past) depending only on which of two independent
exponentially distributed r.v.s. is smaller; P(Y_1 < Y_2) = λ_1/(λ_1 + λ_2). Continuing in this
fashion we conclude that indeed each arrival from the superposition is, independent of
all others, of type 1 or type 2 with probability p = λ_1/(λ_1 + λ_2).
Arguing directly that the superposition of independent Poisson processes yields a Poisson
process is easy: The superposition has both stationary and independent increments,
and thus must be a Poisson process. Moreover, E(N(1)) = E(N_1(1)) + E(N_2(1)) = λ_1 + λ_2,
so the rate indeed is given by λ = λ_1 + λ_2.


Examples
Foreign phone calls are made to your home phone according to a Poisson process at rate
λ_1 = 2 (per hour). Independently, domestic phone calls are made to your home phone
according to a Poisson process at rate λ_2 = 5 (per hour).

1. You arrive home. What is the probability that the next call will be foreign? That
   the next three calls will be domestic?
   Answers: λ_1/(λ_1 + λ_2) = 2/7, and (λ_2/(λ_1 + λ_2))^3 = (5/7)^3. Once a domestic call comes in, the
   future is independent of the past and has the same distribution as when we started,
   by the memoryless property, so the next call will, once again, be domestic with the
   same probability 5/7 and so on.

2. You leave home for 2 hours. What is the mean and variance of the number of calls
   you received during your absence?
   Answer: The superposition of the two types is a Poisson process at rate λ =
   λ_1 + λ_2 = 7. Letting N(t) denote the number of calls by time t, it follows that N(t)
   has a Poisson distribution with parameter λt; E(N(2)) = 2λ = 14 = Var(N(2)).

3. Given that there were exactly 5 calls in a given 4 hour period, what is the probability
   that exactly 2 of them were foreign?
   Answer: The superposition of the two types is a Poisson process at rate λ =
   λ_1 + λ_2 = 7. The individual foreign and domestic arrival processes can be viewed
   as type 1 and 2 of a partitioning with p = λ_1/(λ_1 + λ_2) = 2/7. Thus given N(4) = 5,
   the number of those 5 that are foreign (type 1) has a Bin(5, p) distribution with
   p = 2/7. Thus we want

       (5 choose 2) p^2 (1 − p)^3.
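The three answers can be evaluated with a few lines (a sketch; the rates are those given in the examples):

from math import comb

lam1, lam2 = 2.0, 5.0
p = lam1 / (lam1 + lam2)                  # P(next call is foreign) = 2/7
print(p, (1 - p) ** 3)                    # 2/7 and (5/7)^3
print(2 * (lam1 + lam2))                  # mean = variance of calls in 2 hours = 14
print(comb(5, 2) * p**2 * (1 - p) ** 3)   # P(exactly 2 of 5 calls are foreign)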

1.9 Constructing a Poisson process up to time t by using the order statistics of iid uniform rvs

Suppose that for a Poisson process at rate λ, we condition on the event {N(t) = 1}, the
event that exactly one arrival occurred during (0, t]. We might conjecture that under such
conditioning, t_1 should be uniformly distributed over (0, t). To see that this is in fact so,
choose s ∈ (0, t). Then

    P(t_1 ≤ s | N(t) = 1) = P(t_1 ≤ s, N(t) = 1) / P(N(t) = 1)
                          = P(N(s) = 1, N(t) − N(s) = 0) / P(N(t) = 1)
                          = (λs e^{−λs}) e^{−λ(t−s)} / (λt e^{−λt})
                          = s/t.

It turns out that this result generalizes nicely to any number of arrivals, and we
present this next.
Let V_1, V_2, . . . , V_n be n i.i.d. uniformly distributed r.v.s. on the interval (0, t). Let
V_(1) < V_(2) < ··· < V_(n) denote them placed in ascending order. Thus V_(1) is the smallest
of them, V_(2) the second smallest and finally V_(n) is the largest one. V_(i) is called the ith
order statistic of V_1, . . . , V_n.
Note that the joint density function of (V_1, V_2, . . . , V_n) is given by

    g(s_1, s_2, . . . , s_n) = 1/t^n,   s_i ∈ (0, t),

because each V_i has density function 1/t and they are independent. Now given any
ascending sequence 0 < s_1 < s_2 < ··· < s_n < t, it follows that the joint distribution
f(s_1, s_2, . . . , s_n) of the order statistics (V_(1), V_(2), . . . , V_(n)) is given by

    f(s_1, s_2, . . . , s_n) = n!/t^n,

because there are n! permutations of the sequence (s_1, s_2, . . . , s_n) each of which leads
to the same order statistics. For example if (s_1, s_2, s_3) = (1, 2, 3) then there are 3! = 6
permutations all yielding the same order statistics (1, 2, 3): (1, 2, 3), (1, 3, 2), (2, 1, 3),
(2, 3, 1), (3, 1, 2), (3, 2, 1).
Theorem 1.5 For any Poisson process, conditional on the event {N(t) = n}, the joint
distribution of the n arrival times t_1, . . . , t_n is the same as the joint distribution of
V_(1), . . . , V_(n), the order statistics of n i.i.d. unif(0, t) r.v.s., that is, it is given by

    f(s_1, s_2, . . . , s_n) = n!/t^n,   0 < s_1 < s_2 < ··· < s_n < t.

Proof: We will compute the density for

    P(t_1 = s_1, . . . , t_n = s_n | N(t) = n) = P(t_1 = s_1, . . . , t_n = s_n, N(t) = n) / P(N(t) = n)

and see that it is precisely n!/t^n. To this end, we re-write the event {t_1 = s_1, . . . , t_n =
s_n, N(t) = n} in terms of the i.i.d. interarrival times as {X_1 = s_1, . . . , X_n = s_n −
s_{n−1}, X_{n+1} > t − s_n}. For example if N(t) = 2, then {t_1 = s_1, t_2 = s_2, N(t) = 2} = {X_1 =
s_1, X_2 = s_2 − s_1, X_3 > t − s_2} and thus has density λe^{−λs_1} λe^{−λ(s_2−s_1)} e^{−λ(t−s_2)} = λ^2 e^{−λt},
due to the independence of the r.v.s. X_1, X_2, X_3, and the algebraic cancellations in the
exponents.
We conclude that

    P(t_1 = s_1, . . . , t_n = s_n | N(t) = n) = P(X_1 = s_1, . . . , X_n = s_n − s_{n−1}, X_{n+1} > t − s_n) / P(N(t) = n)
                                             = λ^n e^{−λt} / P(N(t) = n)
                                             = n!/t^n,

where the last equality follows since P(N(t) = n) = e^{−λt}(λt)^n/n!.

The importance of the above is this: If you want to simulate a Poisson process up to
time t, you need only first simulate the value of N(t) (a Poisson rv with mean λt; use,
for example, the discrete inverse transform method), then if N(t) = n generate n i.i.d.
Unif(0, t) numbers (V_1, V_2, . . . , V_n), place them in ascending order (V_(1), V_(2), . . . , V_(n)),
and finally define t_i = V_(i), 1 ≤ i ≤ n.
Uniform numbers are very easy to generate on a computer and so this method can
have computational advantages over simply generating exponential r.v.s. for interarrival
times X_n and defining t_n = X_1 + ··· + X_n. Exponential r.v.s. require taking logarithms
to generate:

    X_i = −(1/λ) log(U_i),

where U_i ~ Unif(0, 1), and this can be computationally time consuming.
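A sketch of the recipe just described (using numpy's Poisson generator in place of the discrete inverse transform method; λ = 3 and t = 10 are arbitrary illustrative values):

import numpy as np

def simulate_poisson_process(lam, t, rng):
    n = rng.poisson(lam * t)                      # N(t) ~ Poisson(lambda * t)
    return np.sort(rng.uniform(0.0, t, size=n))   # sorted uniforms give t_1 < ... < t_n

rng = np.random.default_rng(4)
print(simulate_poisson_process(3.0, 10.0, rng))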

1.10 Applications

1. A bus platform is now empty of passengers, and the next bus will depart in t
   minutes. Passengers arrive to the platform according to a Poisson process at rate
   λ. What is the average waiting time of an arriving passenger?
   Answer: Let {N(t)} denote the counting process for passenger arrivals. Given
   N(t) = n ≥ 1, we can treat the n passenger arrival times t_1, . . . , t_n as the order
   statistics V_(1) < V_(2) < ··· < V_(n) of n independent unif(0, t) r.v.s., V_1, V_2, . . . , V_n.
   We thus expect that on average a customer waits E(V) = t/2 units of time. This
   indeed is so, proven as follows: The ith arrival has waiting time W_i = t − t_i, and
   there will be N(t) such arrivals. Thus we need to compute E(T) where

       T = (1/N(t)) Σ_{i=1}^{N(t)} (t − t_i).

   (We only consider the case when N(t) ≥ 1.)
   But given N(t) = n, we conclude that

       T = (1/n) Σ_{i=1}^{n} (t − V_(i)) = (1/n) Σ_{i=1}^{n} (t − V_i),

   because the sum of all n of the V_(i) is the same as the sum of all n of the V_i. Thus

       E(T | N(t) = n) = (1/n) n E(t − V) = E(t − V) = t/2.

   This being true for all n ≥ 1, we conclude that E(T) = t/2.


2. M/GI/∞ queue: Arrival times t_n form a Poisson process at rate λ with counting
   process {N(t)}, service times S_n are i.i.d. with general distribution G(x) = P(S ≤
   x) and mean 1/μ. There are an infinite number of servers and so there is no delay:
   the nth customer arrives at time t_n, enters service immediately at any free server
   and then departs at time t_n + S_n; S_n is the length of time the customer spends in
   service. Let X(t) denote the number of customers in service at time t. We assume
   that X(0) = 0.
   We will now show that

   Proposition 1.3 For every fixed t > 0, the distribution of X(t) is Poisson with
   parameter α(t) where

       α(t) = λ ∫_0^t P(S > x) dx,

       P(X(t) = n) = e^{−α(t)} (α(t))^n / n!,   n ≥ 0.

   Thus E(X(t)) = α(t) = Var(X(t)). Moreover, since α(t) converges (as t → ∞) to

       λ ∫_0^∞ P(S > x) dx = λE(S) = λ/μ,

   we conclude that

       lim_{t→∞} P(X(t) = n) = e^{−ρ} ρ^n / n!,   n ≥ 0,

   where ρ = λ/μ. So the limiting (or steady-state, or stationary) distribution of X(t)
   exists and is Poisson with parameter ρ.

   Proof: The method of proof is actually based on partitioning the Poisson random
   variable N(t) into two types: those that are in service at time t, X(t), and those
   that have departed by time t (denoted by D(t)). Thus we need only figure out
   what is the probability p(t) that a customer who arrived during (0, t] (that is, one
   of the N(t) arrivals) is still in service at time t.
   We first recall that conditional on N(t) = n the n (unordered) arrival times are
   i.i.d. with a Unif(0, t) distribution. Letting V denote a typical such arrival time,
   and S their service time, we conclude that this customer will still be in service at
   time t if and only if V + S > t (arrival time + service time > t); S > t − V. Thus
   p(t) = P(S > t − V), where S and V are assumed independent. But (as is easily
   shown) t − V ~ Unif(0, t) if V ~ Unif(0, t). Thus p(t) = P(S > t − V) = P(S > V).
   Noting that P(S > V | V = s) = P(S > s), we conclude that

       p(t) = P(S > V) = (1/t) ∫_0^t P(S > s) ds,

   where we have conditioned on the value of V = s (with density 1/t) and integrated
   over all such values. This did not depend upon the value n and so we are done.
   Thus for fixed t, we can partition N(t) into two independent Poisson r.v.s., X(t)
   and D(t) (the number of departures by t), to conclude that X(t) ~ Poiss(α(t))
   where

       α(t) = λt p(t) = λ ∫_0^t P(S > x) dx,

   as was to be shown. Similarly, D(t) ~ Poisson(β(t)) where β(t) = λt(1 − p(t)).


   Recall that

       p_j := lim_{t→∞} P(X(t) = j) = e^{−ρ} ρ^j / j!,

   that is, that the limiting (or steady-state or stationary) distribution of X(t) is
   Poisson with mean ρ. Keep in mind that this implies convergence in a time average
   sense also:

       p_j = lim_{t→∞} (1/t) ∫_0^t P(X(s) = j) ds = e^{−ρ} ρ^j / j!,

   which is exactly the continuous time analog of the stationary distribution for
   Markov chains:

       π_j = lim_{n→∞} (1/n) Σ_{k=1}^{n} P(X_k = j).

   Thus we interpret p_j as the long run proportion of time that there are j busy
   servers. The average number of busy servers is given by the mean of the limiting
   distribution:

       L = Σ_{j=0}^{∞} j p_j = ρ.

   Finally, note that the mean ρ agrees with our Little's Law (L = λw) derivation
   of the time average number in system:

       L = lim_{t→∞} (1/t) ∫_0^t X(s) ds = ρ.

   Customers arrive at rate λ and have an average sojourn time of 1/μ, yielding L = λ/μ = ρ.
   In short, the time average number in system is equal to the mean of the limiting
   distribution {p_j} for number in system.
Final comment: In proving Proposition 1.3, as with Example 1, we do need to
use the fact that summing up over all the order statistics is the same as summing up
over the non-ordered uniforms. When N(t) = n, we have

    X(t) = Σ_{j=1}^{n} I{S_j > t − V_(j)}.

But since service times are i.i.d. and independent of the uniforms, we see that this is the
same in distribution as the sum

    Σ_{j=1}^{n} I{S_j > t − V_j}.

Since the I{S_j > t − V_j} are i.i.d. Bernoulli(p) r.v.s. with p = P(S > V), we conclude
that the conditional distribution of X(t) given that N(t) = n is binomial with success
probability p. Thus X(t) indeed can be viewed as a partitioned N(t) with partition
probability p = P(S > V).
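A simulation sketch of Proposition 1.3 under assumed values (λ = 2, S ~ Uniform(0, 4) so that E(S) = 2, and t = 3; none of these values come from the text): conditional on N(t), arrivals are placed as unordered uniforms, each stays in service if its arrival time plus service time exceeds t, and the resulting count should have mean and variance near α(t) = λ ∫_0^t P(S > x) dx = 3.75.

import numpy as np

lam, t = 2.0, 3.0
alpha_t = lam * (3.0 - 3.0**2 / 8)    # lambda * integral_0^3 (1 - x/4) dx = 3.75

rng = np.random.default_rng(5)
counts = []
for _ in range(50_000):
    n = rng.poisson(lam * t)                          # N(t)
    arrivals = rng.uniform(0.0, t, size=n)            # unordered arrival times
    services = rng.uniform(0.0, 4.0, size=n)          # i.i.d. service times
    counts.append(np.sum(arrivals + services > t))    # still in service at time t
counts = np.array(counts)
print(alpha_t, counts.mean(), counts.var())           # mean and variance both ~ 3.75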
Example
Suppose people in NYC buy advance tickets to a movie according to a Poisson process
at rate λ = 500 (per day), and that each buyer, independent of all others, keeps the ticket
(before using) for an amount of time that is distributed as G(x) = P(S ≤ x) = x/4, x ∈
(0, 4) (days), the uniform distribution over (0, 4). Assuming that no one owns a ticket at
time t = 0, what is the expected number of ticket holders at time t = 2 days? 5 days?
5 years?

Answer: We want E(X(2)) = α(2), E(X(5)) = α(5) and E(X(5 × 360)) = α(1800)
for the M/G/∞ queue in which

    α(t) = 500 ∫_0^t P(S > x) dx.

Here P(S > x) = 1 − x/4, x ∈ (0, 4), but P(S > x) = 0, x ≥ 4. Thus

    α(2) = 500 ∫_0^2 (1 − x/4) dx = 500(3/2) = 750,

and

    α(5) = α(1800) = 500 ∫_0^4 (1 − x/4) dx = 500 E(S) = 500(2) = 1000.

The point here is that α(t) = λE(S) = ρ = 1000, t ≥ 4: From time t = 4 (days) onwards,
the limiting distribution is already reached (no need to take the limit t → ∞). It is
Poisson with mean ρ = 1000; the distribution of X(t) at time t = 5 days is the same as
at time t = 5 years.
If S has an exponential distribution with mean 2 (days), P(S > x) = e^{−0.5x}, x ≥ 0,
then the answers to the above questions would change. In this case

    α(t) = 500 ∫_0^t e^{−0.5x} dx = 1000(1 − e^{−0.5t}),   t ≥ 0.

The limiting distribution is the same (Poisson with mean ρ = 1000), but we need to take
the limit t → ∞ to reach it.
Finally, note that if (say) S has a uniform distribution over an interval (a, b) with
a > 0, then P(S > x) = 1, x ∈ [0, a), and we must use that fact when computing α(t).
For example, suppose S is uniform over (1, 3), and we want to compute α(2). Then

    α(2) = 500 [ ∫_0^1 1 dx + ∫_1^2 (3 − x)/2 dx ] = 500(1 + 3/4) = 875.
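A quick numerical check of this last value (a sketch; λ = 500 and S ~ Uniform(1, 3) as in the example), integrating the survival function P(S > x) over (0, 2) by a simple Riemann sum:

import numpy as np

lam, dx = 500.0, 1e-5
x = np.arange(0.0, 2.0, dx)
surv = np.where(x < 1.0, 1.0, (3.0 - x) / 2.0)   # P(S > x): 1 on [0,1), (3-x)/2 on [1,2)
print(lam * surv.sum() * dx)                     # close to 500 * (1 + 3/4) = 875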
