Poisson Process Basics

The Poisson process is used to model random events that occur at a constant average rate. It has three main properties: 1) the number of events in any time period has a Poisson distribution, 2) non-overlapping time periods have independent numbers of events, and 3) it has stationary increments where the distribution depends only on the time period length. The Poisson process can be constructed as the limit of a Bernoulli process as the time intervals approach zero. It is widely applied to modeling natural phenomena like earthquakes as well as human-made processes like website traffic.

11.1.2 Basic Concepts of the Poisson Process


The Poisson process is one of the most widely used counting processes. It is usually used in scenarios where we are counting the occurrences of certain events that appear to happen at a given rate, but completely at random (without a particular structure). For example, suppose that, from historical data, we know that earthquakes occur in a certain area at a rate of 2 per month. Other than this information, the timings of earthquakes seem to be completely random. Thus, we conclude that the Poisson process might be a good model for earthquakes. In practice, the Poisson process and its extensions have been used to model [24]

− the number of car accidents at a site or in an area;


− the location of users in a wireless network;

− the requests for individual documents on a web server;

− the outbreak of wars;

− photons landing on a photodiode.

Poisson random variable: Here, we briefly review some properties of the Poisson random variable that we have discussed in the previous chapters. Remember that a discrete random variable X is said to be a Poisson random variable with parameter μ, shown as X ∼ Poisson(μ), if its range is R_X = {0, 1, 2, 3, . . . }, and its PMF is given by

P_X(k) = e^(−μ) μ^k / k!  for k ∈ R_X, and P_X(k) = 0 otherwise.

Here are some useful facts that we have seen before:

1. If X ∼ Poisson(μ), then EX = μ, and Var(X) = μ.


2. If X_i ∼ Poisson(μ_i), for i = 1, 2, ⋯, n, and the X_i's are independent, then

X_1 + X_2 + ⋯ + X_n ∼ Poisson(μ_1 + μ_2 + ⋯ + μ_n).

3. The Poisson distribution can be viewed as the limit of the binomial distribution.

Theorem 11.1
Let Y_n ∼ Binomial(n, p = p(n)). Let μ > 0 be a fixed real number, and suppose lim_{n→∞} n p(n) = μ. Then, the PMF of Y_n converges to a Poisson(μ) PMF as n → ∞. That is, for any k ∈ {0, 1, 2, . . . }, we have

lim_{n→∞} P_{Y_n}(k) = e^(−μ) μ^k / k!.
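To see this convergence concretely, we can compare the Binomial(n, μ/n) PMF with its Poisson(μ) limit as n grows. A minimal sketch in Python (the choices μ = 2 and k = 3 are illustrative, not from the text):

```python
from math import comb, exp, factorial

mu, k = 2.0, 3  # illustrative parameter and evaluation point

def binom_pmf(n, p, k):
    # PMF of Binomial(n, p) at k
    return comb(n, k) * p**k * (1 - p)**(n - k)

poisson_limit = exp(-mu) * mu**k / factorial(k)

# with p = mu/n, the binomial PMF approaches the Poisson PMF
for n in [10, 100, 1000, 10000]:
    print(n, binom_pmf(n, mu / n, k))
print("Poisson(2) PMF at k=3:", poisson_limit)
```

Running this shows the binomial values approaching e^(−2) 2^3 / 3! ≈ 0.1804.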

Poisson Process as the Limit of a Bernoulli Process:


Suppose that we would like to model the arrival of events that happen completely at random
at a rate λ per unit time. Here is one way to do this. At time t = 0, we have no arrivals yet, so
N(0) = 0. We now divide the half-line [0, ∞) into tiny subintervals of length δ, as shown in Figure

11.2.

Figure 11.2 - Dividing the half-line [0, ∞) into tiny subintervals of length δ.

Each subinterval corresponds to a time slot of length δ. Thus, the intervals are (0, δ] , (δ, 2δ], (2δ, 3δ],
⋯ . More generally, the kth interval is ((k − 1)δ, kδ] . We assume that in each time slot, we toss a

coin for which P (H ) = p = λδ . If the coin lands heads up, we say that we have an arrival in that
subinterval. Otherwise, we say that we have no arrival in that interval. Figure 11.3 shows this
process. Here, we have an arrival at time t = kδ , if the kth coin flip results in a heads.

Figure 11.3 - Poisson process as a limit of a Bernoulli process.

Now, let N(t) be defined as the number of arrivals (number of heads) from time 0 to time t. There are n ≈ t/δ time slots in the interval (0, t]. Thus, N(t) is the number of heads in n coin flips. We conclude that N(t) ∼ Binomial(n, p). Note that here p = λδ, so

np = nλδ = (t/δ) ⋅ λδ = λt.

Thus, by Theorem 11.1, as δ → 0, the PMF of N (t) converges to a Poisson distribution with rate λt .
More generally, we can argue that the number of arrivals in any interval of length τ follows a
P oisson(λτ ) distribution as δ → 0.

Consider several non-overlapping intervals. The number of arrivals in each interval is


determined by the results of the coin flips for that interval. Since different coin flips are
independent, we conclude that the above counting process has independent increments.
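This Bernoulli-process construction is easy to check by simulation: flip a coin with P(H) = λδ in each of the t/δ slots and count the heads. A sketch assuming the illustrative values λ = 2, t = 1, and δ = 0.001:

```python
import random

random.seed(0)
lam, t, delta = 2.0, 1.0, 0.001  # illustrative parameters
n_slots = int(t / delta)
p = lam * delta  # success probability per tiny slot

def count_arrivals():
    # one Bernoulli trial per slot; N(t) = number of heads
    return sum(random.random() < p for _ in range(n_slots))

trials = 2000
mean_N = sum(count_arrivals() for _ in range(trials)) / trials
print(mean_N)  # should be close to lam * t = 2
```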

Definition of the Poisson Process:

The above construction can be made mathematically rigorous. The resulting random process is
called a Poisson process with rate (or intensity) λ. Here is a formal definition of the Poisson
process.
The Poisson Process

Let λ > 0 be fixed. The counting process {N(t), t ∈ [0, ∞)} is called a Poisson process with rate λ if
all the following conditions hold:

1. N (0) = 0;
2. N (t) has independent increments;
3. the number of arrivals in any interval of length τ > 0 has a Poisson(λτ) distribution.

Note that from the above definition, we conclude that in a Poisson process, the distribution of the number of arrivals in any interval depends only on the length of the interval, and not on the exact location of the interval on the real line. Therefore, the Poisson process has stationary increments.

Example 11.1
The number of customers arriving at a grocery store can be modeled by a Poisson process with
intensity λ = 10 customers per hour.

1. Find the probability that there are 2 customers between 10:00 and 10:20.
2. Find the probability that there are 3 customers between 10:00 and 10:20 and 7 customers between 10:20 and 11:00.

Solution
1. Here, λ = 10 and the interval between 10:00 and 10:20 has length τ = 1/3 hours. Thus, if X is the number of arrivals in that interval, we can write X ∼ Poisson(10/3). Therefore,

P(X = 2) = e^(−10/3) (10/3)^2 / 2! ≈ 0.2

2. Here, we have two non-overlapping intervals I_1 = (10:00 a.m., 10:20 a.m.] and I_2 = (10:20 a.m., 11 a.m.]. Thus, we can write

P(3 arrivals in I_1 and 7 arrivals in I_2) = P(3 arrivals in I_1) ⋅ P(7 arrivals in I_2).

Since the lengths of the intervals are τ_1 = 1/3 and τ_2 = 2/3 respectively, we obtain λτ_1 = 10/3 and λτ_2 = 20/3. Thus, we have

P(3 arrivals in I_1 and 7 arrivals in I_2) = [e^(−10/3) (10/3)^3 / 3!] ⋅ [e^(−20/3) (20/3)^7 / 7!] ≈ 0.0325
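Both answers in this example follow directly from the Poisson PMF, so they are easy to verify numerically. A quick check in Python (standard library only):

```python
from math import exp, factorial

def poisson_pmf(mu, k):
    # P(X = k) for X ~ Poisson(mu)
    return exp(-mu) * mu**k / factorial(k)

# Part 1: lambda * tau = 10 * (1/3)
p1 = poisson_pmf(10 / 3, 2)
# Part 2: independent increments, so the joint probability factors
p2 = poisson_pmf(10 / 3, 3) * poisson_pmf(20 / 3, 7)
print(round(p1, 4), round(p2, 4))  # ≈ 0.1982 and 0.0325
```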

Second Definition of the Poisson Process:


Let N(t) be a Poisson process with rate λ. Consider a very short interval of length Δ. Then, the number of arrivals in this interval has the same distribution as N(Δ). In particular, we can write

P(N(Δ) = 0) = e^(−λΔ) = 1 − λΔ + (λ^2/2)Δ^2 − ⋯ (Taylor series).

Note that if Δ is small, the terms that include second or higher powers of Δ are negligible compared to Δ. We write this as

P(N(Δ) = 0) = 1 − λΔ + o(Δ).   (11.1)

Here, o(Δ) denotes a function that is negligible compared to Δ as Δ → 0. More precisely, g(Δ) = o(Δ) means that

lim_{Δ→0} g(Δ)/Δ = 0.

Now, let us look at the probability of having one arrival in an interval of length Δ:

P(N(Δ) = 1) = e^(−λΔ) λΔ
            = λΔ (1 − λΔ + (λ^2/2)Δ^2 − ⋯) (Taylor series)
            = λΔ + (−λ^2 Δ^2 + (λ^3/2)Δ^3 − ⋯)
            = λΔ + o(Δ).

We conclude that

P (N (Δ) = 1) = λΔ + o(Δ) (11.2)

Similarly, we can show that

P (N (Δ) ≥ 2) = o(Δ) (11.3)

In fact, equations 11.1, 11.2, and 11.3 give us another way to define a Poisson process.
The Second Definition of the Poisson Process

Let λ > 0 be fixed. The counting process {N (t), t ∈ [0, ∞)} is called a Poisson process with rate λ if
all the following conditions hold:

1. N (0) = 0;
2. N(t) has independent and stationary increments;
3. we have
P (N (Δ) = 0) = 1 − λΔ + o(Δ),

P (N (Δ) = 1) = λΔ + o(Δ),

P (N (Δ) ≥ 2) = o(Δ).

We have already shown that any Poisson process satisfies the above definition. To show that
the above definition is equivalent to our original definition, we also need to show that any
process that satisfies the above definition also satisfies the original definition. A method to
show this is outlined in the End of Chapter Problems.
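The three small-Δ statements in the second definition can be sanity-checked against the exact Poisson probabilities: each o(Δ) remainder, divided by Δ, should vanish as Δ → 0. A sketch (λ = 2 is an illustrative choice):

```python
from math import exp

lam = 2.0  # illustrative rate
for delta in [0.1, 0.01, 0.001]:
    p0 = exp(-lam * delta)                # P(N(delta) = 0), exact
    p1 = exp(-lam * delta) * lam * delta  # P(N(delta) = 1), exact
    p2 = 1 - p0 - p1                      # P(N(delta) >= 2), exact
    # each o(delta) remainder, divided by delta, should shrink toward 0
    print(delta,
          (p0 - (1 - lam * delta)) / delta,
          (p1 - lam * delta) / delta,
          p2 / delta)
```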
Arrival and Interarrival Times:

Let N(t) be a Poisson process with rate λ. Let X_1 be the time of the first arrival. Then,

P(X_1 > t) = P(no arrival in (0, t]) = e^(−λt).

We conclude that

F_{X_1}(t) = 1 − e^(−λt) for t > 0, and F_{X_1}(t) = 0 otherwise.

Therefore, X_1 ∼ Exponential(λ). Let X_2 be the time elapsed between the first and the second arrival (Figure 11.4).

Figure 11.4 - The random variables X_1, X_2, ⋯ are called the interarrival times of the counting process N(t).

Let s > 0 and t > 0. Note that the two intervals (0, s] and (s, s + t] are disjoint. We can write

P(X_2 > t | X_1 = s) = P(no arrival in (s, s + t] | X_1 = s)
                     = P(no arrival in (s, s + t]) (independent increments)
                     = e^(−λt).

We conclude that X_2 ∼ Exponential(λ), and that X_1 and X_2 are independent. The random variables X_1, X_2, ⋯ are called the interarrival times of the counting process N(t). Similarly, we can argue that all the X_i's are independent and X_i ∼ Exponential(λ) for i = 1, 2, 3, ⋯.

Interarrival Times for Poisson Processes

If N(t) is a Poisson process with rate λ, then the interarrival times X_1, X_2, ⋯ are independent and

X_i ∼ Exponential(λ), for i = 1, 2, 3, ⋯.

Remember that if X is exponential with parameter λ > 0, then X is a memoryless random variable, that is,

P(X > x + a | X > a) = P(X > x), for a, x ≥ 0.
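The memoryless property can be verified in one line from the exponential survival function P(X > t) = e^(−λt). A small check (λ = 2, x = 1, a = 3 are illustrative values):

```python
from math import exp, isclose

lam, x, a = 2.0, 1.0, 3.0  # illustrative values

def surv(t):
    # survival function P(X > t) for X ~ Exponential(lam)
    return exp(-lam * t)

lhs = surv(x + a) / surv(a)  # P(X > x + a | X > a)
rhs = surv(x)                # P(X > x)
print(isclose(lhs, rhs))  # True: e^(-lam(x+a)) / e^(-lam*a) = e^(-lam*x)
```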

Thinking of the Poisson process, the memoryless property of the interarrival times is consistent with the independent increments property of the Poisson process. In some sense, both imply that the numbers of arrivals in non-overlapping intervals are independent. To better understand this issue, let's look at an example.

Example 11.2
Let N(t) be a Poisson process with intensity λ = 2, and let X_1, X_2, ⋯ be the corresponding interarrival times.

a. Find the probability that the first arrival occurs after t = 0.5, i.e., P(X_1 > 0.5).
b. Given that we have had no arrivals before t = 1, find P(X_1 > 3).
c. Given that the third arrival occurred at time t = 2, find the probability that the fourth arrival occurs after t = 4.
d. I start watching the process at time t = 10. Let T be the time of the first arrival that I see. In other words, T is the first arrival after t = 10. Find ET and Var(T).
e. I start watching the process at time t = 10. Let T be the time of the first arrival that I see. Find the conditional expectation and the conditional variance of T given that I am informed that the last arrival occurred at time t = 9.

Solution
a. Since X_1 ∼ Exponential(2), we can write

P(X_1 > 0.5) = e^(−2×0.5) ≈ 0.37

Another way to solve this is to note that

P(X_1 > 0.5) = P(no arrivals in (0, 0.5]) = e^(−2×0.5) ≈ 0.37

b. We can write

P(X_1 > 3 | X_1 > 1) = P(X_1 > 2) (memoryless property)
                     = e^(−2×2)
                     ≈ 0.0183

Another way to solve this is to note that the number of arrivals in (1, 3] is independent of the arrivals before t = 1. Thus,

P(X_1 > 3 | X_1 > 1) = P(no arrivals in (1, 3] | no arrivals in (0, 1])
                     = P(no arrivals in (1, 3]) (independent increments)
                     = e^(−2×2)
                     ≈ 0.0183

c. The time between the third and the fourth arrival is X_4 ∼ Exponential(2). Thus, the desired conditional probability is equal to

P(X_4 > 2 | X_1 + X_2 + X_3 = 2) = P(X_4 > 2) (independence of the X_i's)
                                 = e^(−2×2)
                                 ≈ 0.0183

d. When I start watching the process at time t = 10, I will see a Poisson process. Thus, the time until the first arrival, measured from t = 10, is Exponential(2). In other words, we can write

T = 10 + X,

where X ∼ Exponential(2). Thus,

ET = 10 + EX = 10 + 1/2 = 21/2,

Var(T) = Var(X) = 1/4.

e. Arrivals before t = 10 are independent of arrivals after t = 10. Thus, knowing that the last arrival occurred at time t = 9 does not impact the distribution of the first arrival after t = 10. Thus, if A is the event that the last arrival occurred at t = 9, we can write

E[T|A] = E[T] = 21/2,

Var(T|A) = Var(T) = 1/4.
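Part (d) can be double-checked by simulation: generate arrival times as running sums of Exponential(2) interarrival times and record the first arrival after t = 10. A sketch (the sample size is an arbitrary choice):

```python
import random

random.seed(1)
lam, t0 = 2.0, 10.0

def first_arrival_after(t0):
    # accumulate interarrival times until we pass t0
    t = 0.0
    while t <= t0:
        t += random.expovariate(lam)
    return t

samples = [first_arrival_after(t0) for _ in range(20000)]
mean_T = sum(samples) / len(samples)
var_T = sum((s - mean_T) ** 2 for s in samples) / len(samples)
print(mean_T, var_T)  # theory: ET = 21/2 = 10.5, Var(T) = 1/4
```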

Now that we know the distribution of the interarrival times, we can find the distribution of the arrival times

T_1 = X_1,

T_2 = X_1 + X_2,

T_3 = X_1 + X_2 + X_3,

⋮

More specifically, T_n is the sum of n independent Exponential(λ) random variables. In previous chapters, we have seen that if T_n = X_1 + X_2 + ⋯ + X_n, where the X_i's are independent Exponential(λ) random variables, then T_n ∼ Gamma(n, λ). This has been shown using MGFs. Note that here n ∈ ℕ. The Gamma(n, λ) distribution is also called the Erlang distribution, i.e., we can write

T_n ∼ Erlang(n, λ) = Gamma(n, λ), for n = 1, 2, 3, ⋯.

The PDF of T_n, for n = 1, 2, 3, ⋯, is given by

f_{T_n}(t) = λ^n t^(n−1) e^(−λt) / (n − 1)!, for t > 0.

Remember that if X ∼ Exponential(λ), then

E[X] = 1/λ,  Var(X) = 1/λ^2.

Since T_n = X_1 + X_2 + ⋯ + X_n, we conclude that

E[T_n] = nE[X_1] = n/λ,  Var(T_n) = nVar(X_1) = n/λ^2.

Note that the arrival times are not independent. In particular, we must have T_1 ≤ T_2 ≤ T_3 ≤ ⋯.
Arrival Times for Poisson Processes

If N(t) is a Poisson process with rate λ, then the arrival times T_1, T_2, ⋯ satisfy T_n ∼ Gamma(n, λ). In particular, for n = 1, 2, 3, ⋯, we have

E[T_n] = n/λ, and Var(T_n) = n/λ^2.
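These moment formulas are easy to confirm by simulating T_n as a sum of n independent exponentials (n = 3 and λ = 2 are illustrative choices):

```python
import random

random.seed(2)
n, lam, trials = 3, 2.0, 50000  # illustrative choices

# T_n = X_1 + ... + X_n with X_i ~ Exponential(lam)
samples = [sum(random.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean_T = sum(samples) / trials
var_T = sum((s - mean_T) ** 2 for s in samples) / trials
print(mean_T, var_T)  # theory: n/lam = 1.5, n/lam**2 = 0.75
```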

The above discussion suggests a way to simulate (generate) a Poisson process with rate λ. We first generate i.i.d. random variables X_1, X_2, X_3, ⋯, where X_i ∼ Exponential(λ). Then the arrival times are given by

T_1 = X_1,

T_2 = X_1 + X_2,

T_3 = X_1 + X_2 + X_3,

⋮


The print version of the book is available through Amazon here.
