
DMS625: Introduction to stochastic processes and their applications
Sourav Majumdar

Renewal Theory

November 15, 2024

1 Introduction
While studying CTMCs we saw that a stochastic system has the Markov property if and only if the times spent in each state are independent and exponentially distributed. For a Poisson process the time spent in a state is the interarrival time, which is iid exponential. We now consider a generalization of the Poisson process in which the interarrival times follow an arbitrary iid distribution.
Formally, a counting process {N(t), t ≥ 0} is said to be a renewal process if the interarrival times X1, X2, . . . , Xn ∼ F iid. The Poisson process is the renewal process for which F is the exponential distribution. It follows from the above that if F is not the exponential distribution, then N(t) does not have the Markov property; so in general renewal processes need not be Markovian. Using renewal processes we can model all the phenomena we considered while studying Poisson processes, such as arrivals of natural disasters, shocks, customers in queues, traffic, etc., but as non-Markovian processes.
We will now introduce some notation that we also came across during our discussion of Poisson processes. Sn is the waiting time for the n-th arrival, with S0 = 0 and

Sn = Σ_{i=1}^{n} Xi,   n ≥ 1

Now say S3 ≤ t, i.e., the 3rd arrival happened by time t and S4 > t, i.e., the 4th arrival happened
after time t. What is N (t)?
Notice that since the third arrival happened before time t and the 4th arrival happened after
time t, it implies that at time t, there are only 3 arrivals. Therefore, N (t) = 3. We can therefore
conclude a much more general result,

N (t) = max{n : Sn ≤ t}

Exercises
1. Verify that N(t) ≥ n ⇔ Sn ≤ t, i.e., N(t) ≥ n implies Sn ≤ t, and Sn ≤ t implies N(t) ≥ n.

Suppose after infinite time has passed the number of arrivals is finite, i.e., N(∞) < ∞. This would imply that in an infinite amount of time there were only finitely many arrivals, and therefore that one of the interarrival times, say Xm, m < ∞, is infinite. We shall not consider distributions F for which P(Xm = ∞) > 0. Therefore Xm cannot be infinite, and N(∞) = ∞.
The following identity holds for a Renewal process.

P(N(t) = n) = P(N(t) ≥ n) − P(N(t) ≥ n + 1)
            = P(Sn ≤ t) − P(Sn+1 ≤ t)   (1)

Example 1.1. Suppose X1, X2, . . . , Xn ∼ Geo(p), i.e., P(Xi = k) = p(1 − p)^{k−1}, k ≥ 1. That is, the interarrival times are geometrically distributed.
Verify that P(Sn = k) = (k−1 choose n−1) p^n (1 − p)^{k−n}.
Therefore, from (1),

P(N(t) = n) = Σ_{k=n}^{⌊t⌋} (k−1 choose n−1) p^n (1 − p)^{k−n} − Σ_{k=n+1}^{⌊t⌋} (k−1 choose n) p^{n+1} (1 − p)^{k−n−1}
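The two-sum expression above can be checked numerically against a simulation of the process. A minimal sketch (stdlib only; the function names `p_N_eq` and `simulate_N`, and the parameter choices, are ours):

```python
import math
import random

def p_N_eq(t, n, p):
    """P(N(t) = n) for Geo(p) interarrival times, via identity (1)
    and the negative-binomial form of P(S_n = k)."""
    if n == 0:
        # P(N(t) = 0) = P(S_1 > t) = (1 - p)^{floor(t)}
        return (1 - p) ** math.floor(t)
    first = sum(math.comb(k - 1, n - 1) * p ** n * (1 - p) ** (k - n)
                for k in range(n, math.floor(t) + 1))
    second = sum(math.comb(k - 1, n) * p ** (n + 1) * (1 - p) ** (k - n - 1)
                for k in range(n + 1, math.floor(t) + 1))
    return first - second

def simulate_N(t, p, rng):
    """One sample of N(t): count arrivals with Geo(p) interarrival times."""
    s, n = 0, 0
    while True:
        x = 1                      # sample X ~ Geo(p) on {1, 2, ...}
        while rng.random() >= p:
            x += 1
        s += x
        if s > t:
            return n
        n += 1
```

Since the interarrival times are at least 1, N(10) ≤ 10, and the probabilities p_N_eq(10, n, p) for n = 0, . . . , 10 should sum to 1 (the sums in (1) telescope).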

Proposition 1.1. Let X be a non-negative, integer valued random variable. Then

E[X] = Σ_{n=1}^{∞} P(X ≥ n)

Proof.

E[X] = Σ_{n=0}^{∞} n P(X = n)
     = 0 × P(X = 0) + Σ_{n=1}^{∞} n P(X = n)
     = Σ_{n=1}^{∞} n P(X = n)
     = P(X = 1) + 2P(X = 2) + 3P(X = 3) + 4P(X = 4) + . . .
     = P(X = 1)
     + P(X = 2) + P(X = 2)
     + P(X = 3) + P(X = 3) + P(X = 3)
     + P(X = 4) + P(X = 4) + P(X = 4) + P(X = 4)
     + . . .
     = P(X ≥ 1) + P(X ≥ 2) + P(X ≥ 3) + . . .
     = Σ_{n=1}^{∞} P(X ≥ n)
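The tail-sum identity is easy to sanity-check for a concrete distribution. A sketch using Geo(p), for which P(X ≥ n) = (1 − p)^{n−1} and E[X] = 1/p (the function name and truncation level `n_max` are ours):

```python
def tail_sum_mean(pmf, n_max):
    """E[X] via the tail-sum identity E[X] = sum_{n>=1} P(X >= n),
    truncated at n_max; pmf is supported on {1, 2, ...}."""
    return sum(sum(pmf(k) for k in range(n, n_max + 1))  # P(X >= n), truncated
               for n in range(1, n_max + 1))
```

The truncated double sum rearranges exactly into Σ k·pmf(k), so the two truncated computations agree to floating-point precision.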

We define the mean renewal function m(t) as

m(t) = E[N(t)]

It also follows that

m(t) = E[N(t)]
     = Σ_{n=1}^{∞} P(N(t) ≥ n)   (applying the above proposition)
     = Σ_{n=1}^{∞} P(Sn ≤ t)

We now state two results without proof.

1. m(t) uniquely determines the renewal process, i.e., each renewal process has a unique mean renewal function.
2. m(t) < ∞ for all t < ∞, i.e., for every renewal process the mean renewal function is finite in finite time.

Renewal equation
We will derive a key identity for working with renewal processes, referred to as the renewal equation.
First observe that

m(t) = E[N(t)] = ∫_0^∞ E[N(t) | X1 = x] f(x) dx

In the above, X1 is the first interarrival time and f(x) = (d/dx) F(x). Now note that if x < t, then E[N(t) | X1 = x] > 0; however, if x > t, it follows that E[N(t) | X1 = x] = 0. Verify this.
Therefore,

m(t) = E[N(t)] = ∫_0^t E[N(t) | X1 = x] f(x) dx   (2)

For x < t, it follows that (verify!)

E[N(t) | X1 = x] = 1 + E[N(t − x)]

Therefore, substituting the above in (2), we obtain the renewal equation:

m(t) = ∫_0^t [1 + m(t − x)] f(x) dx
     = F(t) + ∫_0^t m(t − x) f(x) dx   (3)

Example 1.2. Let Xi ∼ Unif(0, 1) iid, 0 ≤ t ≤ 1. Find m(t).
Applying the renewal equation,

m(t) = F(t) + ∫_0^t m(t − x) f(x) dx
     = t + ∫_0^t m(t − x) dx
     = t + ∫_0^t m(y) dy   (substituting y = t − x)

Differentiating both sides with respect to t,

dm(t)/dt = 1 + m(t)
dm(t)/(1 + m(t)) = dt
log(1 + m(t)) = t   (the constant of integration is 0 since m(0) = 0)
m(t) = e^t − 1
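The closed form m(t) = e^t − 1 on [0, 1] can be checked by simulation. A minimal sketch (the function name, sample size, and seed are ours):

```python
import math
import random

def m_hat(t, n_runs=40000, seed=0):
    """Monte Carlo estimate of m(t) = E[N(t)] for Unif(0,1) interarrivals."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        s, n = 0.0, 0
        while True:
            s += rng.random()       # X_i ~ Unif(0, 1)
            if s > t:
                break
            n += 1
        total += n
    return total / n_runs
```

For example, m_hat(0.8) should be close to e^0.8 − 1 ≈ 1.2255.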

Limit theorems for Renewal processes

Theorem 1.1 (Strong Law of Large Numbers (SLLN)). Let Xi be iid random variables with E[Xi] = µ < ∞. Define

Sn = X1 + X2 + . . . + Xn

Then

P( lim_{n→∞} Sn/n = µ ) = 1

This is also alternatively expressed as

Sn/n → µ with probability 1 (also written as “w.p. 1”)

The strong law of large numbers says that the arithmetic mean of a large number of iid random variables converges to their common expected value.
Theorem 1.2. Let Xi denote the interarrival times of a renewal process, with E[Xi] = µ. Then

N(t)/t → 1/µ as t → ∞, w.p. 1

Proof. Verify that

S_{N(t)} ≤ t ≤ S_{N(t)+1}

Now,

S_{N(t)}/N(t) ≤ t/N(t) ≤ ( S_{N(t)+1}/(N(t) + 1) ) · ( (N(t) + 1)/N(t) )

Consider

lim_{t→∞} S_{N(t)}/N(t) = lim_{t→∞} (X1 + X2 + . . . + X_{N(t)})/N(t) = µ   (w.p. 1, by SLLN)

Similarly,

lim_{t→∞} S_{N(t)+1}/(N(t) + 1) = µ   (w.p. 1, by SLLN)

and

lim_{t→∞} (N(t) + 1)/N(t) = lim_{t→∞} ( 1 + 1/N(t) ) = 1   (since N(∞) = ∞)

Using these and applying the sandwich theorem, t/N(t) → µ w.p. 1, and the result follows.
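The theorem can be illustrated on a single long sample path. A sketch using Unif(1, 3) interarrivals (so µ = 2 and the rate should approach 1/2); the distribution, function name, and seed are our own choices:

```python
import random

def long_run_rate(t, sample_x, seed=1):
    """N(t)/t along one long sample path; sample_x(rng) draws one
    interarrival time."""
    rng = random.Random(seed)
    s, n = 0.0, 0
    while True:
        s += sample_x(rng)
        if s > t:
            return n / t
        n += 1
```
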

Exercises
1. Suppose the lifetime of a bulb is distributed Unif(30, 60) days. Each time a bulb breaks down,
we replace it with another bulb. Find the long-run rate (bulb replaced per day) of replacing
the bulbs.
2. In the previous question now assume that after the bulb breaks down, you go to the market to
buy a bulb. This buying time is distributed as Unif(0, 1) days. Find the long-run rate (bulb
replaced per day) of replacing the bulbs.

Theorem 1.3 (Wald’s theorem). Let Xi be iid random variables and let N be a positive integer valued random variable, independent of the Xi. Then

E[ Σ_{i=1}^{N} Xi ] = E[N] E[Xi]

Wald’s theorem gives the expectation of a random sum, i.e., a sum whose number of terms is itself random (it depends on the random variable N). We can derive the expectation of a compound Poisson process using Wald’s theorem.
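Wald’s identity is easy to check by simulation when N really is independent of the Xi. A sketch with N ∼ Unif{1, . . . , 10} and Xi ∼ Exp(1), so E[N]E[X] = 5.5 (these distributional choices, the function name, and the seed are ours):

```python
import random

def random_sum_mean(n_trials, seed=2):
    """Monte Carlo estimate of E[X_1 + ... + X_N] with N ~ Unif{1,...,10}
    drawn independently of the iid X_i ~ Exp(1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        n = rng.randint(1, 10)      # N, independent of the X_i
        total += sum(rng.expovariate(1.0) for _ in range(n))
    return total / n_trials
```
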
Proposition 1.2. For a renewal process N(t) with E[X1] = µ,

E[S_{N(t)+1}] = µ(m(t) + 1)

Proof. Apply Wald’s theorem. Left as an exercise.


We will now state a result that is referred to as the elementary renewal theorem.

Theorem 1.4 (Elementary renewal theorem).

m(t)/t → 1/µ as t → ∞

Proof. Recall that

t ≤ S_{N(t)+1}

Write S_{N(t)+1} = t + Y(t), where Y(t) denotes the excess time after t until the (N(t) + 1)-th arrival. Note that Y(t) ≥ 0. Taking expectations and using Proposition 1.2,

E[S_{N(t)+1}] = E[t + Y(t)]
µ(m(t) + 1) = t + E[Y(t)]

Rearranging,

m(t)/t = 1/µ + E[Y(t)]/(tµ) − 1/t

Since Y(t) ≥ 0, we have E[Y(t)] ≥ 0. Hence

m(t)/t ≥ 1/µ − 1/t

Therefore,

lim_{t→∞} m(t)/t ≥ 1/µ

Now note that Y(t) ≤ X_{N(t)+1}. (Thanks to Aditya Kumar for this idea.) Therefore,

E[Y(t)] ≤ E[X_{N(t)+1}] = µ

Therefore,

m(t)/t ≤ 1/µ + E[X_{N(t)+1}]/(tµ) − 1/t = 1/µ + µ/(tµ) − 1/t = 1/µ

and

lim_{t→∞} m(t)/t ≤ 1/µ

Now apply the sandwich theorem to conclude the result.

Remark 1. Instead of the proof above, you might be tempted to take expectations directly in Theorem 1.2. However, that would be an incorrect step, even though in this case it appears to yield the correct result.
Let Yn be a sequence of random variables. In general it is not true that lim_{n→∞} E[Yn] equals E[lim_{n→∞} Yn]. To see this, consider the following. Let U ∼ Unif(0, 1) and

Yn = n if U ≤ 1/n,   Yn = 0 if U > 1/n

Now note that Yn → 0 as n → ∞ w.p. 1, while E[Yn] = 1 for every n. Verify these.
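This counterexample can be checked numerically: for large n the empirical mean of Yn stays near 1, while almost every individual draw is 0. A sketch (the choice n = 100 and the sample size are ours):

```python
import random

def sample_Yn(n, rng):
    """One draw of Y_n: equals n if U <= 1/n, else 0, with U ~ Unif(0, 1)."""
    return n if rng.random() <= 1 / n else 0
```
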
We now state the central limit theorem for renewal processes without proof.

Theorem 1.5 (CLT for renewal processes). Let N(t) be a renewal process whose interarrival times have mean µ and variance σ². Then, for large t, N(t) is approximately distributed as

N( t/µ, tσ²/µ³ )

Remark 2. The mean of the limiting distribution is t/µ, i.e.,

E[N(t)] ≈ t/µ for large t

Note that this is in agreement with the conclusion of the first limit theorem for renewal processes. The additional insight we obtain from the CLT for renewal processes is

Var[N(t)] ≈ tσ²/µ³ for large t

Exercises
1. Find the limiting distribution for large t for a Poisson process.
2. Let N1 (t) and N2 (t) be two independent renewal processes. The interarrival times for N1 (t) are
Unif(0, 1) and the interarrival times for N2 (t) are Exp(5). Find the approximate distribution
of N1 (500) + N2 (500).

Renewal-reward processes

Renewal-reward processes may be considered a generalization of compound Poisson processes. Think of an insurer that receives claims following a renewal process, where the size of each claim differs. There are two sources of randomness: first, the arrivals of the claims are random, following a renewal process; second, the sizes of the claims are also random. We want to understand, at any point in time, the total size of the claims the insurer has received.
Formally, let the arrivals follow a renewal process N(t), and let the i-th arrival bring with it a reward Ri (think of the claim sizes in the above example). Then the total reward received by time t is

R(t) = Σ_{i=1}^{N(t)} Ri

We assume that the Ri are iid random variables. Let Xi denote the interarrival times of N(t). Then E[Xi] = E[X] and E[Ri] = E[R].
Proposition 1.3. Let E[R] < ∞ and E[X] < ∞. Then

R(t)/t → E[R]/E[X] as t → ∞, w.p. 1

Proof. Apply the SLLN and the first limit theorem. Left as an exercise.

We state the second result without proof.

Proposition 1.4. Let E[R] < ∞ and E[X] < ∞. Then

E[R(t)]/t → E[R]/E[X] as t → ∞
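Proposition 1.3 can be illustrated on one long path. A sketch with Exp(1) interarrivals and Unif(0, 2) rewards, so E[R]/E[X] = 1 (these distributional choices, the function name, and the seed are ours):

```python
import random

def reward_rate(t, seed=5):
    """R(t)/t for one path: interarrivals Exp(1) (E[X] = 1) and rewards
    Unif(0, 2) (E[R] = 1), so the long-run rate should be E[R]/E[X] = 1."""
    rng = random.Random(seed)
    s, total = 0.0, 0.0
    while True:
        s += rng.expovariate(1.0)       # next arrival time
        if s > t:
            return total / t
        total += rng.uniform(0.0, 2.0)  # reward carried by this arrival
```
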

Exercises
1. Suppose the lifetime X of a car follows a distribution with CDF H(x) and density h(x). The owner buys a new car when it breaks down or when its age reaches T. The owner incurs a cost of C1 if the car's age reaches T, and C1 + C2 if it breaks down. Show that the long-run time-average cost of replacing a car is

   (C1 + C2 H(T)) / ( ∫_0^T x h(x) dx + T(1 − H(T)) )

2. In the above problem, X ∼ Unif(0, 10), C1 = 3 and C2 = 1/2. Find the T that minimizes the long-run average cost of replacing a car.

Age and Excess of a Renewal process


We define the age of a renewal process as A(t) = t − SN (t) and the excess of a renewal process as
Y (t) = SN (t)+1 − t. The figure below gives a pictorial illustration of the age and excess.

We are in particular interested in understanding the long-run time-average excess, given by

lim_{t→∞} (1/t) ∫_0^t Y(s) ds

and the long-run time-average age, given by

lim_{t→∞} (1/t) ∫_0^t A(s) ds

We will soon see the significance both these quantities hold. The following presentation and proof are adapted from the MIT OCW notes on renewal processes.
We are interested in first understanding the integral

∫_0^t Y(s) ds

Take a look at the figure below. The top plot has N(t) on the y-axis and t on the x-axis; it shows how N(t) changes in time with each arrival. The plot below it shows the excess on the y-axis against time on the x-axis. Verify that the lower plot makes sense. Each triangle in the figure is a right-angled isosceles triangle. Now note that

∫_0^t Y(u) du = Σ_{i=1}^{N(t)} Xi²/2 + ∫_{S_{N(t)}}^t Y(u) du

The first term is just the area under the triangles for the first N(t) arrivals. The second term appears because the (N(t) + 1)-th arrival has not yet occurred, so its triangle is incomplete; we denote its area by an integral.
Since Y(t) ≥ 0, we have ∫_{S_{N(t)}}^t Y(u) du ≥ 0, and it follows that

∫_0^t Y(u) du ≥ Σ_{i=1}^{N(t)} Xi²/2   (4)

Since the (N(t) + 1)-th arrival has not yet occurred,

X_{N(t)+1}²/2 ≥ ∫_{S_{N(t)}}^t Y(u) du

Verify this. Hence we obtain

∫_0^t Y(u) du ≤ Σ_{i=1}^{N(t)} Xi²/2 + X_{N(t)+1}²/2 = Σ_{i=1}^{N(t)+1} Xi²/2

Therefore,

Σ_{i=1}^{N(t)} Xi²/2 ≤ ∫_0^t Y(u) du ≤ Σ_{i=1}^{N(t)+1} Xi²/2   (5)
We are interested in the long-run time-average; therefore, dividing by t and taking limits,

lim_{t→∞} Σ_{i=1}^{N(t)} Xi²/(2t) ≤ lim_{t→∞} (1/t) ∫_0^t Y(u) du ≤ lim_{t→∞} Σ_{i=1}^{N(t)+1} Xi²/(2t)   (6)

Verify that (apply the SLLN and the first limit theorem for renewal processes)

lim_{t→∞} Σ_{i=1}^{N(t)} Xi²/(2t) = E[X²]/(2E[X])   w.p. 1

and

lim_{t→∞} Σ_{i=1}^{N(t)+1} Xi²/(2t) = E[X²]/(2E[X])   w.p. 1

Here E[Xi²] = E[X²]. Now applying the sandwich theorem we conclude:

Proposition 1.5.

lim_{t→∞} (1/t) ∫_0^t Y(u) du = E[X²]/(2E[X])   w.p. 1

and,

Proposition 1.6.

lim_{t→∞} (1/t) ∫_0^t A(u) du = E[X²]/(2E[X])   w.p. 1

Proof. Left as an exercise. Note that the plot of A(t) is a “reflection” of the plot of Y(t); all else remains the same.
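The time-average excess can be checked by accumulating the triangle areas along one path. A sketch with Unif(0, 1) interarrivals, for which E[X²]/(2E[X]) = (1/3)/1 = 1/3 (the function name, horizon, and seed are ours):

```python
import random

def time_average_excess(t, seed=6):
    """(1/t) * integral_0^t Y(s) ds for Unif(0, 1) interarrivals. Over a
    full interarrival of length x the excess traces a triangle of area
    x^2/2; the final, cut-off interval is integrated only up to t."""
    rng = random.Random(seed)
    s, area = 0.0, 0.0
    while True:
        x = rng.random()                     # X_i ~ Unif(0, 1)
        if s + x > t:
            # integrate Y(u) = s + x - u over [s, t] only
            area += (x * x - (s + x - t) ** 2) / 2.0
            return area / t
        s += x
        area += x * x / 2.0
```
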

Inspection Paradox

The inspection paradox says the following: suppose buses arrive at a bus station following a renewal process, and you arrive at the station at time t to catch a bus. Then you are likely to wait longer than an "ordinary" interarrival time would suggest.
Note that the interarrival interval in which you wait is X_{N(t)+1}, where at time t you do not know how many buses have already passed. By an ordinary bus we mean one whose interarrival time is Xi, where i is some fixed, known index. Since N(t) is random, we cannot conclude that X_{N(t)+1} and Xi are identically distributed.
In fact they are not identically distributed. Formally, the inspection paradox is the following.

Proposition 1.7.

P(X_{N(t)+1} ≥ x) ≥ P(X1 ≥ x)
Proof.

P(X_{N(t)+1} > x) = ∫_0^∞ P(X_{N(t)+1} > x | S_{N(t)} = s) f_{S_{N(t)}}(s) ds

  = ∫_0^∞ P(X_{N(t)+1} > x | X_{N(t)+1} > t − s) f_{S_{N(t)}}(s) ds

  = ∫_0^∞ [ P(X_{N(t)+1} > x, X_{N(t)+1} > t − s) / P(X_{N(t)+1} > t − s) ] f_{S_{N(t)}}(s) ds

  = ∫_0^∞ [ (1 − F(max(x, t − s))) / (1 − F(t − s)) ] f_{S_{N(t)}}(s) ds

  = ∫_0^∞ min( (1 − F(x)) / (1 − F(t − s)), (1 − F(t − s)) / (1 − F(t − s)) ) f_{S_{N(t)}}(s) ds

  = ∫_0^∞ min( (1 − F(x)) / (1 − F(t − s)), 1 ) f_{S_{N(t)}}(s) ds

  ≥ ∫_0^∞ (1 − F(x)) f_{S_{N(t)}}(s) ds

(Note that 1 − F(x) ≤ 1 and 1 − F(x) ≤ (1 − F(x)) / (1 − F(t − s)), since 1 − F(t − s) ≤ 1. Therefore, 1 − F(x) ≤ min( (1 − F(x)) / (1 − F(t − s)), 1 ).)

  = (1 − F(x)) ∫_0^∞ f_{S_{N(t)}}(s) ds = 1 − F(x)   (f_{S_{N(t)}}(s) is a probability density function)

  = P(X1 ≥ x)

This result says that the probability of waiting more than x units of time for the (N(t) + 1)-th bus is greater than that for an ordinary bus, where ordinary is in the sense described above. It also follows that:

Proposition 1.8.

E[X_{N(t)+1}] ≥ E[X1]

Proof. We will first integrate the LHS of the above inequality, P(X_{N(t)+1} ≥ x):

∫_0^∞ P(X_{N(t)+1} ≥ x) dx = ∫_0^∞ ∫_x^∞ f_{X_{N(t)+1}}(u) du dx
  = ∫_0^∞ ∫_0^u f_{X_{N(t)+1}}(u) dx du   (changing the order of integration)
  = ∫_0^∞ u f_{X_{N(t)+1}}(u) du
  = E[X_{N(t)+1}]

Similarly, integrating the RHS, P(X1 ≥ x), gives E[X1]. Since the inequality of Proposition 1.7 holds pointwise in x, it is preserved under integration, and we conclude

E[X_{N(t)+1}] ≥ E[X1]

The above can also be proved using integration by parts. Try this! (Thanks to Keshav Ranjan and Divyansh Agarwal for this suggestion.)
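The paradox shows up clearly in simulation: sampling the interarrival interval that straddles a fixed time t gives a length-biased draw. A sketch with Unif(0, 1) interarrivals, where E[X1] = 1/2 but the straddling interval has limiting mean E[X²]/E[X] = 2/3 (the function name, horizon, and seed are ours):

```python
import random

def straddling_interval(t, rng):
    """Length of the interarrival interval containing time t, i.e.
    X_{N(t)+1}, for Unif(0, 1) interarrival times."""
    s = 0.0
    while True:
        x = rng.random()
        if s + x > t:       # this interval straddles t
            return x
        s += x
```

Intuitively, a long interval is more likely to cover the fixed time t than a short one, which is exactly the inspection paradox.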

Regenerative processes

A regenerative process is a generalization of a renewal process. Renewal processes count occurrences of random events, so the state-space of a renewal process is the set of non-negative integers. Regenerative processes can have more general state-spaces, including the set of real numbers.
A stochastic process X(t) is said to be a regenerative process if there is a sequence of random times T1, T2, . . ., referred to as the regeneration times, such that:

1. X(t) and X(t + Ti) have the same distribution.

2. X(t + Ti) is independent of the past, i.e., independent of {X(s), 0 ≤ s < Ti}.

Some examples of regenerative processes include the renewal process itself (each arrival is a regeneration time) and the age and excess processes (again, each arrival is a regeneration time).

References
1. Sheldon Ross, Introduction to Probability Models, Academic Press, 2024.
2. Rick Durrett, Essentials of Stochastic Processes, Springer, 1999.
