APPLIED STOCHASTIC PROCESSES
Instructor: Dr. Bohai Zhang
Office: T3-501-R10
Email: [email protected]
TA: Mr. Yida Gu
Email: [email protected]
Course Introduction
Textbook:
Stochastic Processes, Sheldon M. Ross, 2nd ed., John Wiley & Sons, Inc., 1996.
Course Introduction
Reference:
1. Introduction to Probability Models, Sheldon M. Ross, Academic Press, 9th ed., 2007. (The 7th and 8th editions can also be used.)
Course Introduction
Reference:
2. An Introduction to Stochastic Processes, Kao, Edward P.C., Wadsworth Publishing Company, 1997.
Content:
1. Review of probability theory
2. Poisson process
3. Markov chain process (discrete time)
4. Markov chain process (continuous time)
5. Renewal process
6. Brownian motion and other diffusion processes
Course Introduction
CILOs-PILOs Mapping Matrix
Course Code & Title: STAT4110 Applied Stochastic Processes

CILO 1: Explain the basic concepts of stochastic processes. (PILOs addressed: 2, 3)
CILO 2: Evaluate the application of finite Markov chains, infinite Markov chains, and some branching processes. (PILOs addressed: 2, 3)
CILO 3: Appraise stochastic processes through solving real-life problems. (PILOs addressed: 2, 3, 4)
Assessment
Class Discussion and Participation (10%)
Assignment (20%)
In-class Exercises (30%)
Final Examination (40%)
Prediction of weather and climate
State space = {SUNNY, RAINY}
X(day i) = "S" or "R": a random variable that varies with the day.

Day:      Day 1  Day 2  Day 3  Day 4  Day 5  Day 6  Day 7
          THU    FRI    SAT    SUN    MON    TUE    WED
Weather:  S      S      R      S      R      S      S

{X(day i)} is a stochastic process: X(day i) is the status of the weather observed each day.
Gambler’s Ruin Problem
Consider a gambler John who starts with an initial
fortune of $2 and then on each successive gamble either
wins $1 or loses $1 independent of the past with
probabilities p=0.6 and q=1-p=0.4 respectively.
What is the probability that John obtains a fortune of $4 without going broke?
What is the probability that John will become infinitely
rich?
If John instead started with an initial fortune of $1, what
is the probability that he would go broke?
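These questions have closed-form answers from the classical gambler's ruin analysis: the probability of reaching $N before $0 from $i is (1 - (q/p)^i)/(1 - (q/p)^N), and with no upper target the ruin probability from $i is (q/p)^i when p > q. A minimal Python sketch of these formulas with a Monte Carlo cross-check (function names are ours):

```python
import random

def p_reach_N_before_0(i, N, p):
    """P(fortune hits N before 0 | start at i), each $1 bet won w.p. p."""
    q = 1.0 - p
    if p == q:
        return i / N
    r = q / p
    return (1 - r**i) / (1 - r**N)

def simulate(i, N, p, trials=100_000):
    """Monte Carlo estimate of the same probability."""
    wins = 0
    for _ in range(trials):
        fortune = i
        while 0 < fortune < N:
            fortune += 1 if random.random() < p else -1
        wins += (fortune == N)
    return wins / trials

p = 0.6
print(p_reach_N_before_0(2, 4, p))  # P(reach $4 from $2) = 45/65 ≈ 0.692
print(1 - (0.4 / 0.6) ** 2)         # P(infinitely rich from $2) = 1 - (q/p)^2 ≈ 0.556
print((0.4 / 0.6) ** 1)             # P(broke from $1) = (q/p)^1 ≈ 0.667
print(simulate(2, 4, p))            # ≈ 0.692
```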
Population model (Branching Process)
A population begins with a single individual. In each generation, the number of offspring of an individual is 0, 1, or 2 with respective probabilities a > 0, b > 0, and c > 0, where a + b + c = 1.
Let X_n denote the number of individuals in the population in the nth generation. Find the mean and variance of X_n.
(Illustration: a sample family tree with X_0 = 1, X_1 = 3, X_2 = 9, where Y_ij denotes the number of offspring of individual i in generation j.)
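As a sanity check on the answer, here is a minimal simulation sketch (the function name and the particular choice a = c = 0.25, b = 0.5 are ours). It compares the empirical mean and variance of X_n with the standard branching-process formulas E[X_n] = m^n and, in the critical case m = 1, Var(X_n) = n σ², where m and σ² are the offspring mean and variance:

```python
import random
from statistics import mean, pvariance

def sample_Xn(n, a, b, c):
    """One realization of X_n: each individual leaves 0, 1, or 2
    offspring with probabilities a, b, c, starting from X_0 = 1."""
    pop = 1
    for _ in range(n):
        if pop == 0:
            break
        pop = sum(random.choices([0, 1, 2], weights=[a, b, c], k=pop))
    return pop

a, b, c, n = 0.25, 0.5, 0.25, 5
m = b + 2 * c                # offspring mean
s2 = b + 4 * c - m * m       # offspring variance
samples = [sample_Xn(n, a, b, c) for _ in range(50_000)]
print(mean(samples), m ** n)                       # E[X_n] = m^n = 1 here
print(pvariance(samples), n * s2 if m == 1 else
      s2 * m ** (n - 1) * (m ** n - 1) / (m - 1))  # Var(X_n) = 2.5 here
```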
Stock Price
Geometric Brownian Motion (GBM)
Stock price returns follow the stochastic process
dS_t = μ S_t dt + σ S_t dW_t,
where dS_t/S_t is the percentage change in the stock price, μ is the drift coefficient, and σ is the volatility (the impact of the random term).
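A minimal sketch of simulating one GBM path via its exact log-normal solution, S_{t+Δt} = S_t exp((μ - σ²/2)Δt + σ√Δt Z) with Z ~ N(0, 1); the parameter values are illustrative assumptions:

```python
import math, random

def gbm_path(S0, mu, sigma, T, steps):
    """One sample path of dS_t = mu*S_t dt + sigma*S_t dW_t,
    using the exact log-normal update on a grid of size T/steps."""
    dt = T / steps
    path = [S0]
    for _ in range(steps):
        z = random.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma ** 2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

path = gbm_path(S0=100.0, mu=0.05, sigma=0.2, T=1.0, steps=252)
print(path[-1])   # simulated price after one year (252 trading days)
```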
Other stochastic processes
Birth and death process
Queueing process
Poisson process
Jump diffusion process
…
CHAPTER 1
PRELIMINARIES
§1.1. Probability.
Sample space and Events
A sample space (Ω) of a random experiment: the set of all possible outcomes of the experiment.
An event E: any subset of the sample space.
Axioms of probability
The axioms of probability are that there is a "sample space" Ω (or S) containing all possible outcomes and a function P that assigns each event E a number in [0, 1] such that
1. 0 ≤ P(E) ≤ 1;
2. P(Ω) = 1;
3. for any sequence of mutually exclusive events E_1, E_2, ..., P(∪_i E_i) = Σ_i P(E_i).
Conditional Probability
If P(F) > 0, then the conditional probability of E given F is defined by
P(E | F) = P(E ∩ F) / P(F).
Bayes' formula
For a partition {E_i} of the sample space and an event F,
P(E_j | F) = P(F | E_j) P(E_j) / Σ_i P(F | E_i) P(E_i).
Independence
Two events E and F are independent if
P{E ∩ F} = P{E} P{F}.
Consequently, we have P{F | E} = P{F}.
Exercise 1.1.1
An economics consulting firm has
created a model to predict recessions.
The model predicts a recession with
probability 80% when a recession is
indeed coming and with probability 10%
when no recession is coming. The
unconditional probability of falling into a
recession is 20%. If the model predicts a
recession, what is the probability that a
recession will indeed come?
Solution
What we know about this problem can be formalized as follows:
P(predict R | R) = 0.8, P(predict R | no R) = 0.1, P(R) = 0.2.
The unconditional probability of predicting a recession can be derived using the law of total probability:
P(predict R) = 0.8 × 0.2 + 0.1 × 0.8 = 0.24.
Using Bayes' rule we obtain:
P(R | predict R) = P(predict R | R) P(R) / P(predict R) = 0.16 / 0.24 = 2/3 ≈ 0.67.
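The arithmetic above is easy to verify mechanically; a minimal Python check (variable names are ours):

```python
# Probabilities given in Exercise 1.1.1.
p_pred_given_r = 0.80      # model predicts recession when one is coming
p_pred_given_no_r = 0.10   # model predicts recession when none is coming
p_r = 0.20                 # unconditional probability of a recession

# Law of total probability, then Bayes' rule.
p_pred = p_pred_given_r * p_r + p_pred_given_no_r * (1 - p_r)
p_r_given_pred = p_pred_given_r * p_r / p_pred
print(p_pred, p_r_given_pred)   # 0.24 and 2/3 ≈ 0.667
```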
§1.2. Random Variables.
§1.2. Distributions.
A c.d.f. F(x) = P(X < x) is a non-decreasing, left-continuous function such that
F(x) → 0 as x → -∞ and F(x) → 1 as x → ∞.
Remark: it is right continuous if you define the c.d.f. using "≤" (as your textbook does) instead of "<".
For a set B, P(X ∈ B) = ∫_B dF(x).
§1.2. Distributions.
For two random variables X and Y, their joint distribution function is
F(x, y) = P(X ≤ x, Y ≤ y).
If F(x, y) = ∫_{-∞}^{x} ∫_{-∞}^{y} f_{X,Y}(u, v) dv du,
then f_{X,Y} is called the joint density of X and Y.
§1.2. Distributions.
Let X_1, ..., X_n be independent and identically distributed (i.i.d.) random variables with distribution F(x).
Distribution of the maximum: P(max(X_1, ..., X_n) ≤ x) = F(x)^n.
Distribution of the minimum: P(min(X_1, ..., X_n) ≤ x) = 1 - (1 - F(x))^n.
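A quick Monte Carlo illustration of these two formulas for Uniform(0, 1) variables, where F(x) = x on [0, 1] (the parameter choices are ours):

```python
import random

n, x, trials = 5, 0.7, 100_000
max_hits = min_hits = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]   # i.i.d. Uniform(0,1)
    max_hits += max(xs) <= x
    min_hits += min(xs) <= x
print(max_hits / trials, x ** n)            # P(max <= x) = F(x)^n ≈ 0.168
print(min_hits / trials, 1 - (1 - x) ** n)  # P(min <= x) ≈ 0.998
```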
§1.3 Expected Value.
E[X] = Σ_x x p(x) in the discrete case, and E[X] = ∫ x f(x) dx in the continuous case.
Var(X) = E[(X - E[X])²] = E[X²] - (E[X])².
Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X] E[Y].
Corr(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)).
A mnemonic via the covariance formula: if the correlation equals 1, then X and Y lie on a straight line; the converse also holds.
Exercise
Exercise 1.3.1: If X is a random variable with normal law N(0, σ²) and λ is a real number, compute E[e^{λX}].
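The answer is E[e^{λX}] = e^{λ²σ²/2} (it follows from the MGF derivation in §1.4 below); a quick Monte Carlo check under the assumed values λ = 0.5 and σ = 2:

```python
import math, random

sigma, lam, trials = 2.0, 0.5, 200_000
est = sum(math.exp(lam * random.gauss(0, sigma)) for _ in range(trials)) / trials
print(est, math.exp(lam ** 2 * sigma ** 2 / 2))   # both ≈ e^{0.5} ≈ 1.649
```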
Exercise
Exercise 1.3.2:
At a party n people put their hats in the center of a room, where the hats are mixed together. Each person then randomly selects one. We are interested in the mean and variance of X, the number of people who select their own hat.
Textbook: Page 10, Example 1.3.
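The classical answer is E[X] = Var(X) = 1, independent of n; a minimal simulation sketch (the function name is ours):

```python
import random
from statistics import mean, pvariance

def own_hat_matches(n):
    """Number of people who get their own hat after a uniform random shuffle."""
    hats = list(range(n))
    random.shuffle(hats)
    return sum(i == h for i, h in enumerate(hats))

samples = [own_hat_matches(10) for _ in range(100_000)]
print(mean(samples), pvariance(samples))   # both ≈ 1: E[X] = Var(X) = 1
```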
§1.4 Moment Generating, Characteristic Functions, and Laplace Transforms.
The moment generating function (MGF) of a random variable X is defined by
φ(t) = E[e^{tX}],
and moments follow from its derivatives at the origin: E[X^n] = φ^{(n)}(0).
Binomial distribution MGF: for X ~ Binomial(n, p), φ(t) = (p e^t + 1 - p)^n.
Usefulness of MGFs
The uses of moment generating functions:
1. Provide another tool for computing moments.
2. Provide a method for determining uniqueness and convergence of distributions.
§1.4 Moment Generating, Characteristic Functions, and Laplace Transforms.
Exercise 1.4.1: Use the moment generating function to compute the mean and second moment of the normal distribution N(μ, σ²).
Solution
The moment generating function of a standard normal random variable Z is obtained as follows:
φ_Z(t) = E[e^{tZ}] = ∫ e^{tz} (1/√(2π)) e^{-z²/2} dz = e^{t²/2}.
Solution (cont.)
For X = μ + σZ ~ N(μ, σ²),
φ_X(t) = E[e^{t(μ+σZ)}] = e^{μt} φ_Z(σt) = exp(μt + σ²t²/2),
so E[X] = φ_X'(0) = μ and E[X²] = φ_X''(0) = μ² + σ².
§1.4 Moment Generating, Characteristic Functions, and Laplace Transforms.
The characteristic function of a random variable X is defined by
φ(t) = E[e^{itX}].
The moments of a random variable can be computed from the derivatives of the characteristic function at the origin:
E[X^n] = i^{-n} φ^{(n)}(0), for n = 1, 2, 3, ....
Remark: why do we need the characteristic function when we already have the distribution function and the MGF? Unlike the MGF, the characteristic function always exists (since |e^{itX}| = 1), and it uniquely determines the distribution.
§1.4 Moment Generating, Characteristic Functions, and Laplace Transforms.
When dealing with random variables that only assume nonnegative values, it is sometimes more convenient to use Laplace transforms. For X ≥ 0, the Laplace transform is
f̃(s) = E[e^{-sX}] = ∫_0^∞ e^{-sx} dF(x), s ≥ 0.
§1.4 Moment Generating, Characteristic Functions, and Laplace Transforms.
Differentiating under the integral and setting s = 0, we conclude that
E[X^n] = (-1)^n f̃^{(n)}(0).
Exercise 1.4.2: Let X be a random variable having the exponential density with parameter μ, that is,
f(x) = μ e^{-μx}, x ≥ 0.
The Laplace transform of the density is given by
f̃(s) = ∫_0^∞ e^{-sx} μ e^{-μx} dx = μ / (μ + s).
Find E[X] and Var(X).
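A symbolic check of the transform and of the moment formula E[X^n] = (-1)^n f̃^{(n)}(0), assuming the sympy library is available:

```python
import sympy as sp

s, x, mu = sp.symbols('s x mu', positive=True)

# Laplace transform of the Exp(mu) density.
f_tilde = sp.integrate(sp.exp(-s * x) * mu * sp.exp(-mu * x), (x, 0, sp.oo))
print(sp.simplify(f_tilde))              # mu/(mu + s)

EX = -sp.diff(f_tilde, s).subs(s, 0)     # E[X]   = 1/mu
EX2 = sp.diff(f_tilde, s, 2).subs(s, 0)  # E[X^2] = 2/mu^2
print(EX, sp.simplify(EX2 - EX ** 2))    # 1/mu and Var(X) = 1/mu^2
```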
§1.5. Conditional Expectation.
Suppose that X and Y are discrete. Then
P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)
is called the conditional distribution of X given Y = y.
The conditional distribution function of X given Y = y is defined by
F(x | y) = P(X ≤ x | Y = y).
It has, thus, an expectation,
E[X | Y = y] = Σ_x x P(X = x | Y = y).
§1.5. Conditional Expectation.
We can compose the function y ↦ E(X | Y = y) with Y to get a random variable denoted E(X | Y). We have
E[E(X | Y)] = E[X].
Proof: write it out.
§1.5. Conditional Expectation.
Suppose that X and Y are continuous. Then
f(x | y) = f(x, y) / f_Y(y), and E[X | Y = y] = ∫ x f(x | y) dx.
§1.5 Example (Miner)
Example 1.5.1: A miner is trapped in a mine containing three doors. The first door leads to a tunnel that takes him to safety after two hours of travel. The second door leads to a tunnel that returns him to the mine after three hours of travel. The third door leads to a tunnel that returns him to the mine after five hours. Assuming that the miner is at all times equally likely to choose any one of the doors, what is the expected length of time until the miner reaches safety?
Solution: Let X denote the time until the miner reaches safety. Conditioning on the first door chosen,
E[X] = (1/3)(2) + (1/3)(3 + E[X]) + (1/3)(5 + E[X]),
so E[X] = 10 hours.
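A minimal simulation confirming the 10-hour answer (the function name is ours):

```python
import random

def time_to_safety():
    """Travel time until the miner escapes, choosing doors uniformly."""
    t = 0.0
    while True:
        door = random.randrange(3)
        if door == 0:
            return t + 2            # door 1: safety after 2 hours
        t += 3 if door == 1 else 5  # doors 2 and 3 return him to the mine

trials = 100_000
print(sum(time_to_safety() for _ in range(trials)) / trials)   # ≈ 10 hours
```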
Example §1.5 (The Sum of a Random Number of Random Variables).
Let X_i (i ≥ 1) be i.i.d. and let N be an independent random variable with values in {0, 1, 2, 3, ...}. Let
Y = Σ_{i=1}^{N} X_i.
Conditioning on N gives E[Y] = E[N] E[X_1].
Application:
Example §1.5(A) (The Sum of a
Random Number of Random
Variables).
Example 1.5.2: Suppose that the expected
number of accidents per week at an industrial plant
is four. Suppose also that the numbers of workers
injured in each accident are independent random
variables with a common mean of 2. Assume also
that the number of workers injured in each accident
is independent of the number of accidents that
occur. What is the expected number of injuries
during a week?
Solution: Let N denote the number of accidents in a week and let X_i denote the number of workers injured in the ith accident, so the total number of injuries is Y = Σ_{i=1}^{N} X_i. Then
E[Y] = E[N] E[X_1] = 4 × 2 = 8 injuries per week.
Example §1.5(A) (The Sum of a Random Number of Random Variables).
Alternative solution: we can also compute moments of Y via its moment generating function. Conditioning on N,
φ_Y(t) = E[e^{tY}] = E[(φ_X(t))^N],
so
E[Y] = φ_Y'(0) = E[N] E[X],
and differentiating once more gives the second moment, from which
Var(Y) = E[N] Var(X) + Var(N) (E[X])².
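A simulation sketch of Example 1.5.2. The example only fixes the two means, so the specific distributions here, Poisson(4) accidents and Poisson(2) injuries per accident, are our assumptions chosen to match them:

```python
import math, random
from statistics import mean, pvariance

def poisson(lam):
    """Sample Poisson(lam) via Knuth's product-of-uniforms method."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def weekly_injuries():
    """Y = sum of N i.i.d. injury counts, with N independent of the X_i."""
    return sum(poisson(2) for _ in range(poisson(4)))

samples = [weekly_injuries() for _ in range(50_000)]
print(mean(samples))       # E[Y] = E[N]E[X] = 4 * 2 = 8
print(pvariance(samples))  # for these choices: 4*2 + 4*2**2 = 24
```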
§1.5. Bayes Estimation.
§1.6. The Exponential Distribution, Lack of Memory, and Hazard Rate Functions.
Recall that X ~ Exp(λ) (λ is called the parameter, or rate) if it has probability density function
f(x) = λ e^{-λx} for x ≥ 0, and f(x) = 0 otherwise.
Memoryless: P(X > s + t | X > t) = P(X > s) for all s, t ≥ 0.
Example 1.6.1 Suppose that the amount of
time one spends in a bank is exponentially
distributed with mean ten minutes, that is,
λ = 1/10. What is the probability that a customer
will spend more than fifteen minutes in the bank?
What is the probability that a customer will spend
more than fifteen minutes in the bank given that
she is still in the bank after ten minutes?
Solution: If X represents the amount of time that the customer spends in the bank, then the first probability is just
P(X > 15) = e^{-15λ} = e^{-3/2} ≈ 0.22.
Solution (cont.):
The second question asks for the probability that a customer who has spent ten minutes in the bank will have to spend at least five more minutes. However, since the exponential distribution does not "remember" that the customer has already spent ten minutes in the bank, this must equal the probability that an entering customer spends at least five minutes in the bank. That is, the desired probability is just
P(X > 5) = e^{-5λ} = e^{-1/2} ≈ 0.61.
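Both numbers are easy to reproduce by simulation; a minimal sketch using Python's random.expovariate:

```python
import random

lam, trials = 1 / 10, 200_000
times = [random.expovariate(lam) for _ in range(trials)]

p_gt_15 = sum(t > 15 for t in times) / trials
still_there = [t for t in times if t > 10]          # condition on X > 10
p_cond = sum(t > 15 for t in still_there) / len(still_there)

print(p_gt_15)                                      # ≈ e^{-1.5} ≈ 0.22
print(p_cond, sum(t > 5 for t in times) / trials)   # both ≈ e^{-0.5} ≈ 0.61
```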
§1.6. The Exponential Distribution,
Lack of Memory, and Hazard Rate
Functions.
Example 1.6.2: Consider a post office that is run
by two clerks. Suppose that when A enters the
system he discovers that B is being served by
one of the clerks and C by the other. Suppose
also that A is told that his service will begin as
soon as either B or C leaves. If the amount of
time that a clerk spends with a customer is
exponentially distributed with mean 1/λ, what is
the probability that, of the three customers, A is
the last to leave the post office?
Solution:
The answer is obtained by the following reasoning. Consider the time at which A first finds a free clerk. At this point either B or C would have just left and the other one would still be in service. However, by the lack of memory of the exponential, it follows that the amount of time that this other customer (either B or C) would still have to spend in the post office is exponentially distributed with mean 1/λ. That is, it is the same as if he were just starting his service at this point. Hence, by symmetry, the probability that he finishes before A must equal 1/2.
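A Monte Carlo check of the symmetry argument; λ = 1 is an arbitrary choice here, since the answer does not depend on the rate:

```python
import random

lam, trials, a_last = 1.0, 100_000, 0
for _ in range(trials):
    b = random.expovariate(lam)              # B's remaining service time
    c = random.expovariate(lam)              # C's remaining service time
    a = min(b, c) + random.expovariate(lam)  # A starts when a clerk frees up
    a_last += a > max(b, c)
print(a_last / trials)                       # ≈ 0.5
```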
§1.6. Hazard Rate Functions.
Consider a continuous positive random variable X having distribution function F and density f. The failure (or hazard) rate function r(t) is defined by
r(t) = f(t) / (1 - F(t)).
To interpret r(t), suppose that an item having lifetime X has survived for t hours, and we desire the probability that it does not survive for an additional time dt. That is, consider P{X ∈ (t, t + dt) | X > t}. We can prove
P{X ∈ (t, t + dt) | X > t} ≈ r(t) dt.
That is, r(t) represents the conditional probability density that a t-year-old item will fail.
§1.6. Hazard Rate Functions.
Suppose that the lifetime distribution is exponential. Then
r(t) = λ e^{-λt} / e^{-λt} = λ.
By the memoryless property, it follows that the distribution of remaining life for a t-year-old item is the same as for a new item. Hence r(t) should be constant, and indeed it is constant and equal to λ.
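An empirical version of this fact: estimating r(t) ≈ P(t < X ≤ t + dt | X > t)/dt from exponential samples gives roughly the same value at every t. A sketch with an assumed λ = 0.5:

```python
import random

lam, dt, trials = 0.5, 0.05, 400_000
xs = [random.expovariate(lam) for _ in range(trials)]
for t in (0.5, 1.0, 2.0):
    survivors = [x for x in xs if x > t]
    r_hat = sum(x <= t + dt for x in survivors) / len(survivors) / dt
    print(t, round(r_hat, 3))   # each ≈ lam = 0.5, independent of t
```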
§1.7 some basic inequalities of
probability theory
§1.7 Markov’s Inequality
If X is a nonnegative random variable, then for any a > 0,
P[X ≥ a] ≤ E[X] / a.
• The simple Markov inequality is a first-order inequality, since only knowledge of E[X] is required.
• The simple Markov inequality is quite weak, but it can be used to quickly check statements made about the tail of the distribution of a random variable when the expectation is known.
Example. If the expected response time of a computer system is 1 second, then the simple Markov inequality shows that P[X ≥ 10] ≤ 0.1, and thus at most 10% of the response times in the system can be greater than 10 seconds. (Time is an important hint here, because the system is single-threaded, which exactly satisfies the simple…)
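A small numeric illustration of how loose the bound can be, assuming (purely for illustration) that response times are exponential with mean 1 second:

```python
import random

trials = 200_000
times = [random.expovariate(1.0) for _ in range(trials)]   # mean 1 second
tail = sum(t >= 10 for t in times) / trials
print(tail, "<= bound", 1.0 / 10)   # true tail e^{-10} ≈ 4.5e-5, bound 0.1
```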
§1.7 some basic inequalities of probability theory
Chebyshev’s Inequality (a second-order bound): if X is a random variable with mean μ and variance σ², then for any k > 0,
P[|X − μ| ≥ k] ≤ σ² / k².
Chernoff’s Bound: for any t > 0,
P[X ≥ a] ≤ e^{−ta} E[e^{tX}],
which can then be minimized over t.
§1.7 some basic inequalities of probability theory
Jensen’s Inequality: if h is a convex function defined on the real line, then
E[h(X)] ≥ h(E[X]).
(If h is concave, the inequality is reversed.)
§1.8. Limit Theorems.
• Strong Law of Large Numbers: if X_1, X_2, ... are i.i.d. with finite mean μ, then with probability 1,
(X_1 + ... + X_n) / n → μ as n → ∞.
• Central Limit Theorem: if, in addition, Var(X_1) = σ² < ∞, then the standardized sum converges in distribution to a standard normal. I.e., for every x,
P{(X_1 + ... + X_n − nμ) / (σ√n) ≤ x} → Φ(x) as n → ∞.
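A minimal illustration of the CLT: standardized sums of Uniform(0, 1) variables already behave like N(0, 1) for moderate n (the parameter choices are ours):

```python
import math, random

n, trials = 30, 100_000
mu, sigma = 0.5, math.sqrt(1 / 12)   # mean and sd of Uniform(0, 1)
zs = [(sum(random.random() for _ in range(n)) - n * mu) / (sigma * math.sqrt(n))
      for _ in range(trials)]
print(sum(abs(z) <= 1.96 for z in zs) / trials)   # ≈ 0.95, matching N(0, 1)
```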
§1.9. Stochastic process
A stochastic process X = {X(t), t ∈ T} is a collection of random variables. That is, for each t ∈ T, X(t) is a random variable. The index t is often interpreted as "time" and, as a result, we refer to X(t) as the "state" of the process at time t.
When the index set T of the process X is
– a countable set → X is a discrete-time process
– an interval of the real line → X is a continuous-time process
When the state space S of the process X is
– a countable set → X has a discrete state space
– an interval of the real line → X has a continuous state space
§1.9. Stochastic process
Four types of stochastic processes:
– discrete time and discrete state space
– continuous time and discrete state space
– discrete time and continuous state space
– continuous time and continuous state space
A stochastic process is a family of random variables that describes the evolution through time of some process. We shall see much of stochastic processes in the following chapters of this text.
§1.9. Classification of stochastic processes
A sample path of a discrete-time process with a discrete state space: X(t) = closing price of a company stock on day t.
A sample path of a continuous-time process with a discrete state space: X(t) = price of a company stock at time t on a given day.
§1.9. Classification of stochastic processes
A sample path of a continuous-time process with a continuous state space: X(t) = temperature at the airport at time t.
A sample path of a discrete-time process with a continuous state space: X(t) = temperature at the airport on day t.
§1.9. Stochastic process
A real-valued process {X_t, t ≥ 0} is called a second-order process provided E[X_t²] < ∞ for all t ∈ T.
The mean and the covariance function of a second-order process {X_t, t ≥ 0} are defined by
m(t) = E[X_t] and K(s, t) = Cov(X_s, X_t) = E[(X_s − m(s))(X_t − m(t))].
The variance function of the process {X_t, t ≥ 0} is defined by
Var(X_t) = K(t, t).
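A sketch of estimating m(t) and K(s, t) empirically from simulated paths, using a Gaussian random walk as the example process (our choice; for it, m(t) = 0 and K(s, t) = min(s, t) in step units):

```python
import random

steps, trials = 50, 20_000
paths = []
for _ in range(trials):
    x, path = 0.0, []
    for _ in range(steps):
        x += random.gauss(0, 1)     # Gaussian random walk increments
        path.append(x)
    paths.append(path)

s, t = 9, 29                        # 0-based indices: times 10 and 30
m_s = sum(p[s] for p in paths) / trials
m_t = sum(p[t] for p in paths) / trials
cov = sum((p[s] - m_s) * (p[t] - m_t) for p in paths) / trials
print(m_s, m_t, cov)                # means ≈ 0; K(s, t) = min(10, 30) = 10
```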
§1.9. Stochastic process
Example 1.9.1: Consider the stochastic process whose sample path is shown below.
(Figure: a simulated sample path, with X on the vertical axis ranging from about −2 to 4 and time on the horizontal axis from 0 to 300.)