PRP Notes

The document discusses probability and random processes. It covers topics such as discrete and continuous random variables, probability mass functions, probability density functions, expected value, variance, and moments. It also discusses joint, marginal, and conditional distributions. Other topics include the central limit theorem, Markov processes, Markov chains, and classification of random processes as continuous, discrete, continuous random sequences, and discrete random sequences. Examples are provided for different types of random variables and processes.

Uploaded by

Bharath Jojo
SUBJECT NAME : Probability & Random Process

SUBJECT CODE : MA 2261


MATERIAL NAME : Formula Material
MATERIAL CODE : JM08AM1007

UNIT-I (RANDOM VARIABLES)

1) Discrete random variable:
A random variable whose set of possible values is either finite or countably
infinite is called discrete random variable.
Eg: (i) Let X represent the sum of the numbers on the 2 dice, when two
dice are thrown. In this case the random variable X takes the values 2, 3, 4, 5, 6,
7, 8, 9, 10, 11 and 12. So X is a discrete random variable.
(ii) Number of transmitted bits received in error.
2) Continuous random variable:
A random variable X is said to be continuous if it takes all possible values
between certain limits.
Eg: The length of time during which a vacuum tube installed in a circuit
functions is a continuous random variable; other examples are the exact
current flowing in a wire and the operating temperature of a device.
3) Discrete vs. continuous random variables:

Sl.No. | Discrete random variable                                  | Continuous random variable
1      | $\sum_i p(x_i) = 1$                                       | $\int_{-\infty}^{\infty} f(x)\,dx = 1$
2      | $F(x) = P[X \le x]$                                       | $F(x) = P[X \le x] = \int_{-\infty}^{x} f(x)\,dx$
3      | Mean $= E[X] = \sum_i x_i\,p(x_i)$                        | Mean $= E[X] = \int_{-\infty}^{\infty} x\,f(x)\,dx$
4      | $E[X^2] = \sum_i x_i^2\,p(x_i)$                           | $E[X^2] = \int_{-\infty}^{\infty} x^2 f(x)\,dx$
5      | $\mathrm{Var}(X) = E[X^2] - \big(E[X]\big)^2$             | $\mathrm{Var}(X) = E[X^2] - \big(E[X]\big)^2$
6      | $r$th moment $= E[X^r] = \sum_i x_i^r\,p(x_i)$            | $r$th moment $= E[X^r] = \int_{-\infty}^{\infty} x^r f(x)\,dx$
7      | M.G.F $M_X(t) = E\big[e^{tX}\big] = \sum_x e^{tx}\,p(x)$  | M.G.F $M_X(t) = E\big[e^{tX}\big] = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$

4) $E(aX + b) = aE(X) + b$
5) $\mathrm{Var}(aX + b) = a^2\,\mathrm{Var}(X)$
6) $\mathrm{Var}(aX \pm bY) = a^2\,\mathrm{Var}(X) + b^2\,\mathrm{Var}(Y)$ (X, Y independent)
7) Standard deviation $= \sqrt{\mathrm{Var}(X)}$
8) $f(x) = F'(x)$
9) $P(X > a) = 1 - P(X \le a)$
10) $P(A / B) = \dfrac{P(A \cap B)}{P(B)}$, provided $P(B) \ne 0$
11) If A and B are independent, then $P(A \cap B) = P(A)\,P(B)$.
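The linearity properties 4) and 5) can be checked by simulation; a minimal sketch, with an exponential X of mean 2 as an arbitrary example distribution:

```python
import numpy as np

# Monte Carlo check of properties 4) and 5):
# E(aX + b) = aE(X) + b and Var(aX + b) = a^2 Var(X).
# X ~ Exponential with mean 2 (example choice), so E(X) = 2, Var(X) = 4.
rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=200_000)
a, b = 3.0, 5.0
y = a * x + b

print(np.mean(y))   # ≈ a*E(X) + b = 11
print(np.var(y))    # ≈ a^2*Var(X) = 36
```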
12) 1st moment about the origin $= E[X] = \left[\dfrac{d}{dt}M_X(t)\right]_{t=0}$ (Mean)

2nd moment about the origin $= E[X^2] = \left[\dfrac{d^2}{dt^2}M_X(t)\right]_{t=0}$

The coefficient of $\dfrac{t^r}{r!}$ in $M_X(t)$ $= E[X^r]$ ($r$th moment about the origin)
13) Limitation of M.G.F:
i) A random variable X may have no moments although its m.g.f exists.
ii) A random variable X can have its m.g.f and some or all moments, yet the
m.g.f does not generate the moments.
iii) A random variable X can have all or some moments, but m.g.f does not
exist except perhaps at one point.
14) Properties of M.G.F:
i) If $Y = aX + b$, then $M_Y(t) = e^{bt} M_X(at)$.
ii) $M_{cX}(t) = M_X(ct)$, where $c$ is a constant.
iii) If $X$ and $Y$ are two independent random variables, then $M_{X+Y}(t) = M_X(t)\,M_Y(t)$.
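Item 12 above can be illustrated numerically: differentiating the M.G.F. at $t = 0$ recovers the moments. A sketch using the binomial M.G.F. $(q + pe^t)^n$ with example values $n = 5$, $p = 1/3$ (so the mean is $np = 5/3$ and the variance $npq = 10/9$), approximating the derivatives by central differences:

```python
import math

# Moments from the M.G.F.: the r-th derivative of M_X(t) at t = 0 gives E[X^r].
# Binomial M.G.F. (q + p e^t)^n with example values n = 5, p = 1/3.
n, p = 5, 1 / 3
q = 1 - p

def M(t):
    return (q + p * math.exp(t))**n

h = 1e-5
mean = (M(h) - M(-h)) / (2 * h)             # ≈ E[X] = n p
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # ≈ E[X^2]
var = second - mean**2                      # ≈ n p q

print(mean, var)   # ≈ 5/3 and ≈ 10/9
```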
15) P.D.F, M.G.F, Mean and Variance of the standard distributions:

Sl.No. | Distribution | P.D.F ($P(X = x)$ or $f(x)$) | M.G.F | Mean | Variance
1 | Binomial     | $nC_x\,p^x q^{n-x}$ | $(q + pe^t)^n$ | $np$ | $npq$
2 | Poisson      | $\dfrac{e^{-\lambda}\lambda^x}{x!}$ | $e^{\lambda(e^t - 1)}$ | $\lambda$ | $\lambda$
3 | Geometric    | $q^{x-1}p$ (or) $q^x p$ | $\dfrac{pe^t}{1 - qe^t}$ | $\dfrac{1}{p}$ | $\dfrac{q}{p^2}$
4 | Uniform      | $f(x) = \dfrac{1}{b - a},\ a < x < b$; $0$ otherwise | $\dfrac{e^{bt} - e^{at}}{(b - a)t}$ | $\dfrac{a + b}{2}$ | $\dfrac{(b - a)^2}{12}$
5 | Exponential  | $f(x) = \lambda e^{-\lambda x},\ x > 0,\ \lambda > 0$; $0$ otherwise | $\dfrac{\lambda}{\lambda - t}$ | $\dfrac{1}{\lambda}$ | $\dfrac{1}{\lambda^2}$
6 | Gamma        | $f(x) = \dfrac{e^{-x} x^{\lambda - 1}}{\Gamma(\lambda)},\ 0 < x < \infty,\ \lambda > 0$ | $\dfrac{1}{(1 - t)^\lambda}$ | $\lambda$ | $\lambda$
7 | Normal       | $f(x) = \dfrac{1}{\sigma\sqrt{2\pi}}\,e^{-\frac{1}{2}\left(\frac{x - \mu}{\sigma}\right)^2}$ | $e^{\mu t + \frac{\sigma^2 t^2}{2}}$ | $\mu$ | $\sigma^2$
16) Memoryless property of the exponential distribution:
$P(X > s + t \,/\, X > s) = P(X > t)$.
17) Function of a random variable: $f_Y(y) = f_X(x)\left|\dfrac{dx}{dy}\right|$
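The memoryless property in item 16 can be seen by simulation; a sketch with $\lambda = 1$ and arbitrary example values $s = 0.7$, $t = 1.2$:

```python
import numpy as np

# Memoryless property of the exponential: P(X > s + t | X > s) = P(X > t).
rng = np.random.default_rng(2)
x = rng.exponential(1.0, 1_000_000)   # lambda = 1
s, t = 0.7, 1.2

lhs = np.mean(x[x > s] > s + t)   # conditional relative frequency
rhs = np.mean(x > t)              # ≈ e^{-t}

print(lhs, rhs)   # both ≈ e^{-1.2} ≈ 0.30
```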

UNIT-II (TWO-DIMENSIONAL RANDOM VARIABLES)

1) $\sum_i \sum_j p_{ij} = 1$ (discrete random variable)
$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$ (continuous random variable)
2) Conditional probability function of X given Y: $P\{X = x_i \,/\, Y = y_j\} = \dfrac{P(x_i, y_j)}{P(y_j)}$.
Conditional probability function of Y given X: $P\{Y = y_j \,/\, X = x_i\} = \dfrac{P(x_i, y_j)}{P(x_i)}$.
$P\{X < a \,/\, Y < b\} = \dfrac{P(X < a,\, Y < b)}{P(Y < b)}$
3) Conditional density function of X given Y: $f(x / y) = \dfrac{f(x, y)}{f(y)}$.
Conditional density function of Y given X: $f(y / x) = \dfrac{f(x, y)}{f(x)}$.
4) If X and Y are independent random variables, then
$f(x, y) = f(x)\,f(y)$ (for continuous random variables)
$P(X = x,\, Y = y) = P(X = x)\,P(Y = y)$ (for discrete random variables)
5) Probability from the joint density function:
$P(a \le X \le b,\, c \le Y \le d) = \int_c^d \int_a^b f(x, y)\,dx\,dy$
$P(X < a,\, Y < b) = \int_0^b \int_0^a f(x, y)\,dx\,dy$
6) Marginal density function of X: $f_X(x) = f(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$
Marginal density function of Y: $f_Y(y) = f(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$
7) $P(X + Y \ge 1) = 1 - P(X + Y < 1)$
8) Correlation coefficient (discrete): $\rho(x, y) = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$
$\mathrm{Cov}(X, Y) = \dfrac{1}{n}\sum XY - \bar{X}\bar{Y}$, $\sigma_X = \sqrt{\dfrac{1}{n}\sum X^2 - \bar{X}^2}$, $\sigma_Y = \sqrt{\dfrac{1}{n}\sum Y^2 - \bar{Y}^2}$
9) Correlation coefficient (continuous): $\rho(x, y) = \dfrac{\mathrm{Cov}(X, Y)}{\sigma_X \sigma_Y}$
$\mathrm{Cov}(X, Y) = E(XY) - E(X)\,E(Y)$, $\sigma_X = \sqrt{\mathrm{Var}(X)}$, $\sigma_Y = \sqrt{\mathrm{Var}(Y)}$
10) If X and Y are uncorrelated random variables, then ( , ) 0 Cov X Y = .
11) $E(X) = \int_{-\infty}^{\infty} x\,f(x)\,dx$, $E(Y) = \int_{-\infty}^{\infty} y\,f(y)\,dy$, $E(XY) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} xy\,f(x, y)\,dx\,dy$.
12) Regression for discrete random variables:
Regression line X on Y: $x - \bar{x} = b_{xy}(y - \bar{y})$, where $b_{xy} = \dfrac{\sum(x - \bar{x})(y - \bar{y})}{\sum(y - \bar{y})^2}$
Regression line Y on X: $y - \bar{y} = b_{yx}(x - \bar{x})$, where $b_{yx} = \dfrac{\sum(x - \bar{x})(y - \bar{y})}{\sum(x - \bar{x})^2}$
Correlation through the regression: $r = \pm\sqrt{b_{xy}\,b_{yx}}$. Note: $\rho(x, y) = r(x, y)$.

13) Regression for continuous random variables:
Regression line X on Y: $x - E(x) = b_{xy}\big(y - E(y)\big)$, where $b_{xy} = r\dfrac{\sigma_x}{\sigma_y}$
Regression line Y on X: $y - E(y) = b_{yx}\big(x - E(x)\big)$, where $b_{yx} = r\dfrac{\sigma_y}{\sigma_x}$
Regression curve X on Y: $x = E(x / y) = \int_{-\infty}^{\infty} x\,f(x / y)\,dx$
Regression curve Y on X: $y = E(y / x) = \int_{-\infty}^{\infty} y\,f(y / x)\,dy$

14) Transformation of random variables:
$f_Y(y) = f_X(x)\left|\dfrac{dx}{dy}\right|$ (one-dimensional random variable)
$f_{UV}(u, v) = f_{XY}(x, y)\begin{vmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial x}{\partial v} \\ \dfrac{\partial y}{\partial u} & \dfrac{\partial y}{\partial v} \end{vmatrix}$ (two-dimensional random variable)
15) Central limit theorem (Liapounoff's form):
If $X_1, X_2, \dots, X_n$ is a sequence of independent R.V.s with $E[X_i] = \mu_i$ and $\mathrm{Var}(X_i) = \sigma_i^2$, $i = 1, 2, \dots, n$, and if $S_n = X_1 + X_2 + \dots + X_n$, then under certain general conditions $S_n$ follows a normal distribution with mean $\mu = \sum_{i=1}^{n} \mu_i$ and variance $\sigma^2 = \sum_{i=1}^{n} \sigma_i^2$ as $n \to \infty$.
16) Central limit theorem (Lindeberg–Lévy's form):
If $X_1, X_2, \dots, X_n$ is a sequence of independent, identically distributed R.V.s with $E[X_i] = \mu$ and $\mathrm{Var}(X_i) = \sigma^2$, $i = 1, 2, \dots, n$, and if $S_n = X_1 + X_2 + \dots + X_n$, then under certain general conditions $S_n$ follows a normal distribution with mean $n\mu$ and variance $n\sigma^2$ as $n \to \infty$.
Note: $z = \dfrac{S_n - n\mu}{\sigma\sqrt{n}}$ (for the sum of $n$ variables), $z = \dfrac{\bar{X} - \mu}{\sigma / \sqrt{n}}$ (for the sample mean)


UNIT-III (MARKOV PROCESSES AND MARKOV CHAINS)

1) Random Process:
A random process is a collection of random variables $\{X(s, t)\}$ that are
functions of a real variable, namely time $t$, where $s \in S$ and $t \in T$.

2) Classification of Random Processes:
We can classify a random process according to the characteristics of the time $t$
and the random variable $X$. We shall consider only four cases, based on $t$ and $X$
having values in the ranges $-\infty < t < \infty$ and $-\infty < x < \infty$.

Continuous random process
Continuous random sequence
Discrete random process
Discrete random sequence
Continuous random process:
If X and t are continuous, then we call {X(t)} a Continuous Random Process.
Example: If X(t) represents the maximum temperature at a place in the
interval (0,t), {X(t)} is a Continuous Random Process.
Continuous Random Sequence:
A random process for which X is continuous but time takes only discrete values is
called a Continuous Random Sequence.
Example: If Xn represents the temperature at the end of the nth hour of a day, then
{Xn, 1 ≤ n ≤ 24} is a Continuous Random Sequence.
Discrete Random Process:
If X assumes only discrete values and t is continuous, then we call such random
process {X(t)} as Discrete Random Process.
Example: If X(t) represents the number of telephone calls received in the interval
(0, t), then {X(t)} is a discrete random process, since S = {0, 1, 2, 3, . . . }.
Discrete Random Sequence:
A random process in which both the random variable and time are discrete is called
Discrete Random Sequence.
Example: If Xn represents the outcome of the nth toss of a fair die, then {Xn : n ≥ 1} is a
discrete random sequence, since T = {1, 2, 3, . . . } and S = {1, 2, 3, 4, 5, 6}.

3) Condition for Stationary Process: $E[X(t)] = $ constant, $\mathrm{Var}[X(t)] = $ constant.
If the process is not stationary then it is called evolutionary.

4) Wide Sense Stationary (or) Weak Sense Stationary (or) Covariance Stationary:
A random process is said to be WSS or Covariance Stationary if it satisfies the
following conditions.
i) The mean of the process is constant, i.e. $E[X(t)] = $ constant.
ii) The auto correlation function depends only on $\tau$, i.e. $R_{XX}(\tau) = E[X(t)\,X(t + \tau)]$.
5) Time average:
The time average of a random process $\{X(t)\}$ is defined as $\bar{X}_T = \dfrac{1}{2T}\int_{-T}^{T} X(t)\,dt$.
If the interval is $(0, T)$, then the time average is $\bar{X}_T = \dfrac{1}{T}\int_{0}^{T} X(t)\,dt$.
6) Ergodic Process:
A random process $\{X(t)\}$ is called ergodic if all its ensemble averages are
interchangeable with the corresponding time average $\bar{X}_T$.
7) Mean ergodic:
Let $\{X(t)\}$ be a random process with mean $E[X(t)] = \mu$ and time average $\bar{X}_T$;
then $\{X(t)\}$ is said to be mean ergodic if $\bar{X}_T \to \mu$ as $T \to \infty$, i.e.
$E[X(t)] = \lim_{T \to \infty} \bar{X}_T$.
Note: $\lim_{T \to \infty} \mathrm{var}\big(\bar{X}_T\big) = 0$ (by the mean ergodic theorem)
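A numeric sketch of mean ergodicity: for the random-phase process $X(t) = \cos(t + \Theta)$ with $\Theta \sim \mathrm{Uniform}(0, 2\pi)$ (a standard example, assumed here), the ensemble mean is 0 and the time average of a single sample function shrinks toward 0 as $T$ grows:

```python
import numpy as np

# Mean ergodicity sketch: X(t) = cos(t + Theta), Theta ~ Uniform(0, 2*pi).
# Ensemble mean is 0; the time average over (-T, T) of one sample function
# tends to 0 as T grows (analytically it equals cos(Theta)*sin(T)/T).
rng = np.random.default_rng(4)
theta = rng.uniform(0.0, 2 * np.pi)     # one fixed sample function

for T in (10.0, 100.0, 1000.0):
    t = np.linspace(-T, T, 200_001)
    xbar = np.mean(np.cos(t + theta))   # grid approximation of the time average
    print(T, xbar)                      # magnitude shrinks roughly like 1/T
```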


8) Correlation ergodic process:
The stationary process $\{X(t)\}$ is said to be correlation ergodic if the process
$\{Y(t)\}$ is mean ergodic, where $Y(t) = X(t)\,X(t + \tau)$, i.e.
$E[Y(t)] = \lim_{T \to \infty} \bar{Y}_T$, where $\bar{Y}_T$ is the time average of $Y(t)$.
9) Auto covariance function:
$C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)]\,E[X(t + \tau)]$
10) Mean and variance of the time average:
Mean: $E\big[\bar{X}_T\big] = \dfrac{1}{T}\int_{0}^{T} E[X(t)]\,dt$
Variance: $\mathrm{Var}\big(\bar{X}_T\big) = \dfrac{1}{2T}\int_{-2T}^{2T} C_{XX}(\tau)\left(1 - \dfrac{|\tau|}{2T}\right)d\tau$, where $C_{XX}(\tau) = R_{XX}(\tau) - \big(E[X(t)]\big)^2$



11) Markov process:
A random process in which the future value depends only on the present value
and not on the past values is called a Markov process. It is symbolically
represented by
$P\big[X(t_{n+1}) \le x_{n+1} \,/\, X(t_n) = x_n,\, X(t_{n-1}) = x_{n-1},\, \dots,\, X(t_0) = x_0\big] = P\big[X(t_{n+1}) \le x_{n+1} \,/\, X(t_n) = x_n\big]$
where $t_0 \le t_1 \le t_2 \le \dots \le t_n \le t_{n+1}$.
12) Markov Chain:
If, for all $n$,
$P\big[X_n = a_n \,/\, X_{n-1} = a_{n-1},\, X_{n-2} = a_{n-2},\, \dots,\, X_0 = a_0\big] = P\big[X_n = a_n \,/\, X_{n-1} = a_{n-1}\big]$
then the process $\{X_n\}$, $n = 0, 1, 2, \dots$ is called a Markov chain, where
$a_0, a_1, a_2, \dots, a_n, \dots$ are called the states of the Markov chain.
13) Transition Probability Matrix (t.p.m.):
When the Markov chain is homogeneous, the one-step transition probability is
denoted by $P_{ij}$. The matrix $P = \{P_{ij}\}$ is called the transition probability matrix.
14) Chapman–Kolmogorov theorem:
If $P$ is the t.p.m. of a homogeneous Markov chain, then the $n$-step t.p.m. $P^{(n)}$ is
equal to $P^n$, i.e. $P_{ij}^{(n)} = \big[P^n\big]_{ij}$.
15) Markov Chain property: If $\Pi = (\Pi_1, \Pi_2, \Pi_3)$, then $\Pi P = \Pi$ and
$\Pi_1 + \Pi_2 + \Pi_3 = 1$.
16) Poisson process:
If $X(t)$ represents the number of occurrences of a certain event in $(0, t)$, then
the discrete random process $\{X(t)\}$ is called a Poisson process, provided the
following postulates are satisfied.
(i) $P[1 \text{ occurrence in } (t, t + \Delta t)] = \lambda \Delta t + O(\Delta t)$
(ii) $P[0 \text{ occurrences in } (t, t + \Delta t)] = 1 - \lambda \Delta t + O(\Delta t)$
(iii) $P[2 \text{ or more occurrences in } (t, t + \Delta t)] = O(\Delta t)$
(iv) $X(t)$ is independent of the number of occurrences of the event in any
other interval.
17) Probability law of the Poisson process: $P\{X(t) = x\} = \dfrac{e^{-\lambda t}(\lambda t)^x}{x!}$, $x = 0, 1, 2, \dots$
Mean $E[X(t)] = \lambda t$, $E\big[X^2(t)\big] = \lambda^2 t^2 + \lambda t$, $\mathrm{Var}[X(t)] = \lambda t$.
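Item 17 says $X(t) \sim \mathrm{Poisson}(\lambda t)$, so mean and variance both equal $\lambda t$; a simulation sketch using exponential inter-arrival times (example values $\lambda = 2$, $t = 3$):

```python
import numpy as np

# Poisson process sketch: count exponential(1/lambda) inter-arrival times
# falling in (0, t]; the count X(t) has mean = variance = lambda * t.
rng = np.random.default_rng(5)
lam, t, trials = 2.0, 3.0, 100_000

gaps = rng.exponential(1 / lam, (trials, 40))   # 40 gaps suffice for lam*t = 6
arrivals = np.cumsum(gaps, axis=1)
counts = np.sum(arrivals <= t, axis=1)          # X(t) for each trial

print(counts.mean(), counts.var())   # both ≈ lam*t = 6
```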

UNIT-IV (CORRELATION AND SPECTRAL DENSITY)

$R_{XX}(\tau)$ – Auto correlation function
$S_{XX}(\omega)$ – Power spectral density (or) spectral density
$R_{XY}(\tau)$ – Cross correlation function
$S_{XY}(\omega)$ – Cross power spectral density
1) Auto correlation to power spectral density (spectral density):
$S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau)\,e^{-i\omega\tau}\,d\tau$

2) Power spectral density to auto correlation:
$R_{XX}(\tau) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} S_{XX}(\omega)\,e^{i\omega\tau}\,d\omega$

3) Condition for $X(t)$ and $X(t + \tau)$ to be uncorrelated:
$C_{XX}(\tau) = R_{XX}(\tau) - E[X(t)]\,E[X(t + \tau)] = 0$
4) Cross power spectrum to cross correlation:
$R_{XY}(\tau) = \dfrac{1}{2\pi}\int_{-\infty}^{\infty} S_{XY}(\omega)\,e^{i\omega\tau}\,d\omega$

5) General formulas:
i) $\int e^{ax}\cos bx\,dx = \dfrac{e^{ax}}{a^2 + b^2}\,(a\cos bx + b\sin bx)$
ii) $\int e^{ax}\sin bx\,dx = \dfrac{e^{ax}}{a^2 + b^2}\,(a\sin bx - b\cos bx)$
iii) $x^2 + ax = \left(x + \dfrac{a}{2}\right)^2 - \dfrac{a^2}{4}$
iv) $\sin\theta = \dfrac{e^{i\theta} - e^{-i\theta}}{2i}$
v) $\cos\theta = \dfrac{e^{i\theta} + e^{-i\theta}}{2}$




UNIT-V (LINEAR SYSTEMS WITH RANDOM INPUTS)

1) Linear system:
$f$ is called a linear system if it satisfies
$f\big[a_1 X_1(t) + a_2 X_2(t)\big] = a_1 f\big[X_1(t)\big] + a_2 f\big[X_2(t)\big]$
2) Time invariant system:
Let $Y(t) = f\big[X(t)\big]$. If $Y(t + h) = f\big[X(t + h)\big]$, then $f$ is called a time
invariant system.
3) Relation between input $X(t)$ and output $Y(t)$:
$Y(t) = \int_{-\infty}^{\infty} h(u)\,X(t - u)\,du$
where $h(u)$ is the system weighting function.
4) Relation between the power spectrum of $X(t)$ and the output $Y(t)$:
$S_{YY}(\omega) = S_{XX}(\omega)\,\big|H(\omega)\big|^2$
If $H(\omega)$ is not given, use $H(\omega) = \int_{-\infty}^{\infty} h(t)\,e^{-j\omega t}\,dt$.
5) Contour integral (one useful result):
$\int_{-\infty}^{\infty} \dfrac{e^{imx}}{a^2 + x^2}\,dx = \dfrac{\pi}{a}\,e^{-ma}$ (for $m > 0$, $a > 0$)
6) From the Fourier transform:
$F^{-1}\left\{\dfrac{1}{a^2 + \omega^2}\right\} = \dfrac{1}{2a}\,e^{-a|\tau|}$

---- All the Best ----
