Two Dimensional Random Variables
Two dimensional random variable
Let S be the sample space of a random experiment. Let X and Y be two random
variables defined on S. Then the pair (X,Y) is called a two dimensional random variable.
Discrete bivariate random variable
If both the random variables X and Y are discrete, then (X, Y) is called a discrete bivariate (two dimensional) random variable.
Joint Probability mass function
Let X take the values $\{x_1, x_2, \dots, x_n\}$ and Y take the values $\{y_1, y_2, \dots, y_m\}$. Then
$$p(x_i, y_j) = P(X = x_i, Y = y_j) = P(\{X = x_i\} \cap \{Y = y_j\}).$$
The collection $\{(x_i, y_j, p(x_i, y_j))\}$ is called the joint probability mass function.
Marginal Probability Mass Function of X
$\{(x_i, p_X(x_i))\}$ is called the marginal probability mass function of X, where
$$p_X(x_i) = \sum_{j=1}^{m} p(x_i, y_j).$$
Marginal Probability Mass Function of Y
$\{(y_j, p_Y(y_j))\}$ is called the marginal probability mass function of Y, where
$$p_Y(y_j) = \sum_{i=1}^{n} p(x_i, y_j).$$
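The two marginal sums above can be checked numerically. Here is a minimal Python sketch; the joint table `p` and its probabilities are invented purely for illustration:

```python
# Hypothetical joint pmf p(x_i, y_j) for X in {0, 1}, Y in {0, 1, 2};
# the probabilities are illustrative only and sum to 1.
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}
xs = sorted({x for (x, _) in p})
ys = sorted({y for (_, y) in p})

# Marginal pmf of X: p_X(x_i) = sum over j of p(x_i, y_j)
p_X = {x: sum(p[(x, y)] for y in ys) for x in xs}
# Marginal pmf of Y: p_Y(y_j) = sum over i of p(x_i, y_j)
p_Y = {y: sum(p[(x, y)] for x in xs) for y in ys}

# Each marginal is itself a pmf: it sums to 1.
print(round(sum(p_X.values()), 10))  # 1.0
print(round(sum(p_Y.values()), 10))  # 1.0
```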
Conditional Probability Mass Function
• Of X given $Y = y_j$: $\quad p_{X/Y}(x_i / y_j) = \dfrac{p(x_i, y_j)}{p_Y(y_j)}, \quad i = 1, 2, \dots, n$
• Of Y given $X = x_i$: $\quad p_{Y/X}(y_j / x_i) = \dfrac{p(x_i, y_j)}{p_X(x_i)}, \quad j = 1, 2, \dots, m$
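The two conditional formulas divide the joint pmf by the appropriate marginal. A small Python sketch (the joint table is invented for illustration) also confirms that each conditional pmf sums to 1 over its free variable:

```python
# Hypothetical joint pmf for X in {0, 1}, Y in {0, 1, 2};
# the numbers are made up for this sketch.
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}
xs = sorted({x for (x, _) in p})
ys = sorted({y for (_, y) in p})
p_X = {x: sum(p[(x, y)] for y in ys) for x in xs}
p_Y = {y: sum(p[(x, y)] for x in xs) for y in ys}

# Conditional pmf of X given Y = y: p(x, y) / p_Y(y)
def p_X_given_Y(x, y):
    return p[(x, y)] / p_Y[y]

# Conditional pmf of Y given X = x: p(x, y) / p_X(x)
def p_Y_given_X(y, x):
    return p[(x, y)] / p_X[x]

# For each fixed condition, the conditional pmf sums to 1.
for y in ys:
    assert abs(sum(p_X_given_Y(x, y) for x in xs) - 1.0) < 1e-12
for x in xs:
    assert abs(sum(p_Y_given_X(y, x) for y in ys) - 1.0) < 1e-12
```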
Independent Random Variables
Two random variables X and Y are said to be independent if
$$p(x_i, y_j) = p_X(x_i)\, p_Y(y_j), \quad i = 1, 2, \dots, n;\; j = 1, 2, \dots, m.$$
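The defining condition can be checked pointwise. In the sketch below the joint table is deliberately built to factorise, so the check succeeds by construction (all numbers are illustrative):

```python
# A joint pmf built to factorise as p(x, y) = p_X(x) * p_Y(y), so X and Y
# are independent by construction (illustrative numbers only).
p_X = {0: 0.4, 1: 0.6}
p_Y = {0: 0.25, 1: 0.45, 2: 0.30}
p = {(x, y): p_X[x] * p_Y[y] for x in p_X for y in p_Y}

# Independence requires p(x_i, y_j) = p_X(x_i) p_Y(y_j) at EVERY point.
independent = all(
    abs(p[(x, y)] - p_X[x] * p_Y[y]) < 1e-12 for x in p_X for y in p_Y
)
print(independent)  # True
```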
Dr. R. Sujatha / Dr. B. Praba, Maths Dept., SSNCE.
Continuous bivariate random variable
If X and Y are both continuous then (X,Y) is a continuous bivariate random variable.
Joint Probability Density Function
If (X, Y) is a two dimensional continuous random variable such that
$$P\!\left( x - \frac{dx}{2} \le X \le x + \frac{dx}{2} \;\cap\; y - \frac{dy}{2} \le Y \le y + \frac{dy}{2} \right) = f_{XY}(x, y)\,dx\,dy,$$
then f(x, y) is called the joint pdf of (X, Y), provided (i) $f(x, y) \ge 0, \ \forall (x, y) \in R_{XY}$ and (ii) $\iint_{R_{XY}} f(x, y)\,dx\,dy = 1$.
Joint Distribution Function
$$F_{XY}(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(x, y)\,dy\,dx$$
Note: $f(x, y) = \dfrac{\partial^2 F(x, y)}{\partial x\, \partial y}$
Marginal Probability Density Function
• Of X: $\quad f_X(x) = \displaystyle\int_{-\infty}^{\infty} f(x, y)\,dy$
• Of Y: $\quad f_Y(y) = \displaystyle\int_{-\infty}^{\infty} f(x, y)\,dx$
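As a concrete check of the two conditions on a joint pdf and of the marginal formula, take the density f(x, y) = x + y on the unit square (a textbook choice assumed here purely for illustration); then f_X(x) = x + 1/2. A midpoint Riemann sum reproduces both:

```python
# Illustrative joint pdf: f(x, y) = x + y on the unit square, 0 elsewhere.
def f(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

n = 400
h = 1.0 / n

# (ii) normalisation: the double integral of f over the region should be 1.
total = sum(
    f((i + 0.5) * h, (j + 0.5) * h) * h * h
    for i in range(n) for j in range(n)
)

# Marginal pdf of X: f_X(x) = integral of f(x, y) dy; here it equals x + 1/2.
def f_X(x):
    return sum(f(x, (j + 0.5) * h) * h for j in range(n))

print(round(total, 6))     # 1.0
print(round(f_X(0.3), 6))  # 0.8
```

The midpoint rule is exact for a linear integrand, which is why the sums match the analytic values so closely here.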
Marginal Probability Distribution Function
• Of X: $\quad F_X(x) = \displaystyle\int_{-\infty}^{x} f_X(x)\,dx$
• Of Y: $\quad F_Y(y) = \displaystyle\int_{-\infty}^{y} f_Y(y)\,dy$
Conditional Probability Density Function
• Of Y given X = x: $\quad f_{Y/X}(y/x) = \dfrac{f(x, y)}{f_X(x)}$
• Of X given Y = y: $\quad f_{X/Y}(x/y) = \dfrac{f(x, y)}{f_Y(y)}$
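For instance, with the illustrative density f(x, y) = x + y on the unit square (an assumed example, not from the notes above), the marginal and the conditional pdf of Y given X = x work out to:

```latex
f_X(x) = \int_0^1 (x + y)\,dy = x + \tfrac{1}{2}, \qquad
f_{Y/X}(y/x) = \frac{f(x, y)}{f_X(x)} = \frac{x + y}{x + \tfrac{1}{2}},
\quad 0 \le y \le 1,
```

and one can verify that $\int_0^1 f_{Y/X}(y/x)\,dy = 1$ for every fixed $x \in [0, 1]$.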
Independent random variables
X and Y are said to be independent if $f(x, y) = f_X(x)\, f_Y(y)$, i.e., $F(x, y) = F_X(x)\, F_Y(y)$.
Moments of two dimensional random variable
The (m, n)th moment of a two dimensional random variable (X, Y) is
$$\mu_{mn} = E(X^m Y^n) = \sum_i \sum_j x_i^m y_j^n\, p(x_i, y_j) \quad \text{(discrete case)}$$
$$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^m y^n f(x, y)\,dx\,dy \quad \text{(continuous case)}$$
Note: (1) When m = 1, n = 1 we have E(XY).
(2) When m = 1, n = 0 we have $E(X) = \sum_i \sum_j x_i\, p(x_i, y_j) = \sum_i x_i\, p_X(x_i)$ (discrete)
$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} x f_X(x)\,dx$ (continuous)
(3) When m = 0, n = 1 we have $E(Y) = \sum_i \sum_j y_j\, p(x_i, y_j) = \sum_j y_j\, p_Y(y_j)$ (discrete)
$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} y f_Y(y)\,dy$ (continuous)
(4) $E(X^2) = \sum_i x_i^2\, p_X(x_i)$ (discrete) $\;=\; \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$ (continuous)
(5) $E(Y^2) = \sum_j y_j^2\, p_Y(y_j)$ (discrete) $\;=\; \int_{-\infty}^{\infty} y^2 f_Y(y)\,dy$ (continuous)
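A short Python sketch (with an invented joint table, for illustration only) shows how the (m, n)th moment specialises to each of the cases in the note:

```python
# Hypothetical joint pmf (illustrative numbers only).
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

# (m, n)th moment: E(X^m Y^n) = sum over i, j of x^m y^n p(x, y)
def moment(m, n):
    return sum((x ** m) * (y ** n) * q for (x, y), q in p.items())

E_XY = moment(1, 1)  # case (1): m = 1, n = 1
E_X = moment(1, 0)   # case (2): m = 1, n = 0
E_Y = moment(0, 1)   # case (3): m = 0, n = 1
E_X2 = moment(2, 0)  # case (4)
E_Y2 = moment(0, 2)  # case (5)
print(round(E_X, 6), round(E_Y, 6), round(E_XY, 6))  # 0.6 1.05 0.65
```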
Covariance
$$\mathrm{Cov}(X, Y) = E[(X - \bar{X})(Y - \bar{Y})] = E(XY) - \bar{X}\,\bar{Y}$$
Note: (1) If X and Y are independent then E(XY)=E(X)E(Y). (Multiplication theorem
for expectation). Hence Cov(X,Y)=0.
(2) $\mathrm{Var}(aX + bY) = a^2\, \mathrm{Var}(X) + b^2\, \mathrm{Var}(Y) + 2ab\, \mathrm{Cov}(X, Y)$.
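Both parts of the note can be verified numerically. The sketch below uses an invented joint table and checks the Var(aX + bY) identity against a direct computation:

```python
# Hypothetical joint pmf (illustrative numbers only).
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def E(g):
    """Expectation of g(X, Y) under the joint pmf."""
    return sum(g(x, y) * q for (x, y), q in p.items())

mean_X, mean_Y = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: x * y) - mean_X * mean_Y  # Cov = E(XY) - E(X)E(Y)
var_X = E(lambda x, y: x * x) - mean_X ** 2
var_Y = E(lambda x, y: y * y) - mean_Y ** 2

# Check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y) directly.
a, b = 2.0, 3.0
mean_Z = E(lambda x, y: a * x + b * y)
var_Z = E(lambda x, y: (a * x + b * y - mean_Z) ** 2)
identity = a ** 2 * var_X + b ** 2 * var_Y + 2 * a * b * cov
print(abs(var_Z - identity) < 1e-9)  # True
```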
Correlation Coefficient
This is a measure of the linear relationship between any two random variables X and Y.
The Karl Pearson’s Correlation Coefficient is
$$\rho = r(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\, \sigma_Y}$$
Note: The correlation coefficient lies between −1 and 1.
Regression lines
The regression line of X on Y: $\quad X - \bar{X} = r\, \dfrac{\sigma_X}{\sigma_Y}\, (Y - \bar{Y})$
The regression line of Y on X: $\quad Y - \bar{Y} = r\, \dfrac{\sigma_Y}{\sigma_X}\, (X - \bar{X})$
Here r is the correlation coefficient of X and Y.
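The correlation coefficient and a regression slope can be computed from the moments. Here is a Python sketch with an invented joint table (all numbers illustrative):

```python
import math

# Hypothetical joint pmf (illustrative numbers only).
p = {
    (0, 0): 0.10, (0, 1): 0.20, (0, 2): 0.10,
    (1, 0): 0.15, (1, 1): 0.25, (1, 2): 0.20,
}

def E(g):
    return sum(g(x, y) * q for (x, y), q in p.items())

mean_X, mean_Y = E(lambda x, y: x), E(lambda x, y: y)
sigma_X = math.sqrt(E(lambda x, y: x * x) - mean_X ** 2)
sigma_Y = math.sqrt(E(lambda x, y: y * y) - mean_Y ** 2)
cov = E(lambda x, y: x * y) - mean_X * mean_Y

# Karl Pearson's correlation coefficient: rho = Cov(X, Y) / (sigma_X sigma_Y)
r = cov / (sigma_X * sigma_Y)
assert -1.0 <= r <= 1.0  # the coefficient always lies between -1 and 1

# Regression line of Y on X: Y - mean_Y = r * (sigma_Y / sigma_X) * (X - mean_X)
slope_Y_on_X = r * sigma_Y / sigma_X  # equals Cov(X, Y) / Var(X)

def predict_Y(x):
    return mean_Y + slope_Y_on_X * (x - mean_X)

print(round(r, 4))
```

Note that both regression lines pass through the point of means $(\bar{X}, \bar{Y})$, which `predict_Y(mean_X)` reproduces.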
Transformation of random variables
Let X, Y be random variables with joint pdf $f_{XY}(x, y)$ and let u(x, y) and v(x, y) be two continuously differentiable functions. Then U = u(X, Y) and V = v(X, Y) are random variables. In other words, the random variables (X, Y) are transformed to the random variables (U, V) by the transformation u = u(x, y), v = v(x, y).
The joint pdf gUV (u,v ) of the transformed variables U and V is given by
$$g_{UV}(u, v) = f_{XY}(x, y)\, |J|$$
where
$$|J| = \left| \frac{\partial(x, y)}{\partial(u, v)} \right| = \left| \det \begin{pmatrix} \dfrac{\partial x}{\partial u} & \dfrac{\partial y}{\partial u} \\[2mm] \dfrac{\partial x}{\partial v} & \dfrac{\partial y}{\partial v} \end{pmatrix} \right|,$$
i.e., it is the modulus of the Jacobian of the transformation, and $f_{XY}(x, y)$ is expressed in terms of u and v.
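As a worked illustration (a standard textbook choice, assumed here rather than taken from the notes), let U = X + Y and V = X − Y, so that x = (u + v)/2 and y = (u − v)/2. Then:

```latex
|J| = \left| \frac{\partial(x, y)}{\partial(u, v)} \right|
    = \left| \det \begin{pmatrix}
        \tfrac{1}{2} & \tfrac{1}{2} \\[1mm]
        \tfrac{1}{2} & -\tfrac{1}{2}
      \end{pmatrix} \right|
    = \left| -\tfrac{1}{2} \right| = \tfrac{1}{2},
\qquad
g_{UV}(u, v) = \tfrac{1}{2}\, f_{XY}\!\left( \frac{u + v}{2}, \frac{u - v}{2} \right).
```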