Bivariate Discrete Probability

Chapter 8 discusses Bivariate Discrete Probability Distributions, focusing on the concepts introduced by Jerzy Neyman, including joint, marginal, and conditional probability distributions of two discrete random variables. The chapter outlines the definitions and calculations necessary for understanding the relationships between two random variables, including their independence. It provides illustrative examples to demonstrate the application of these concepts in statistical analysis.


Chapter 8

Bivariate Discrete Probability Distributions

Jerzy Neyman (1894-1981) was a Polish mathematician and statistician who spent most of his professional career at the University of California, Berkeley. Neyman was the first to introduce the modern concept of a confidence interval into statistical hypothesis testing. Another noted contribution is the Neyman-Pearson lemma, the basis of hypothesis testing.

Jerzy Neyman
Contents:
8.1 Introduction
8.2 Bivariate (Joint) Probability Distribution
8.3 Marginal Probability Distribution
8.4 Independence of Two Discrete Random Variables
8.5 Conditional Probability Distribution
8.6 Joint Distribution Function of a Two-Dimensional Discrete r.v.
8.7 Probability Distribution of (X + Y) when X and Y are Independent
Key Words:
Joint p.m.f., Marginal p.m.f., Conditional p.m.f., Joint c.d.f.
Objectives:
Understand the need of bivariate random variables.
Compute probabilities of different events, marginal probabilities, conditional probabilities etc.
Understand how the conditional distribution of X given Y is a univariate probability distribution.
Understand independence of random variables through the joint p.m.f.
F.Y.B.Sc. Statistics (Paper I)

8.1 Introduction
So far we have learnt what is meant by a discrete random variable, its probability distribution and expectation. Throughout, we assumed that a single characteristic, say X, is of interest. For example, X was the number on the face of a die, the number of heads when 3 coins are tossed, the sum of two faces when two dice are thrown, etc. Hence, the set-up was concerning a single r.v. However, there are many situations when we are interested in two characteristics, say X and Y, at the same time, related to the same item. That is, there are two r.v.s. to be observed simultaneously. In other words, X and Y are two r.v.s. defined on the same sample space Ω. The following examples illustrate this point.
1. In a family planning survey, the number of children (X), as well as the number of girls (Y), in a family are recorded.
2. The sex of a person (X) as well as whether he is a smoker or not (Y) is recorded.
3. In forestry, diseased plants from two species are identified.
X = 1 if the plant is diseased and X = 0 otherwise.
Y = 1 if the plant is of species I and Y = 0 otherwise.
4. X may be the sum of two numbers and Y may be the maximum of two numbers when two dice are thrown.
5. X may be the sale in kg of tea of Brand A while Y may be the sale of tea of Brand B.
These situations demand handling two variables at a time. Hence, we combine them into an ordered pair, say (X, Y), and call it a bivariate random variable or two-dimensional r.v. Note that both X and Y are defined on the same sample space Ω. Whenever both X and Y are discrete (finite in this course), we say that (X, Y) is a two-dimensional discrete r.v.
Definition 1: Bivariate (two-dimensional) discrete r.v.: Let Ω be the sample space corresponding to a random experiment. Let X and Y be two real valued functions (r.v.s) defined on Ω. The ordered pair (X, Y) defined by (X, Y)(ω) = (X(ω), Y(ω)) is called a bivariate or two-dimensional random variable on Ω. If the number of possible values of (X, Y) is finite, then (X, Y) is called a bivariate discrete r.v.
Note that (X, Y) is discrete if and only if X and Y are both discrete. Hence, if X takes values x_1, x_2, ..., x_m and Y takes values y_1, y_2, ..., y_n, then the range space of (X, Y) is
R(X, Y) = {(x_i, y_j): i = 1, 2, ..., m; j = 1, 2, ..., n}
and R(X, Y) is nothing but the cartesian product of R_X and R_Y, the ranges of X and Y respectively. R(X, Y) contains m × n points.
8.2 Bivariate (Joint) Probability Distribution
Recall that the p.m.f. P(x) of a single r.v. X was derived by adding the probabilities of elements in Ω which give rise to the event [X = x]. On similar lines one can construct the joint probability mass function of the bivariate r.v. (X, Y).
Definition 2: Joint Probability Mass Function of (X, Y): Let (X, Y) be a discrete bivariate r.v. defined on Ω. Let the range space of (X, Y) be
R(X, Y) = {(x_i, y_j): i = 1, 2, ..., m; j = 1, 2, ..., n}
For each (x_i, y_j) ∈ R(X, Y) we define a function P(x_i, y_j) = p_ij as follows:
p_ij = P(x_i, y_j) = P[X = x_i, Y = y_j]; i = 1, 2, ..., m; j = 1, 2, ..., n
If (i) p_ij ≥ 0 for all i, j
(ii) Σ_{i=1}^{m} Σ_{j=1}^{n} p_ij = 1,
then the above function P is called the joint probability mass function of (X, Y).
The set {(x_i, y_j, p_ij): i = 1, 2, ..., m; j = 1, 2, ..., n} is called the joint probability distribution of (X, Y), which can be represented in the following tabular form.
P[X = x_i, Y = y_j] is obtained by adding the probabilities of those elements in Ω which give rise to the event [X = x_i] ∩ [Y = y_j].
Table 8.1

  X \ Y |  y_1   y_2  ...  y_j  ...  y_n  | Total
  x_1   |  p_11  p_12 ...  p_1j ...  p_1n | p_1.
  x_2   |  p_21  p_22 ...  p_2j ...  p_2n | p_2.
  ...
  x_i   |  p_i1  p_i2 ...  p_ij ...  p_in | p_i.
  ...
  x_m   |  p_m1  p_m2 ...  p_mj ...  p_mn | p_m.
  Total |  p_.1  p_.2 ...  p_.j ...  p_.n | 1

The row totals p_1., p_2., ..., p_m. represent marginal probabilities of X, while the column totals p_.1, p_.2, ..., p_.n represent marginal probabilities of Y.
8.3 Marginal Probability Distributions
Let {(x_i, y_j, p_ij): i = 1, ..., m; j = 1, ..., n} represent a joint probability distribution of a two-dimensional discrete r.v. (X, Y). We know that by p_ij we mean P[X = x_i, Y = y_j]. Using p_ij we can obtain the (univariate) probability distribution of X as follows:
p_i. = P(X = x_i)
     = P[ ∪_{j=1}^{n} {(X = x_i) ∩ (Y = y_j)} ]   (using the distributive property)
     = P[ ∪_{j=1}^{n} (X = x_i, Y = y_j) ]
     = Σ_{j=1}^{n} P(X = x_i, Y = y_j),
since the events (X = x_i, Y = y_j) for fixed x_i and different y_j are disjoint. Hence,
p_i. = Σ_{j=1}^{n} p_ij;  i = 1, 2, ..., m.
The probability distribution obtained above is called the marginal probability distribution of X. It can be presented in the following tabular form.

  X          |  x_1   x_2  ...  x_i  ...  x_m  | Total
  P(X = x_i) |  p_1.  p_2. ...  p_i. ...  p_m. | 1

On similar lines we can derive the marginal probability distribution of Y:
p_.j = P(Y = y_j)
     = P[ ∪_{i=1}^{m} (X = x_i, Y = y_j) ]
     = Σ_{i=1}^{m} p_ij;  j = 1, 2, ..., n.

  Y          |  y_1   y_2  ...  y_j  ...  y_n  | Total
  P(Y = y_j) |  p_.1  p_.2 ...  p_.j ...  p_.n | 1

Obviously, Σ_{i=1}^{m} p_i. = Σ_{j=1}^{n} p_.j = 1.
Thus, the row totals in Table 8.1 give the probabilities P(X = x_i) for various values of x_i, while the column totals give the probabilities P(Y = y_j) for various values of y_j.
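The row-and-column-sum computation above can be sketched in Python. The joint p.m.f. below is a made-up illustration (not one of the chapter's examples); exact fractions are used to match the textbook's arithmetic.

```python
# A minimal sketch: p_i. and p_.j are the row and column sums of the joint table.
from fractions import Fraction as F
from collections import defaultdict

joint = {  # p_ij = P[X = x_i, Y = y_j]; hypothetical values
    (0, 0): F(1, 10), (0, 1): F(2, 10),
    (1, 0): F(3, 10), (1, 1): F(4, 10),
}

def marginals(joint):
    """Return (p_i., p_.j): the marginal p.m.f.s of X and Y."""
    px, py = defaultdict(F), defaultdict(F)
    for (x, y), p in joint.items():
        px[x] += p   # sum over j for fixed i (row total)
        py[y] += p   # sum over i for fixed j (column total)
    return dict(px), dict(py)

px, py = marginals(joint)
```

Both marginals automatically sum to 1, since they repartition the same total probability.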
Illustrative Examples
Example 8.1: Two fair dice are thrown. Let X denote the absolute difference between the two scores and Y denote the maximum of the two scores. (i) Derive the joint probability distribution of (X, Y). (ii) Also obtain the marginal probability distributions of X and Y.
Solution: We know that Ω contains 36 elements, each with probability 1/36. The possible values of X are 0, 1, 2, 3, 4, 5 and those of Y are 1, 2, 3, 4, 5, 6. Now, consider P(X = 0, Y = 1). It means that the difference between the two numbers is 0 and the maximum of the two numbers is 1. This can happen only when (1, 1) occurs. That is, when both the faces show 1.
P(X = 0, Y = 1) = 1/36
Similarly, P(X = 0, Y = 2) = P(2, 2) = 1/36
...
P(X = 0, Y = 6) = P(6, 6) = 1/36
Further, P(X = 1, Y = 1) = probability that the difference between the two numbers is 1 while the maximum is 1. This is an impossible event, as no face of the die is marked as 0.
P(X = 1, Y = 1) = 0
Now, P(X = 1, Y = 2) = P[(1, 2) ∪ (2, 1)] = P(1, 2) + P(2, 1) = 2/36
Similarly, P(X = 1, Y = 3) = 2/36, ..., P(X = 1, Y = 6) = 2/36.
Continuing the above arguments on similar lines, we see that the following is the joint probability distribution of (X, Y).

  X \ Y |   1     2     3     4     5     6   | Marginal distribution of X
    0   |  1/36  1/36  1/36  1/36  1/36  1/36 |  6/36
    1   |   0    2/36  2/36  2/36  2/36  2/36 | 10/36
    2   |   0     0    2/36  2/36  2/36  2/36 |  8/36
    3   |   0     0     0    2/36  2/36  2/36 |  6/36
    4   |   0     0     0     0    2/36  2/36 |  4/36
    5   |   0     0     0     0     0    2/36 |  2/36
  Marginal distribution of Y | 1/36  3/36  5/36  7/36  9/36  11/36 | 1

The marginal distributions of X and Y are given by the row and column totals of the table.
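The joint table of Example 8.1 can be checked by brute-force enumeration of the 36 outcomes; a sketch in Python:

```python
# Enumerate the 36 equally likely outcomes of two fair dice and tabulate
# X = |a - b| and Y = max(a, b), reproducing the table of Example 8.1.
from fractions import Fraction as F
from collections import Counter

joint = Counter()
for a in range(1, 7):
    for b in range(1, 7):
        joint[(abs(a - b), max(a, b))] += F(1, 36)

# Row totals give the marginal p.m.f. of X.
marg_x = Counter()
for (x, _), p in joint.items():
    marg_x[x] += p
```

For instance, `joint[(1, 2)]` pools the outcomes (1, 2) and (2, 1), giving 2/36, and the impossible event (X = 1, Y = 1) simply never appears in the counter.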

Example 8.2: Two beads are selected at random without replacement from a bowl containing 4 blue, 1 red and 2 black beads. Let X denote the number of red beads drawn and Y denote the number of black beads drawn. (i) Find the joint p.m.f. of (X, Y). (ii) Obtain the marginal probability distributions of X and Y. (iii) Find P(X < Y).
Solution: (i) The values of X are 0, 1, as there is a single red bead in the bowl, while the values of Y are 0, 1, 2. Two beads can be selected from the 7 beads in C(7, 2) = 21 equally likely ways.
Now, P(X = 0, Y = 0) = probability that neither a red nor a black bead is selected, i.e., both are blue. Therefore,
P(X = 0, Y = 0) = C(4, 2)/21 = 6/21
P(X = 0, Y = 1) = probability that no red bead is selected and 1 black bead is selected. This means selection of 1 black and 1 blue bead.
P(X = 0, Y = 1) = C(2, 1) C(4, 1)/21 = 8/21
Continuing the logic, we get
P(X = 0, Y = 2) = C(2, 2)/21 = 1/21
P(X = 1, Y = 0) = C(1, 1) C(4, 1)/21 = 4/21
P(X = 1, Y = 1) = C(1, 1) C(2, 1)/21 = 2/21
P(X = 1, Y = 2) = 0, as only two beads are selected. Writing these probabilities in the form of a table, we get

  X \ Y |   0      1      2   | Total
    0   |  6/21   8/21   1/21 | 15/21
    1   |  4/21   2/21    0   |  6/21
  Total | 10/21  10/21   1/21 |  1

(ii) Hence, the marginal probability distributions are

  X    |   0      1
  P(x) | 15/21   6/21

  Y    |   0      1     2
  P(y) | 10/21  10/21  1/21

(iii) The event (X < Y) refers to the sample points (0, 1), (0, 2), (1, 2).
P(X < Y) = P(0, 1) + P(0, 2) + P(1, 2) = 8/21 + 1/21 + 0 = 9/21 = 3/7
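The counting argument of Example 8.2 can also be verified by listing all 21 draws directly:

```python
# Brute-force check of Example 8.2: list all C(7, 2) = 21 equally likely
# unordered pairs of beads and tabulate X = number of red, Y = number of black.
from itertools import combinations
from fractions import Fraction as F
from collections import Counter

bowl = ['blue'] * 4 + ['red'] + ['black'] * 2
pairs = list(combinations(range(len(bowl)), 2))   # 21 unordered draws

joint = Counter()
for i, j in pairs:
    x = (bowl[i] == 'red') + (bowl[j] == 'red')
    y = (bowl[i] == 'black') + (bowl[j] == 'black')
    joint[(x, y)] += F(1, len(pairs))

p_x_less_y = sum(p for (x, y), p in joint.items() if x < y)
```

The pooled counts reproduce the table above, and the event (X < Y) again comes out to 9/21 = 3/7.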

8.4 Independence of Two Discrete Random Variables
In an earlier chapter, we defined the independence of two events. Events A and B are independent if and only if P(A ∩ B) = P(A) P(B). Extending this definition to two discrete r.v.s X and Y, we say that X and Y are independent if and only if the events [X = x_i] and [Y = y_j] are independent for all values of X and Y. The definition in terms of p.m.f.s is as follows.
Definition 3: Let (X, Y) be a discrete bivariate r.v. defined on a sample space Ω. Let p_ij = P(X = x_i, Y = y_j); i = 1, ..., m; j = 1, ..., n denote the joint p.m.f. of (X, Y). Let p_i., i = 1, ..., m and p_.j, j = 1, ..., n be the marginal p.m.f.s of X and Y respectively. Then X and Y are called independent r.v.s if and only if
p_ij = p_i. × p_.j;  i = 1, ..., m; j = 1, ..., n.
In other words,
P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j) for all i and j.
Remark: In order to ascertain independence of two r.v.s, it is essential to show that every entry in the table of the joint probability distribution is equal to the product of the corresponding row and column totals. On the other hand, if p_ij ≠ p_i. × p_.j for even a single p_ij, the r.v.s are not independent. That is, they are dependent.
Example 8.3: Let (X, Y) be a discrete bivariate r.v. with the following joint p.m.f.

  X \ Y |  0    1    2    3
    0   |  k   3k   2k   4k
    1   | 2k   6k   4k   8k
    2   | 3k   9k   6k  12k

(i) Find k.
(ii) Are X and Y independent?
(iii) Calculate: (a) P(X ≤ 2, Y ≥ 1), (b) P(X = Y), (c) P(X + Y ≤ 1), (d) P(X² + Y² ≤ 4).
Solution: (i) The condition Σ_i Σ_j p_ij = 1 implies
60k = 1, so k = 1/60.
(ii) The joint p.m.f. as well as the marginal p.m.f.s are as follows:

  X \ Y |  0      1      2      3    | p_i.
    0   | 1/60   3/60   2/60   4/60  | 10/60
    1   | 2/60   6/60   4/60   8/60  | 20/60
    2   | 3/60   9/60   6/60  12/60  | 30/60
  p_.j  | 6/60  18/60  12/60  24/60  | 1

P(X = 0, Y = 0) = 1/60 = (10/60)(6/60) = P(X = 0) P(Y = 0)
P(X = 0, Y = 1) = 3/60 = (10/60)(18/60) = P(X = 0) P(Y = 1)
Verify that p_ij = p_i. × p_.j holds for all i and j in this table. Hence, X and Y are independent.
(iii) (a) P(X ≤ 2, Y ≥ 1) = P(X ≤ 2) P(Y ≥ 1), since X and Y are independent
= 1 × 54/60 = 0.9
(b) P(X = Y) = P[(0, 0) ∪ (1, 1) ∪ (2, 2)] = 1/60 + 6/60 + 6/60 = 13/60
(c) P(X + Y ≤ 1) = P[(0, 0) ∪ (0, 1) ∪ (1, 0)] = 1/60 + 3/60 + 2/60 = 6/60 = 0.1
(d) P(X² + Y² ≤ 4) = P[(0, 0) ∪ (0, 1) ∪ (0, 2) ∪ (1, 0) ∪ (1, 1) ∪ (2, 0)] = 1/60 + 3/60 + 2/60 + 2/60 + 6/60 + 3/60 = 17/60
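The cell-by-cell independence check of Example 8.3 is mechanical and easy to script:

```python
# Checking Example 8.3 numerically: with k = 1/60, every cell equals the
# product of its row and column totals, so X and Y are independent.
from fractions import Fraction as F

k = F(1, 60)
coeff = {(0, 0): 1, (0, 1): 3, (0, 2): 2, (0, 3): 4,
         (1, 0): 2, (1, 1): 6, (1, 2): 4, (1, 3): 8,
         (2, 0): 3, (2, 1): 9, (2, 2): 6, (2, 3): 12}
joint = {xy: c * k for xy, c in coeff.items()}

# Marginals as row and column sums.
px = {x: sum(p for (i, _), p in joint.items() if i == x) for x in range(3)}
py = {y: sum(p for (_, j), p in joint.items() if j == y) for y in range(4)}

independent = all(joint[(x, y)] == px[x] * py[y]
                  for x in range(3) for y in range(4))
```

A single failing cell would make `independent` False, which is exactly the Remark preceding this example.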
8.5 Conditional Probability Distributions
The conditional probability of event A given event B is defined as
P(A|B) = P(A ∩ B)/P(B),  P(B) > 0.
Similarly, the conditional probability of (X = x_i) given that (Y = y_j) is defined as
P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j)/P(Y = y_j) = p_ij/p_.j.
When Y is fixed at y_j and we obtain the conditional probabilities of each of x_i, i = 1, ..., m, given the event (Y = y_j), what we get is the conditional probability distribution of X given Y = y_j. It is represented as follows:

  X                    |  x_1        x_2       ...  x_m       | Total
  P(X = x_i | Y = y_j) |  p_1j/p_.j  p_2j/p_.j ...  p_mj/p_.j | 1

When we fix X at a particular value x_i, say, and obtain the conditional probabilities of (Y = y_j) for all j, given X = x_i, then we get the conditional probability distribution of Y given X = x_i:
P(Y = y_j | X = x_i) = P(X = x_i, Y = y_j)/P(X = x_i) = p_ij/p_i.
It is tabulated as follows.

  Y                    |  y_1        y_2       ...  y_n       | Total
  P(Y = y_j | X = x_i) |  p_i1/p_i.  p_i2/p_i. ...  p_in/p_i. | 1

Remark 1: Notice that when we obtain the conditional distribution of X given Y = y_j, the underlying variable is X and not Y. In fact, Y is held constant at y_j. Similarly, in the case of the conditional distribution of Y given X = x_i, the variable is Y and not X; X is held constant at x_i. Accordingly, conditional probability distributions are univariate probability distributions.
Remark 2:
Σ_{i=1}^{m} P(X = x_i | Y = y_j) = Σ_{i=1}^{m} p_ij/p_.j = p_.j/p_.j = 1
and Σ_{j=1}^{n} P(Y = y_j | X = x_i) = Σ_{j=1}^{n} p_ij/p_i. = p_i./p_i. = 1.
These conditions have to be fulfilled as these are p.m.f.s.
Remark 3: When X and Y are independent, the conditional probability distributions become nothing but the respective marginal distributions. For,
P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j)/P(Y = y_j)
= P(X = x_i) P(Y = y_j)/P(Y = y_j)   (by independence)
= p_i. p_.j/p_.j = p_i.  for i = 1, ..., m.
Also, P(Y = y_j | X = x_i) = p_i. p_.j/p_i. = p_.j  for j = 1, 2, ..., n.
This is consistent with P(A|B) = P(A) when A and B are independent.
Remark 4: For obtaining the conditional probability distribution of X given Y = y_j, mark the column of Y = y_j and divide each entry in this column by its column total. Similarly, to get the conditional probability distribution of Y given X = x_i, mark the i-th row and divide each value in this row by the row total.
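Remark 4's column-division recipe translates directly into code. The sketch below assumes a joint table stored as a dict keyed by (x, y) pairs, as in the earlier examples; the small table at the end is a made-up demonstration.

```python
# Conditional p.m.f. of X given Y = y, per Remark 4: take the column of
# Y = y in the joint table and divide each entry by the column total p_.j.
from fractions import Fraction as F

def conditional_x_given_y(joint, y):
    col = {x: p for (x, yy), p in joint.items() if yy == y}
    total = sum(col.values())        # p_.j; must be positive
    return {x: p / total for x, p in col.items()}

# Hypothetical joint table for illustration:
joint = {(0, 0): F(1, 4), (1, 0): F(1, 4), (0, 1): F(1, 2)}
cond = conditional_x_given_y(joint, 0)
```

Dividing the column by its total guarantees the result sums to 1, which is Remark 2 in code form.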

Illustrative Examples
Example 8.4: Following is the joint probability distribution of (X, Y).

  X \ Y |   1     2     3     4
    0   |  1/24  1/12  1/12  1/24
    1   |  1/12  1/6   1/6   1/12
    2   |  1/24  1/12  1/12  1/24

(i) Obtain the conditional probability distribution of X given Y = 2.
(ii) Obtain the conditional probability distribution of Y given X = 1.
(iii) Calculate P(X ≤ 1 | Y = 1).
(iv) Calculate P(Y ≥ 3 | X ≥ 1).
(v) Find the probability distribution of X + Y.
(vi) Find the probability distribution of XY.
(vii) Find the probability distribution of 2X - Y + 1.
Solution: (i) We want the conditional probability distribution of X given Y = 2.
P(X = x_i | Y = 2) = P(X = x_i, Y = 2)/p_.2
As per Remark 4, consider the column of Y = 2, which is 1/12, 1/6, 1/12, with column total 1/3. Hence, the conditional probabilities are

  X            |  0    1    2
  P(X | Y = 2) | 1/4  1/2  1/4

(ii) We want the conditional probability distribution of Y given X = 1. The row of X = 1 is 1/12, 1/6, 1/6, 1/12, with row total 1/2. Hence, the conditional probability distribution of Y given X = 1 is as follows.

  Y            |  1    2    3    4
  P(Y | X = 1) | 1/6  1/3  1/3  1/6

(iii) In order to calculate P(X ≤ 1 | Y = 1), we first obtain the conditional probability distribution of X given Y = 1, which is as given below.

  X            |  0    1    2
  P(X | Y = 1) | 1/4  1/2  1/4

Hence, P(X ≤ 1 | Y = 1) = P(X = 0 | Y = 1) + P(X = 1 | Y = 1) = 1/4 + 1/2 = 3/4.
(iv) P(Y ≥ 3 | X ≥ 1) = P(X ≥ 1, Y ≥ 3)/P(X ≥ 1)
The marginal p.m.f. of X is

  X    |  0    1    2
  P(x) | 1/4  1/2  1/4

P(X ≥ 1) = 3/4, and P(X ≥ 1, Y ≥ 3) = 1/6 + 1/12 + 1/12 + 1/24 = 3/8. Therefore,
P(Y ≥ 3 | X ≥ 1) = (3/8)/(3/4) = 1/2.
To answer (v), (vi), (vii) we prepare a table which evaluates the functions of the r.v.s X and Y at each pair of values, then rearrange the values and write the p.m.f.s. Let U = X + Y, V = XY and W = 2X - Y + 1. The pairs (X, Y) are written in a column to evaluate U, V, W.

  (X, Y)  P(x, y)  U = X + Y  V = XY  W = 2X - Y + 1
  (0, 1)   1/24        1         0          0
  (0, 2)   1/12        2         0         -1
  (0, 3)   1/12        3         0         -2
  (0, 4)   1/24        4         0         -3
  (1, 1)   1/12        2         1          2
  (1, 2)   1/6         3         2          1
  (1, 3)   1/6         4         3          0
  (1, 4)   1/12        5         4         -1
  (2, 1)   1/24        3         2          4
  (2, 2)   1/12        4         4          3
  (2, 3)   1/12        5         6          2
  (2, 4)   1/24        6         8          1

Collecting equal values and adding their probabilities:

  U = X + Y |   1     2     3     4     5     6   | Total
  P(U)      |  1/24  4/24  7/24  7/24  4/24  1/24 |   1

  V = XY    |   0     1     2     3     4     6     8   | Total
  P(V)      |  6/24  2/24  5/24  4/24  4/24  2/24  1/24 |   1

  W = 2X - Y + 1 |  -3    -2    -1     0     1     2     3     4   | Total
  P(W)           | 1/24  2/24  4/24  5/24  5/24  4/24  2/24  1/24 |   1
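The evaluate-then-pool procedure of Example 8.4 (v)-(vii) is a one-liner per function once the joint table is in code:

```python
# Example 8.4 (v)-(vii): the p.m.f. of a function of (X, Y) is found by
# evaluating it at every support point and pooling equal values.
from fractions import Fraction as F
from collections import Counter

rows = [[F(1, 24), F(1, 12), F(1, 12), F(1, 24)],
        [F(1, 12), F(1, 6),  F(1, 6),  F(1, 12)],
        [F(1, 24), F(1, 12), F(1, 12), F(1, 24)]]
joint = {(x, y): p for x, row in enumerate(rows)
         for y, p in zip([1, 2, 3, 4], row)}

def pmf_of(fn, joint):
    out = Counter()
    for (x, y), p in joint.items():
        out[fn(x, y)] += p      # pool probability onto the value fn(x, y)
    return dict(out)

pmf_u = pmf_of(lambda x, y: x + y, joint)          # U = X + Y
pmf_v = pmf_of(lambda x, y: x * y, joint)          # V = XY
pmf_w = pmf_of(lambda x, y: 2 * x - y + 1, joint)  # W = 2X - Y + 1
```

The three resulting dictionaries reproduce the P(U), P(V) and P(W) tables above.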
Example 8.5: The joint p.m.f. of (X, Y) is given by
P(x, y) = C(x² + y²),  C > 0;  x = -1, 1;  y = -2, 2
        = 0  otherwise.
Obtain (i) C, (ii) the marginal p.m.f.s of X and Y, (iii) the conditional p.m.f. of X given Y = 2, (iv) Are X and Y independent?
Solution: (i) To find C, consider Σ_x Σ_y P(x, y) = 1.
Σ_{x=-1,1} Σ_{y=-2,2} C(x² + y²) = C(20) = 1, so C = 1/20.
(ii) The marginal p.m.f. of X is given by
P(x) = Σ_{y=-2,2} P(x, y) = C(x² + 4 + x² + 4) = (x² + 4)/10  for x = -1, 1.
The marginal p.m.f. of Y is
P(y) = Σ_{x=-1,1} P(x, y) = C(1 + y² + 1 + y²) = (1 + y²)/10  for y = -2, 2.
(iii) The conditional p.m.f. of X given Y = 2 is
P(x | y = 2) = P(x, y = 2)/P(y = 2) = C(x² + 4)/[(1 + 4)/10] = (x² + 4)/10  for x = -1, 1.
(iv) Observe that the joint p.m.f. of (X, Y) can be tabulated as follows:

  X \ Y |  -2    2  | P(x)
   -1   | 1/4  1/4  | 1/2
   +1   | 1/4  1/4  | 1/2
  P(y)  | 1/2  1/2  |  1

Clearly X and Y are independent since P(x, y) = P(x) P(y) for all (x, y).
Bivariate Discrete Probability
F.Y.B.Sc. Statistics (Paper - I) 8.14
Distributions
Joint Distribution
B.6. Definition Function of Two Dimensional Discrete R
4 : Let (X, Y) be a bivariate discrete r.v. defined on 2. The joint (cumulative)
distribution function F (x. y) is defined as,
yl, X, ye R
F(x. y) = P[X<x, Y< distribution function.
the important properties of the
We state below some of
1. 0<F(x. y)s1
function is non-decrèasing in each of the variables. i.e.
2. The
(i) F(«. y) sF (x. y) ify <y
ifx <x*
(ii) F(x. y) sF (x. y)
lim
3 x’- F(x, y) = 0.

lim
4. F(x, y) =l
5 Let a, b, c, d be any real numbers with a<b, c<d. Then,
c) -F (a, d)
P [a< Xsb, c< Ys d] = F(b, d) + F(a. c) - F(b,
Example 8.6: Following is the joint probability distribution of (X, Y) (the table of Example 8.4).

  X \ Y |   1     2     3     4
    0   |  1/24  1/12  1/12  1/24
    1   |  1/12  1/6   1/6   1/12
    2   |  1/24  1/12  1/12  1/24

Calculate (i) F(1, 1), (ii) F(1, 3), (iii) F(0.5, 2.5).
Solution:
(i) F(1, 1) = P(X ≤ 1, Y ≤ 1) = P(0, 1) + P(1, 1) = 1/24 + 1/12 = 1/8
(ii) F(1, 3) = P(X ≤ 1, Y ≤ 3) = P[(0, 1) ∪ (0, 2) ∪ (0, 3) ∪ (1, 1) ∪ (1, 2) ∪ (1, 3)] = 15/24 = 5/8
(iii) F(0.5, 2.5) = P(X ≤ 0.5, Y ≤ 2.5) = P(X ≤ 0, Y ≤ 2) = 1/24 + 1/12 = 1/8
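For a discrete r.v. the joint c.d.f. is just a sum over the support, so Example 8.6 can be recomputed directly:

```python
# Example 8.6 recomputed: F(x, y) = P[X <= x, Y <= y] as a direct sum over
# the support of the joint p.m.f. (the table of Example 8.4).
from fractions import Fraction as F

rows = [[F(1, 24), F(1, 12), F(1, 12), F(1, 24)],
        [F(1, 12), F(1, 6),  F(1, 6),  F(1, 12)],
        [F(1, 24), F(1, 12), F(1, 12), F(1, 24)]]
joint = {(x, y): p for x, row in enumerate(rows)
         for y, p in zip([1, 2, 3, 4], row)}

def cdf(x, y):
    return sum(p for (a, b), p in joint.items() if a <= x and b <= y)
```

Note that non-integer arguments like F(0.5, 2.5) need no special handling: the inequality simply picks out the support points (0, 1) and (0, 2).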

8.7 Probability Distribution of (X + Y) when X and Y are Independent
Let (X, Y) be a bivariate r.v. with range space {(x_i, y_j): i = 1, 2, ..., m; j = 1, 2, ..., n}. Let P_1(x) and P_2(y) denote the marginal p.m.f.s of X and Y respectively. Define Z = X + Y. Consider
P(Z = z) = Σ_i P[X = x_i, Y = z - x_i]
         = Σ_i P[X = x_i] P[Y = z - x_i]   (since X and Y are independent)
         = Σ_i P_1(x_i) P_2(z - x_i).
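The convolution formula above can be sketched as code; the two-dice example at the end is an illustration of ours, not one of the chapter's examples.

```python
# P(Z = z) = sum over x of P1(x) * P2(z - x), for independent X and Y.
# Iterating over all (x, y) pairs and pooling onto z = x + y is equivalent.
from fractions import Fraction as F
from collections import Counter

def pmf_of_sum(p1, p2):
    pz = Counter()
    for x, px in p1.items():
        for y, py in p2.items():
            pz[x + y] += px * py   # independence: joint prob is the product
    return dict(pz)

# Illustration: the sum of two fair dice.
die = {k: F(1, 6) for k in range(1, 7)}
total = pmf_of_sum(die, die)
```

The result is the familiar triangular distribution on 2, ..., 12, peaking at 7.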

Points to Remember
• Summing bivariate probabilities over the values of one variable, one can compute the marginal p.m.f. of the other variable.
• Conditional p.m.f. is the joint probability divided by the marginal probability of the given variable.
• Two variables are independent if and only if the joint p.m.f. is the product of the individual marginal p.m.f.s of the two variables.
• P(X = x_i | Y = y_j) = p_ij/p_.j and P(Y = y_j | X = x_i) = p_ij/p_i.
• X and Y are independent random variables if P(X = x_i, Y = y_j) = p_ij = p_i. × p_.j for all (i, j).
Exercise 8 (A)
Theory Questions:
1. Give some practical situations where the use of bivariate r.v.s is needed.
2. Define: (i) a bivariate discrete r.v.
(ii) Range space of a bivariate discrete r.v.
(iii) Joint probability mass function of (X, Y).
3. Explain what is meant by the marginal probability distributions of X and Y.
4. When are two discrete r.v.s said to be independent?
5. Explain what you understand by
(i) Conditional distribution of X given Y = y_j
(ii) Conditional distribution of Y given X = x_i.
6. Show that the conditional p.m.f. of X given Y = y_j and the conditional p.m.f. of Y given X = x_i satisfy the conditions of a probability mass function.
7. Show that when X and Y are independent, the conditional distribution of X given Y = y_j is the marginal distribution of X.
8. Define the joint distribution function of a two-dimensional discrete r.v. Also state its important properties.
9. Let (X, Y) be a discrete bivariate r.v. with marginal p.m.f.s of X and Y as P_1(x) and P_2(y) respectively. Obtain the expression for the p.m.f. of Z = X + Y.
Exercise 8 (B)
Numerical Problems:
10. Suppose a fair coin is tossed thrice. Let X denote the number of heads and Y denote the number of tails. Obtain the joint probability distribution of (X, Y). Also obtain the marginal distributions of X and Y.
11. Let X and Y denote the sum and difference (absolute) of scores obtained when two fair dice are thrown. Obtain the joint p.m.f. of X and Y. Hence, calculate (i) P(X = ), (ii) P(X - Y = 2), (iii) P(X = 3).
12. Two fruits are to be selected at random from 4 mangoes, 2 oranges and 3 apples. Let X and Y denote respectively the mangoes and oranges selected. Obtain the joint probability distribution of (X, Y).
13. Let X and Y be two discrete r.v.s having joint p.m.f.
P(x, y) = kxy;  x = 1, 2, 3;  y = 1, 2, 3
        = 0  otherwise.
Find (i) k, (ii) P(X + Y > 5), (iii) Are X and Y independent? Justify.
14. A r.v. (X, Y) has joint p.m.f. as follows.

  X \ Y |  0    1    2
    0   | 0.1  0.2  0.3
    1   | 0.1  0.1  0.2

Find: (i) Marginal distributions of X and Y.
(ii) Conditional probability distribution of X given Y = 1.
15. Suppose two cards are drawn at random without replacement from a set of three cards numbered 0, 1, 2. If X denotes the number on the first card drawn and Y denotes the number on the second card, obtain (i) the joint p.m.f. of (X, Y), (ii) P(XY > 0), (iii) P(X + Y ≤ ).
16. The joint p.m.f. of (X, Y) is as follows:

  X \ Y |  -2    0    2
   -1   |  0.1  0.2  0.1
    0   |  0.2  0.1  0.1
    1   |  0.1  0.1  0.0

Find: (i) The conditional probability distribution of X when Y = 0.
(ii) P(X + Y < 2) (iii) P(X² + Y² < 3)
(iv) P(Y = 2 | X = 1) (v) P(X + Y < 1)
(vi) F(0, 0) (vii) F(0, 2).
17. Let (X, Y) be a bivariate r.v. with the following joint p.m.f.

  (X, Y)  | (0, -1)  (0, 1)  (1, -1)  (1, 1)
  P(X, Y) |  2/25     3/25    8/25    12/25

(i) Compute the marginal p.m.f.s of X and Y.
(ii) Are X and Y independent?
(iii) Compute P(X + Y ≤ 1).
(iv) Find the conditional probability distribution of Y given X = 0.
18. The joint p.m.f. of (X, Y) is
P(x, y) = 3y;  x = 0, 1, 2;  y = 1, 2, 3
        = 0  otherwise.
Find (i) Marginal p.m.f. of X. (ii) Conditional p.m.f. of Y given X = 1. (April 2012)
19. Let the joint p.m.f. of (X1, X2) be
P(x1, x2) = k(2x1 + 5x2);  k > 0;  x1 = 1, 2;  x2 = 1, 2
          = 0  otherwise.
Determine (i) k, (ii) the marginal p.m.f.s of X1 and X2, (iii) the conditional distribution of X1 given X2 = 2.
20. The joint p.m.f. of (X, Y) is given below.

  X \ Y |  -1     0     1     2
   -2   |  1/9   1/27  1/27  1/9
    1   |  2/9   1/9   1/9    0
    3   |  1/9   4/27   0     0

Compute (i) P(X > 0), (ii) P(X < 0, Y > 0), (iii) P(X is odd), (iv) P(Y is odd and positive), (v) The conditional distribution of X when Y = 1.

21. Following are the marginal p.m.f.s of X and Y.

  X    |  1    2    3
  P(x) | 0.3  0.3  0.4

  Y    |  1    2    3
  P(y) | 0.6  0.3  0.1

Assuming X and Y to be independent, obtain the joint probability distribution of X and Y.
22. X and Y are independent r.v.s with a joint p.m.f. partly shown in the following table. Fill the gaps.

  X \ Y |  0    1   | Total
    0   | 1/6   ·   |   ·
    1   |  ·    ·   |   ·
  Total |  ·    ·   |   1
Exercise 8 (C)
23. Let the joint p.m.f. of two discrete r.v.s X and Y be
P(x, y) = [n!/(x! y! (n - x - y)!)] p1^x p2^y p3^(n - x - y),
where x, y and n are non-negative integers with 0 ≤ x ≤ n, 0 ≤ y ≤ n, 0 ≤ (x + y) ≤ n, and 0 < p1, p2, p3 < 1 with p1 + p2 + p3 = 1.
Find: (i) marginal p.m.f.s of X and Y.
(ii) conditional p.m.f. of X given Y = y.
(iii) conditional p.m.f. of Y given X = x.
24. Following is the joint p.m.f. of (X, Y).

  X \ Y |   0     1     2     3
    0   |  0.10  0.05   0    0.20
    1   |  0.15  0.25  0.05  0.05
    2   |  0.05  0.10   0     0

Find (i) the probability distribution of Z = X + Y, (ii) P(Z = 2), (iii) P(Z = 1 | X = ), (iv) P(X ≤ Y).
25. Let (X, Y) be a bivariate discrete r.v. with joint p.m.f.
P(x, y) = ...;  x = 0, ..., 10;  y = 0, ..., 4;  x + y ≤ 4.
Find the marginal p.m.f.s of X and Y.
