
CHAPTER 4

Random variables and expectation

4.1 Random variables


When a random experiment is performed, we are often not interested in all of
the details of the experimental result but only in the value of some numerical
quantity determined by the result. For instance, in tossing dice we are often
interested in the sum of the two dice and are not really concerned about the
values of the individual dice. That is, we may be interested in knowing that the
sum is 7 and not be concerned over whether the actual outcome was (1, 6)
or (2, 5) or (3, 4) or (4, 3) or (5, 2) or (6, 1). Also, a civil engineer may not
be directly concerned with the daily risings and declines of the water level of
a reservoir (which we can take as the experimental result) but may only care
about the level at the end of a rainy season. These quantities of interest that are
determined by the result of the experiment are known as random variables.
Since the value of a random variable is determined by the outcome of the experiment, we may assign probabilities to its possible values.

Example 4.1.a. Letting X denote the random variable that is defined as the
sum of two fair dice, then

P {X = 2}  = P {(1, 1)} = 1/36                                          (4.1.1)
P {X = 3}  = P {(1, 2), (2, 1)} = 2/36
P {X = 4}  = P {(1, 3), (2, 2), (3, 1)} = 3/36
P {X = 5}  = P {(1, 4), (2, 3), (3, 2), (4, 1)} = 4/36
P {X = 6}  = P {(1, 5), (2, 4), (3, 3), (4, 2), (5, 1)} = 5/36
P {X = 7}  = P {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)} = 6/36
P {X = 8}  = P {(2, 6), (3, 5), (4, 4), (5, 3), (6, 2)} = 5/36
P {X = 9}  = P {(3, 6), (4, 5), (5, 4), (6, 3)} = 4/36
P {X = 10} = P {(4, 6), (5, 5), (6, 4)} = 3/36
P {X = 11} = P {(5, 6), (6, 5)} = 2/36
P {X = 12} = P {(6, 6)} = 1/36

Introduction to Probability and Statistics for Engineers and Scientists. https://doi.org/10.1016/B978-0-12-824346-6.00013-2


Copyright © 2021 Elsevier Inc. All rights reserved.

In other words, the random variable X can take on any integral value between
2 and 12 and the probability that it takes on each value is given by Equa-
tion (4.1.1). Since X must take on some value, we must have
1 = P(S) = P(⋃_{i=2}^{12} {X = i}) = Σ_{i=2}^{12} P {X = i}

which is easily verified from Equation (4.1.1).


Another random variable of possible interest in this experiment is the value of
the first die. Letting Y denote this random variable, then Y is equally likely to
take on any of the values 1 through 6. That is,

P {Y = i} = 1/6,   i = 1, 2, 3, 4, 5, 6
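The probabilities in Equation (4.1.1), and the uniform distribution of Y, can be verified by direct enumeration of the 36 equally likely outcomes. A minimal Python sketch (the names p_sum and p_first are ours, not from the text):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes (first die, second die)
outcomes = list(product(range(1, 7), repeat=2))

# P{X = s}, where X is the sum of the two dice
p_sum = {s: Fraction(sum(1 for a, b in outcomes if a + b == s), 36)
         for s in range(2, 13)}
print(p_sum[7])             # 1/6, i.e., 6/36

# Since X must take on some value, the probabilities sum to 1
print(sum(p_sum.values()))  # 1

# P{Y = i}, where Y is the value of the first die
p_first = {i: Fraction(sum(1 for a, b in outcomes if a == i), 36)
           for i in range(1, 7)}
print(p_first[3])           # 1/6
```

Using exact fractions rather than floats keeps the check against Equation (4.1.1) exact.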

Example 4.1.b. Suppose that an individual purchases two electronic components, each of which may be either defective or acceptable. In addition, suppose
that the four possible results — (d, d), (d, a), (a, d), (a, a) — have respective
probabilities .09, .21, .21, .49 [where (d, d) means that both components are
defective, (d, a) that the first component is defective and the second acceptable,
and so on]. If we let X denote the number of acceptable components obtained
in the purchase, then X is a random variable taking on one of the values 0, 1,
2 with respective probabilities

P {X = 0} = .09
P {X = 1} = .42
P {X = 2} = .49

If we were mainly concerned with whether there was at least one acceptable
component, we could define the random variable I by

I = 1 if X = 1 or 2
I = 0 if X = 0

If A denotes the event that at least one acceptable component is obtained, then
the random variable I is called the indicator random variable for the event A,
since I will equal 1 or 0 depending upon whether A occurs. The probabilities
attached to the possible values of I are

P {I = 1} = .91
P {I = 0} = .09
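Example 4.1.b can be sketched in a few lines of Python; the dictionary names probs, p_X, and p_I are ours, chosen for illustration:

```python
# The four possible outcomes (first, second), with d = defective,
# a = acceptable, and the probabilities given in the example
probs = {('d', 'd'): .09, ('d', 'a'): .21, ('a', 'd'): .21, ('a', 'a'): .49}

# X = number of acceptable components obtained in the purchase
p_X = {x: sum(p for outcome, p in probs.items() if outcome.count('a') == x)
       for x in (0, 1, 2)}
print(round(p_X[1], 2))  # 0.42

# I = indicator random variable for A = {at least one acceptable component}
p_I = {1: p_X[1] + p_X[2], 0: p_X[0]}
print(round(p_I[1], 2))  # 0.91
```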

In the two foregoing examples, the random variables of interest took on a finite
number of possible values. Random variables whose set of possible values can
be written either as a finite sequence x1, . . . , xn, or as an infinite sequence x1, . . .
are said to be discrete. For instance, a random variable whose set of possible
values is the set of nonnegative integers is a discrete random variable. However,
there also exist random variables that take on a continuum of possible values.
These are known as continuous random variables. One example is the random
variable denoting the lifetime of a car, when the car’s lifetime is assumed to
take on any value in some interval (a, b).
The cumulative distribution function, or more simply the distribution function, F
of the random variable X is defined for any real number x by

F (x) = P {X ≤ x}

That is, F (x) is the probability that the random variable X takes on a value that
is less than or equal to x.
Notation: We will use the notation X ∼ F to signify that F is the distribution
function of X.
All probability questions about X can be answered in terms of its distribution
function F . For example, suppose we wanted to compute P {a < X ≤ b}. This
can be accomplished by first noting that the event {X ≤ b} can be expressed
as the union of the two mutually exclusive events {X ≤ a} and {a < X ≤ b}.
Therefore, applying Axiom 3, we obtain that

P {X ≤ b} = P {X ≤ a} + P {a < X ≤ b}

or

P {a < X ≤ b} = F (b) − F (a)

Example 4.1.c. Suppose the random variable X has distribution function

F(x) = 0,              x ≤ 0
F(x) = 1 − exp{−x²},   x > 0

What is the probability that X exceeds 1?

Solution. The desired probability is computed as follows:

P {X > 1} = 1 − P {X ≤ 1}
          = 1 − F(1)
          = e^{−1}
          = .368
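The computation in Example 4.1.c, together with the earlier identity P {a < X ≤ b} = F(b) − F(a), can be checked numerically. A short sketch (the function name F and the variable names are ours):

```python
import math

def F(x):
    # Distribution function from Example 4.1.c
    return 0.0 if x <= 0 else 1.0 - math.exp(-x**2)

# P{X > 1} = 1 - F(1) = e^{-1}
p_exceeds_1 = 1 - F(1)
print(round(p_exceeds_1, 3))  # 0.368

# P{a < X <= b} = F(b) - F(a); here a = 1, b = 2
p_between = F(2) - F(1)
```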

[FIGURE 4.1: Graph of p(x), Example 4.2.a.]

4.2 Types of random variables


As was previously mentioned, a random variable whose set of possible values
is a sequence is said to be discrete. For a discrete random variable X, we define
the probability mass function p(a) of X by

p(a) = P {X = a}

The probability mass function p(a) is positive for at most a countable number
of values of a. That is, if X must assume one of the values x1, x2, . . ., then

p(xi) > 0,   i = 1, 2, . . .
p(x) = 0,    all other values of x

Since X must take on one of the values xi, we have

Σ_{i=1}^{∞} p(xi) = 1

Example 4.2.a. Consider a random variable X that is equal to 1, 2, or 3. If we know that

p(1) = 1/2 and p(2) = 1/3

then it follows (since p(1) + p(2) + p(3) = 1) that

p(3) = 1/6

A graph of p(x) is presented in Figure 4.1.

The cumulative distribution function F can be expressed in terms of p(x) by

F(a) = Σ_{x ≤ a} p(x)
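This relation between the mass function and the distribution function can be illustrated with the pmf of Example 4.2.a; the names p and F below are ours:

```python
from fractions import Fraction

# pmf of Example 4.2.a: p(1) = 1/2, p(2) = 1/3, p(3) = 1/6
p = {1: Fraction(1, 2), 2: Fraction(1, 3), 3: Fraction(1, 6)}

def F(a):
    # F(a) = sum of p(x) over all x <= a
    return sum(prob for x, prob in p.items() if x <= a)

print(F(1))  # 1/2
print(F(2))  # 5/6
print(F(3))  # 1
```

Note that F is a step function: it jumps by p(xi) at each possible value xi and is flat in between, so for instance F(2.5) = F(2).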
