

Chapter 2

2.1 An outcome is the chosen pair of chips. The sample space in this problem consists of
15 pairs: AB, AC, AD, AE, AF, BC, BD, BE, BF, CD, CE, CF, DE, DF, EF (or 30
pairs if the order of chips in each pair matters, i.e., AB and BA are different pairs).
All the outcomes are equally likely because two chips are chosen at random.
One outcome is ‘favorable’, when both chips in a pair are defective (two such pairs if
the order matters).
Thus,

P {both chips are defective} = (number of favorable outcomes) / (total number of outcomes) = 1/15
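This count is easy to verify by brute-force enumeration. A minimal Python sketch, assuming the six chips are labeled A–F and that A and B are the two defective ones (the labels are illustrative, not from the problem statement):

```python
from fractions import Fraction
from itertools import combinations

# Hypothetical labels: six chips A-F, with A and B assumed defective.
chips = "ABCDEF"
defective = {"A", "B"}

pairs = list(combinations(chips, 2))                # all 15 unordered pairs
favorable = [p for p in pairs if set(p) <= defective]

print(len(pairs), Fraction(len(favorable), len(pairs)))  # 15 1/15
```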

2.2 Denote the events:

M = { problems with a motherboard }


H = { problems with a hard drive }

We have:
P {M } = 0.4, P {H} = 0.3, and P {M ∩ H} = 0.15.
Hence,

P {M ∪ H} = P {M } + P {H} − P {M ∩ H} = 0.4 + 0.3 − 0.15 = 0.55,

and
P {fully functioning MB and HD} = 1 − P {M ∪ H} = 0.45

2.3 Denote the events,

I = {the virus enters through the internet}


E = {the virus enters through the e-mail}

Then
P {Ē ∩ Ī} = 1 − P {E ∪ I} = 1 − (P {E} + P {I} − P {E ∩ I})
= 1 − (0.3 + 0.4 − 0.15) = 0.45

It may help to draw a Venn diagram.

2.4 Denote the events,

C = { knows C/C++ } , F = { knows Fortran } .

Then

(a) P {F̄} = 1 − P {F } = 1 − 0.6 = 0.4

(b) P {F̄ ∩ C̄} = 1 − P {F ∪ C} = 1 − (P {F } + P {C} − P {F ∩ C})
= 1 − (0.7 + 0.6 − 0.5) = 1 − 0.8 = 0.2

(c) P {C\F } = P {C} − P {F ∩ C} = 0.7 − 0.5 = 0.2

(d) P {F \C} = P {F } − P {F ∩ C} = 0.6 − 0.5 = 0.1


(e) P {C | F } = P {C ∩ F } / P {F } = 0.5/0.6 = 0.8333

(f) P {F | C} = P {C ∩ F } / P {C} = 0.5/0.7 = 0.7143

2.5 Denote the events:

D1 = {first test discovers the error}


D2 = {second test discovers the error}
D3 = {third test discovers the error}

Then

P { at least one discovers } = P {D1 ∪ D2 ∪ D3 }
= 1 − P {D̄1 ∩ D̄2 ∩ D̄3 }
= 1 − (1 − 0.2)(1 − 0.3)(1 − 0.5) = 1 − 0.28 = 0.72

We used the complement rule and independence.

2.6 Let A = {arrive on time}, W = {good weather}. We have



P {A | W } = 0.8, P {A | W̄} = 0.3, P {W } = 0.6

By the Law of Total Probability,

P {A} = P {A | W } P {W } + P {A | W̄} P {W̄}
= (0.8)(0.6) + (0.3)(0.4) = 0.60

2.7 Organize the data. Let D = {detected}, I = {via internet}, E = {via e-mail} = Ī.
Notice that the question about detection already assumes that the spyware has entered
the system. This is the sample space, and this is why P {I} + P {E} = 1. We have

P {I} = 0.7, P {E} = 0.3, P {D | I} = 0.6, P {D | E} = 0.8.

By the Law of Total Probability,

P {D} = (0.6)(0.7) + (0.8)(0.3) = 0.66

2.8 Let A1 = {1st device fails}, A2 = {2nd device fails}, A3 = {3rd device fails}.

P { on time } = P { all function }
= P {Ā1 ∩ Ā2 ∩ Ā3 }
= P {Ā1 } P {Ā2 } P {Ā3 }    (independence)
= (1 − 0.01)(1 − 0.02)(1 − 0.02)    (complement rule)
= 0.9508

2.9 P {at least one fails} = 1 − P {all work} = 1 − (.96)(.95)(.90) = 0.1792 .


   
2.10 P {A ∪ B ∪ C} = 1 − P {Ā ∩ B̄ ∩ C̄} = 1 − P {Ā} P {B̄} P {C̄}
= 1 − (1 − 0.4)(1 − 0.5)(1 − 0.2) = 0.76

2.11 (a) P {at least one test finds the error}


= 1 − P {all tests fail to find the error}
= 1 − (1 − 0.1)(1 − 0.2)(1 − 0.3)(1 − 0.4)(1 − 0.5)
= 1 − (0.9)(0.8)(0.7)(0.6)(0.5) = 0.8488
(b) The difference between the events in (a) and (b) is the event that exactly one
test finds the error. Its probability equals
P {exactly one test finds the error}
= P {test 1 finds the error, the others don't}
+ P {test 2 finds the error, the others don't} + . . .
= (0.1)(1 − 0.2)(1 − 0.3)(1 − 0.4)(1 − 0.5)
+ (1 − 0.1)(0.2)(1 − 0.3)(1 − 0.4)(1 − 0.5) + . . .
= (0.1)(0.8)(0.7)(0.6)(0.5) + (0.9)(0.2)(0.7)(0.6)(0.5)
+ (0.9)(0.8)(0.3)(0.6)(0.5) + (0.9)(0.8)(0.7)(0.4)(0.5)
+ (0.9)(0.8)(0.7)(0.6)(0.5) = 0.3714.
Then
P {at least two tests find the error}
= P {at least one test finds the error}
−P {exactly one test finds the error}
= 0.8488 − 0.3714 = 0.4774
(c) P {all tests find the error} = (0.1)(0.2)(0.3)(0.4)(0.5) = 0.0012
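The three answers can be cross-checked by enumerating all 2^5 detect/miss patterns of the five independent tests. A Python sketch (the helper name `prob_at_least` is mine):

```python
from itertools import product

# Detection probabilities of the five independent tests (Exercise 2.11).
p = [0.1, 0.2, 0.3, 0.4, 0.5]

def prob_at_least(k):
    """P{at least k tests find the error}, by enumerating all 2^5 outcomes."""
    total = 0.0
    for outcome in product([0, 1], repeat=len(p)):   # 1 = the test finds the error
        if sum(outcome) >= k:
            w = 1.0
            for found, pi in zip(outcome, p):
                w *= pi if found else 1 - pi
            total += w
    return total

print(round(prob_at_least(1), 4))  # 0.8488
print(round(prob_at_least(2), 4))  # 0.4774
print(round(prob_at_least(5), 4))  # 0.0012
```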

2.12 Let Aj = { dog j detects the explosives }.


P {at least one dog detects} = 1 − P {all four dogs fail to detect}
= 1 − P {Ā1 } P {Ā2 } P {Ā3 } P {Ā4 }
= 1 − (1 − 0.6)^4 = 0.9744

2.13 Let Aj be the event {Team j detects a problem}. Then


P {at least one team detects} = 1 − P {no team detects}
= 1 − P {Ā1 ∩ Ā2 ∩ Ā3 } = 1 − P {Ā1 } P {Ā2 } P {Ā3 }
= 1 − (1 − 0.8)(1 − 0.8)(1 − 0.8) = 0.992 .

2.14 (a) The total number of possible passwords is

P (26, 6) = (26)(25)(24)(23)(22)(21) = 165, 765, 600

because there are 26 letters in the alphabet, they should all be different in the
password, and the order of characters is important. The password is guessed
(a favorable outcome) if it is among the 1,000,000 attempted passwords. Then

P {guess the password} = (number of favorable passwords) / (total number of passwords)
= 1,000,000 / 165,765,600 = 0.0060
(b) Now we can use 52 characters, and the order is still important. Then the total
number of passwords is
P (52, 6) = (52)(51)(50)(49)(48)(47) = 14, 658, 134, 400,
and
P {guess the password} = 1,000,000 / 14,658,134,400 = 0.000068
(c) Letters can be repeated in passwords; therefore, the total number of passwords
is
Pr (52, 6) = 52^6 ,
and
P {guess the password} = 10^6 / 52^6 = 0.000051
(d) Adding the digits brings the number of possible characters to 62. Then the total
number of passwords is
Pr (62, 6) = 62^6 ,
and
P {guess the password} = 10^6 / 62^6 = 0.000018
The more characters we use, the lower the probability that spyware breaks
into the system.
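The four counts and probabilities can be reproduced with the standard library's `math.perm`. A sketch (the variable names are mine):

```python
from math import perm

attempts = 10**6          # number of passwords the spyware can try

n_a = perm(26, 6)         # (a) 6 distinct lowercase letters, order matters
n_b = perm(52, 6)         # (b) distinct case-sensitive letters
n_c = 52**6               # (c) case-sensitive letters, repetitions allowed
n_d = 62**6               # (d) letters and digits, repetitions allowed

print(n_a, n_b)           # 165765600 14658134400
for n in (n_a, n_b, n_c, n_d):
    print(round(attempts / n, 6))
```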

2.15 Let A = {Error in the 1st block} and B = {Error in the 2nd block}. Then P {A} =
0.2, P {B} = 0.3, and P {A ∩ B} = 0.06 by independence;
P { error in program } = P {A ∪ B} = 0.2 + 0.3 − 0.06 = 0.44.
Then, by the definition of conditional probability,

P {A ∩ B | A ∪ B} = P {A ∩ B} / P {A ∪ B} = 0.06/0.44 = 0.1364

Or, by the Bayes Rule,

P {A ∩ B | A ∪ B} = P {A ∪ B | A ∩ B} P {A ∩ B} / P {A ∪ B} = (1)(0.06)/0.44 = 0.1364

2.16 Organize the data. Let D = {defective part}. We are given:


P {S1} = 0.5 P {D|S1} = 0.05
P {S2} = 0.2 P {D|S2} = 0.03
P {S3} = 0.3 P {D|S3} = 0.06
We need to find P {S1|D}.

(a) By the Law of Total Probability:

P {D} = P {D|S1} P {S1} + P {D|S2} P {S2} + P {D|S3} P {S3}
= (0.5)(0.05) + (0.2)(0.03) + (0.3)(0.06) = 0.049

(b) Bayes Rule:

P {S1|D} = P {D|S1} P {S1} / P {D} = (0.5)(0.05) / 0.049 = 25/49 or 0.5102

2.17 Let D = {defective part}. We are given:

P {X} = 0.24 P {D|X} = 0.05


P {Y } = 0.36 P {D|Y } = 0.10
P {Z} = 0.40 P {D|Z} = 0.06

Combine the Bayes Rule and the Law of Total Probability:

P {Z | D} = P {D|Z} P {Z} / (P {D|X} P {X} + P {D|Y } P {Y } + P {D|Z} P {Z})
= (0.06)(0.40) / ((0.05)(0.24) + (0.10)(0.36) + (0.06)(0.40))
= 1/3 or 0.3333
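Computations of this kind (Law of Total Probability in the denominator, Bayes' Rule for the posterior) follow one pattern, which can be captured in a small helper. A sketch using the numbers of Exercise 2.17 (the function name `posterior` is mine):

```python
def posterior(priors, likelihoods):
    """Bayes' Rule: posterior probabilities of the sources given the observed event.
    priors[i] = P{source i}, likelihoods[i] = P{event | source i}."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)                    # Law of Total Probability
    return [j / total for j in joint]

# Exercise 2.17: suppliers X, Y, Z and the event D = {defective part}
post = posterior([0.24, 0.36, 0.40], [0.05, 0.10, 0.06])
print([round(p, 4) for p in post])  # [0.1667, 0.5, 0.3333]
```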

2.18 Let C = { correct }, G = { guessing }. It is given that:

P {Ḡ} = 0.75, P {C | Ḡ} = 0.9, P {C | G} = 1/4 = 0.25.

Also, P {G} = 1 − 0.75 = 0.25.

Then, by the Bayes Rule,

P {G | C} = P {C | G} P {G} / (P {C | G} P {G} + P {C | Ḡ} P {Ḡ})
= (0.25)(0.25) / ((0.25)(0.25) + (0.9)(0.75)) = 0.0847

2.19 Let D = {defective part} and I = {inspected electronically}. By the Bayes Rule,

P {I|D} = P {D|I} P {I} / (P {D|I} P {I} + P {D|Ī} P {Ī})
= (1 − 0.95)(0.20) / ((1 − 0.95)(0.20) + (1 − 0.7)(1 − 0.20)) = 0.0400

2.20 Let S = {steroid user} and N = {test is negative}.

It is given that P {S} = 0.05, P {N̄|S} = 0.9, and P {N̄|S̄} = 0.02.
By the complement rule, P {S̄} = 0.95, P {N |S} = 0.1, and P {N |S̄} = 0.98.

By the Bayes Rule,

P {S|N } = P {N |S} P {S} / (P {N |S} P {S} + P {N |S̄} P {S̄})
= (0.1)(0.05) / ((0.1)(0.05) + (0.98)(0.95)) = 5/936 or 0.00534

2.21 At least one of the first three components works with probability

1 − P {all three fail} = 1 − (0.3)3 = 0.973.

At least one of the last two components works with probability

1 − P {both fail} = 1 − (0.3)2 = 0.91.

Hence, the system operates with probability (0.973)(0.91) = 0.8854
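The parallel/series reasoning used here can be sketched as two tiny helpers; the reliability of any series-parallel system is then a composition of the two (the function names are mine):

```python
def parallel(*p_work):
    """P{at least one of several independent components works}."""
    q = 1.0
    for p in p_work:
        q *= 1 - p          # probability that all of them fail
    return 1 - q

def series(*p_work):
    """P{all independent components work}."""
    prod = 1.0
    for p in p_work:
        prod *= p
    return prod

# Exercise 2.21: every component works with probability 1 - 0.3 = 0.7.
block1 = parallel(0.7, 0.7, 0.7)         # first three components: 0.973
block2 = parallel(0.7, 0.7)              # last two components:    0.91
print(round(series(block1, block2), 4))  # 0.8854
```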

2.22 (a) The scheme of cities A, B, and C and all five highways is similar to Exercise 2.21.
Similarly to that exercise, there exists an open route from city A to city C with
probability
[1 − (0.2)^3][1 − (0.2)^2] = 0.9523
(b-α) If the new highway is built between cities A and B, it will be the 4th highway
connecting A and B. Then the probability of an open route from city A to city
C becomes
[1 − (0.2)^4][1 − (0.2)^2] = 0.9585
(b-β) If the new highway is built between B and C, it will be the 3rd highway connecting
these cities. Then the probability of an open route from city A to city C is
[1 − (0.2)^3][1 − (0.2)^3] = 0.9841

(b-γ) Finally, if the new highway is built between A and C, then

P { at least one open route from A to C }
= P { a new direct route from A to C is open ∪ a route from A to B to C is open, see question (a) }
= 1 − (1 − 0.2)(1 − 0.9523) = 0.9618

2.23 (a) (0.9)(0.8) = 0.72


(b) 1 − {1 − (0.9)(0.8)} {1 − (0.7)(0.6)} = 0.8376
(c) 1 − (1 − 0.9)(1 − 0.8)(1 − 0.7) = 0.994
(d) {1 − (1 − 0.9)(1 − 0.7)} {1 − (1 − 0.8)(1 − 0.6)} = 0.8924
(e) {1 − (1 − 0.9)(1 − 0.6)} {1 − (1 − 0.8)(1 − 0.7)(1 − 0.5)} = 0.9312

2.24 A customer is unaware of defects, so he buys 6 random laptops (here X is the number
of defective laptops among the six purchased). The outcomes are equally likely, so
each probability can be computed as

(number of favorable outcomes) / (total number of outcomes)

(a) P {exactly 2} = C(5, 2) C(5, 4) / C(10, 6) = (10)(5) / 210 = 5/21 or 0.238

(b) This is a conditional probability because {X ≥ 2} is given. We need

P {X = 2 | X ≥ 2} = P {X = 2 ∩ X ≥ 2} / P {X ≥ 2} = P {X = 2} / P {X ≥ 2}

where P {X = 2} = 5/21 is already computed in (a), and

P {X ≥ 2} = 1 − P {X = 1} = 1 − C(5, 1) C(5, 5) / C(10, 6) = 1 − 5/210 = 41/42

Notice that P {X = 0} = 0 because there are only 5 good computers, so among
the purchased 6 computers there has to be at least 1 defective. So,

P {X = 2 | X ≥ 2} = P {X = 2} / P {X ≥ 2} = (5/21) / (41/42) = 10/41 or 0.244 .

2.25 Our sample space consists of birthdays of all N = 30 students. The total number of
outcomes in it is
NT = Pr (365, N ) = 365^N .
It is easier to count the outcomes where all students are born on different days. The
number of such outcomes is
NF = P (365, N ) = 365! / (365 − N )! = (365)(364) . . . (366 − N ).
Then
P (N ) = P {at least two share birthdays}
= 1 − P {all born on different days}
= 1 − NF /NT = 1 − (365/365)(364/365) . . . ((366 − N )/365).
(a) For N = 30, we get
P (30) = 1 − 0.2937 = 0.7063
(b) Evaluating P (N ) for different N , we see that P (22) = 0.4757 and P (23) = 0.5073.
Hence, we need at least 23 students in order to find birthday matches with a
probability above 0.5.
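Both answers can be reproduced numerically. A Python sketch of P(N) and the search for the smallest class size with a match probability above 0.5 (the function name is mine):

```python
def p_match(n):
    """P{at least two of n people share a birthday} (365 equally likely days)."""
    p_all_diff = 1.0
    for k in range(n):
        p_all_diff *= (365 - k) / 365
    return 1 - p_all_diff

print(round(p_match(30), 4))   # 0.7063

# smallest class size with a match probability above 0.5
n = 1
while p_match(n) <= 0.5:
    n += 1
print(n)                       # 23
```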

2.26 The sample space consists of all (unordered) sets of three computers selected from
six computers in the lab. Favorable outcomes are sets of three computers with non-defective
hard drives. We have

NT = C(6, 3) = (6)(5)(4) / ((3)(2)(1)) = 20 and NF = C(4, 3) = 4;

therefore,

P {no hard drive problems} = NF /NT = 4/20 = 0.2

2.27 The sample space consists of all unordered sets of five computers selected from 18
computers in the store. Favorable outcomes are sets of five non-defective computers
(that come from a subset of 18 − 6 = 12). Then

NT = C(18, 5) = (18)(17)(16)(15)(14) / ((5)(4)(3)(2)(1)) and NF = C(12, 5) = (12)(11)(10)(9)(8) / ((5)(4)(3)(2)(1));

therefore,

P {five computers without defects} = NF /NT = (12)(11)(10)(9)(8) / ((18)(17)(16)(15)(14)) = 11/119 or 0.0924

2.28 The sample space consists of sequences of 6 answers where each answer is one of 4
possible answers, say, A, B, C, or D. Then a sequence of 6 answers is a 6-letter word
written with letters A, B, C, and D with replacement. The student guesses, therefore,
all outcomes are equally likely.
The total number of outcomes is

NT = Pr (4, 6) = 4^6 = 4096.

Favorable outcomes occur when the student guesses at least 3 answers correctly. This
includes 3, 4, 5, and 6 correct answers. The correctly answered questions are chosen
at random from 6 questions. Then, a correct answer is given to each of the chosen
questions. Also, an incorrect answer to each remaining question is chosen out of 3
possible incorrect answers. Altogether, the number of favorable outcomes is

NF = C(6, 3)(3^3 ) + C(6, 4)(3^2 ) + C(6, 5)(3^1 ) + C(6, 6)(3^0 )
= (20)(27) + (15)(9) + (6)(3) + 1 = 694.

Hence,

P {he will pass} = NF /NT = 694/4096 = 0.1694
One can also use the complement rule for a little shorter solution.
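The same answer follows from the binomial distribution with n = 6 trials and success probability p = 1/4, which also confirms the count of favorable answer sheets. A sketch:

```python
from math import comb

# Exercise 2.28: 6 questions, 4 choices each; passing means at least 3 correct guesses.
n, p = 6, 0.25
p_pass = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(3, n + 1))
print(round(p_pass, 4))  # 0.1694

# Equivalent count of favorable answer sheets out of 4^6 equally likely ones:
favorable = sum(comb(n, k) * 3**(n - k) for k in range(3, n + 1))
print(favorable, 4**6)   # 694 4096
```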

2.29 Outcomes are sets of four databases selected from nine. Favorable outcomes are such
sets where at least 2 databases have a keyword, out of 5 such databases (and the
remaining ones don’t have a keyword, so they come from the remaining 4 databases).
Then
NT = C(9, 4) = (9)(8)(7)(6) / ((4)(3)(2)(1)) = 126,

NF = C(5, 2)C(4, 2) + C(5, 3)C(4, 1) + C(5, 4)C(4, 0)
= (10)(6) + (10)(4) + (5)(1) = 105,

and

P {at least two have the keyword} = NF /NT = 105/126 = 5/6 or 0.8333
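This is a hypergeometric-type count, easy to confirm with `math.comb`:

```python
from fractions import Fraction
from math import comb

# Exercise 2.29: choose 4 of 9 databases at random; 5 of the 9 contain the keyword.
total = comb(9, 4)
favorable = sum(comb(5, k) * comb(4, 4 - k) for k in range(2, 5))
print(total, favorable, Fraction(favorable, total))  # 126 105 5/6
```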

2.30 (a) All outcomes are listed in the table below. According to the problem, they are
equally likely.
Outcome The older child The younger child Who is met
1 girl girl the older girl
2 girl girl the younger girl
3 girl boy the girl
4 girl boy the boy
5 boy girl the girl
6 boy girl the boy
7 boy boy the older boy
8 boy boy the younger boy

(b) P {BB} = P {outcomes 7, 8} = 1/4,


P {BG} = P {outcomes 5, 6} = 1/4,
P {GB} = P {outcomes 3, 4} = 1/4.
(c) Meeting Jimmy automatically eliminates outcomes 1, 2, 3, and 5. The remaining
outcomes are
Outcome The older child The younger child Who is met
4 girl boy the boy
6 boy girl the boy
7 boy boy the older boy
8 boy boy the younger boy

Two remaining outcomes form the event BB whereas BG and GB have only one
outcome each. Therefore, given that you met a boy,

P {BB | met Jimmy} = P {outcomes 7, 8 | met Jimmy } = 1/2,


P {BG | met Jimmy} = P {outcome 6 | met Jimmy } = 1/4,
P {GB | met Jimmy} = P {outcome 4 | met Jimmy } = 1/4.

(d) P {Jimmy has a sister | met Jimmy}


= P {outcomes 4, 6 | met Jimmy }
= 1/2.
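Parts (c) and (d) can be verified by enumerating the eight equally likely outcomes. A Python sketch (the encoding of an outcome as (older, younger, index of the child met) is mine):

```python
from fractions import Fraction

# Exercise 2.30: eight equally likely outcomes (older child, younger child, who is met).
outcomes = [(o, y, met)
            for o in "GB" for y in "GB"
            for met in (0, 1)]          # met = index of the child you run into

def P(event):
    """Probability of an event over the eight equally likely outcomes."""
    hits = [w for w in outcomes if event(w)]
    return Fraction(len(hits), len(outcomes))

met_boy = lambda w: w[w[2]] == "B"                    # you met a boy (Jimmy)
both_boys = lambda w: w[0] == "B" and w[1] == "B"
has_sister = lambda w: "G" in (w[0], w[1])

# conditional probabilities given that you met a boy
print(P(lambda w: both_boys(w) and met_boy(w)) / P(met_boy))   # 1/2
print(P(lambda w: has_sister(w) and met_boy(w)) / P(met_boy))  # 1/2
```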

2.31 According to (2.2),

Ā ∪ B̄ ∪ C̄ ∪ . . . is the complement of A ∩ B ∩ C ∩ . . . .

Then events A, B, C, . . . are disjoint (i.e., A ∩ B ∩ C ∩ . . . = ∅) if and only if

Ā ∪ B̄ ∪ C̄ ∪ . . . = ∅̄ = Ω.

We see that the union of Ā, B̄, C̄, . . . equals the entire sample space in this case. By
Definition 2.8, Ā, B̄, C̄, . . . are exhaustive.

2.32 Intuitive solutions:

(a) Independent events A and B occur independently of each other. Hence, their
non-occurrences are also independent of each other: every time A (or B) does not
occur, its complement occurs. Hence, the complements of A and B are independent
of each other.

(b) Being disjoint is a very strong dependence because disjoint events completely
eliminate each other. The only way for such events to be independent is when
one of these events is always eliminated. Such an event must have probability 0.
(c) Being exhaustive is also a strong type of dependence because one event absolutely
has to cover all the parts of Ω that are not covered by the other event. The only
way for such events to be independent is when one of the events covers all the
parts of Ω regardless of the other event. Such an event must be the entire sample
space, Ω.

Mathematical solutions:

(a) By (2.2), Ā ∩ B̄ is the complement of A ∪ B, hence

P {Ā ∩ B̄} = 1 − P {A ∪ B}
= 1 − (P {A} + P {B} − P {A ∩ B})
= 1 − P {A} − P {B} + P {A} P {B}    (because A and B are independent)
= (1 − P {A}) (1 − P {B})
= P {Ā} P {B̄}.

Hence, Ā and B̄ are independent.


(b) If A and B are independent and disjoint, then

0 = P {A ∩ B} = P {A} P {B} ,

which can only happen when P {A} = 0 or P {B} = 0.


(c) If A and B are independent and exhaustive, then

1 = P {A ∪ B} = P {A} + P {B} − P {A ∩ B}
= P {A} + P {B} − P {A} P {B} .

Then

0 = 1 − (P {A} + P {B} − P {A} P {B}) = (1 − P {A}) (1 − P {B}) ,

which can only happen when P {A} = 1 or P {B} = 1.

2.33 Generalizing (2.4), we prove that for any events E1 , . . . , En ,

P {E1 ∪ . . . ∪ En }
= Σ_{i≤n} P {Ei } − Σ_{1≤i<j≤n} P {Ei ∩ Ej } + Σ_{1≤i<j<k≤n} P {Ei ∩ Ej ∩ Ek } − . . .
− (−1)^n P {E1 ∩ . . . ∩ En }.

This can be proved by induction.
For n = 2 events, this formula is given by (2.4).
Suppose the formula is true for n events. Let A denote their overall union, A =
E1 ∪ . . . ∪ En . Then for any event En+1 ,

P {E1 ∪ . . . ∪ En+1 } = P {A ∪ En+1 }
= P {A} + P {En+1 } − P {A ∩ En+1 }
= Σ_{i≤n+1} P {Ei } − Σ_{1≤i<j≤n} P {Ei ∩ Ej } + Σ_{1≤i<j<k≤n} P {Ei ∩ Ej ∩ Ek }
− . . . − (−1)^n P {E1 ∩ . . . ∩ En } − P {A ∩ En+1 }.

Also, since the formula is assumed true for n events,

P {A ∩ En+1 } = P {(E1 ∩ En+1 ) ∪ . . . ∪ (En ∩ En+1 )}
= Σ_{i≤n} P {Ei ∩ En+1 } − Σ_{1≤i<j≤n} P {Ei ∩ Ej ∩ En+1 } + . . .
− (−1)^n P {E1 ∩ . . . ∩ En+1 }.

Altogether,

P {E1 ∪ . . . ∪ En+1 }
= Σ_{i≤n+1} P {Ei } − Σ_{1≤i<j≤n+1} P {Ei ∩ Ej } + Σ_{1≤i<j<k≤n+1} P {Ei ∩ Ej ∩ Ek }
− . . . − (−1)^{n+1} P {E1 ∩ . . . ∩ En+1 }.

This proves the formula for (n + 1) events. By induction, the formula is proved for
any n ≥ 2.
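The inclusion-exclusion formula just proved can be checked numerically on a small finite sample space. A sketch (the function `union_prob` and the toy events are mine):

```python
from itertools import combinations

def union_prob(sets, omega_prob):
    """P{union of events} by inclusion-exclusion; each event is a set of outcomes
    over a finite sample space with probabilities omega_prob[outcome]."""
    n = len(sets)
    total = 0.0
    for r in range(1, n + 1):
        sign = (-1) ** (r + 1)              # +, -, +, ... for r = 1, 2, 3, ...
        for idx in combinations(range(n), r):
            inter = set.intersection(*(sets[i] for i in idx))
            total += sign * sum(omega_prob[w] for w in inter)
    return total

# toy check on a 4-point sample space with equal probabilities
omega = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
A, B, C = {0, 1}, {1, 2}, {2, 3}
print(union_prob([A, B, C], omega))  # 1.0 (A ∪ B ∪ C = Ω)
```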

2.34 Let Ai = Ēi for i = 1, . . . , n. According to (2.2),

Ā1 ∩ . . . ∩ Ān is the complement of A1 ∪ . . . ∪ An .

Since Āi = Ei , this says that E1 ∩ . . . ∩ En is the complement of A1 ∪ . . . ∪ An .
Therefore,

the complement of (E1 ∩ . . . ∩ En ) = A1 ∪ . . . ∪ An = Ē1 ∪ . . . ∪ Ēn

2.35 Events A\B and B are mutually exclusive, and their union is A. Therefore, P {A\B}+
P {B} = P {A}, and P {A\B} = P {A} − P {B}.

2.36 Consider the following events,

A1 = E1 , A2 = E2 \E1 , A3 = E3 \(E1 ∪ E2 ), A4 = E4 \(E1 ∪ E2 ∪ E3 ), . . . .

They are mutually exclusive, Ai ⊂ Ei for all i, and A1 ∪ A2 ∪ . . . = E1 ∪ E2 ∪ . . ..


Then,

P {E1 ∪ E2 ∪ . . .} = P {A1 ∪ A2 ∪ . . .} = Σ_i P {Ai } ≤ Σ_i P {Ei } .

FIGURE 1: The cdf of X for Exercise 3.1 — a step function F(x) with jumps at x = 0, 1, 2 reaching heights 0.42, 0.88, and 1.00.

Chapter 3

3.1 Possible values of X are: 0, 1, and 2.

(a) The pmf is:

P (0) = P {both files are not corrupted}


= (1 − 0.4)(1 − 0.3) = 0.42,
P (1) = P {1st file is corrupted, 2nd is not} + P {2nd file is corrupted, 1st is not}
= (0.4)(1 − 0.3) + (0.3)(1 − 0.4) = 0.46,
P (2) = P {both are corrupted} = (0.4)(0.3) = 0.12.

(check: P (0) + P (1) + P (2) = 1.)


(b) The cdf is given in Figure 1.

3.2 Let X be the number of network blackouts, and Y be the loss. Then Y = 500X.
Compute

E(X) = Σ_x x P (x) = (0)(0.7) + (1)(0.2) + (2)(0.1) = 0.4;

Var(X) = Σ_x (x − 0.4)^2 P (x)
= (0 − 0.4)^2 (0.7) + (1 − 0.4)^2 (0.2) + (2 − 0.4)^2 (0.1) = 0.44.

Hence,
E(Y ) = 500 E(X) = (500)(0.4) = 200 dollars
and

Var(Y ) = 500^2 Var(X) = (250, 000)(0.44) = 110,000 squared dollars
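The moments computed above follow directly from the pmf and the linear rules E(aX) = aE(X) and Var(aX) = a^2 Var(X). A sketch:

```python
# Exercise 3.2: pmf of the number of blackouts X, and the loss Y = 500 X.
pmf = {0: 0.7, 1: 0.2, 2: 0.1}

EX = sum(x * p for x, p in pmf.items())
VarX = sum((x - EX) ** 2 * p for x, p in pmf.items())
EY, VarY = 500 * EX, 500 ** 2 * VarX

print(round(EX, 4), round(VarX, 4))  # 0.4 0.44
print(round(EY, 1), round(VarY, 1))  # 200.0 110000.0
```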
