Probability and Stochastic Processes
A Friendly Introduction for Electrical and Computer Engineers
SECOND EDITION
Problem Solutions
September 28, 2005 Draft
Roy D. Yates, David J. Goodman, David Famolari
September 28, 2005
This solution manual remains under construction. The current count is that 678 (out of 687)
problems have solutions. The unsolved problems are
12.1.7, 12.1.8, 12.5.8, 12.5.9, 12.11.5 – 12.11.9.
If you volunteer a solution for one of those problems, we’ll be happy to include it . . . and, of
course, “your wildest dreams will come true.”
Of course, the correctness of every single solution remains unconfirmed. If you find errors or
have suggestions or comments, please send email: ryates@winlab.rutgers.edu.
If you need to make solution sets for your class, you might like the Solution Set Constructor
at the instructors site www.winlab.rutgers.edu/probsolns. If you need access, send email:
ryates@winlab.rutgers.edu.
Matlab functions written as solutions to homework problems can be found in the archive
matsoln.zip (available to instructors) or in the directory matsoln. Other Matlab functions
used in the text or in these homework solutions can be found in the archive matcode.zip
or directory matcode. The .m files in matcode are available for download from the Wiley
website. Two other documents of interest are also available for download: a manual
probmatlab.pdf describing the matcode .m functions, and the quiz solutions manual
quizsol.pdf.
A web-based solution set constructor for the second edition is available to instructors at
http://www.winlab.rutgers.edu/probsolns
The next update of this solution manual is likely to occur in January, 2006.


Problem Solutions – Chapter 1

Problem 1.1.1 Solution

Based on the Venn diagram (with regions M, O, and T), the answers are fairly straightforward:

(a) Since T ∩ M ≠ φ, T and M are not mutually exclusive.

(b) Every pizza is either Regular (R) or Tuscan (T). Hence R ∪ T = S so that R and T are collectively exhaustive. Thus it's also (trivially) true that R ∪ T ∪ M = S. That is, R, T and M are also collectively exhaustive.

(c) From the Venn diagram, T and O are mutually exclusive. In words, this means that Tuscan pizzas never have onions or pizzas with onions are never Tuscan. As an aside, “Tuscan” is a fake pizza designation; one shouldn’t conclude that people from Tuscany actually dislike onions.

(d) From the Venn diagram, M ∩ T and O are mutually exclusive. Thus Gerlanda’s doesn’t make Tuscan pizza with mushrooms and onions.

(e) Yes. In terms of the Venn diagram, these pizzas are in the set (T ∪ M ∪ O)c.

Problem 1.1.2 Solution

Based on the Venn diagram (with regions M, O, and T), the complete Gerlanda's pizza menu is

  • Regular without toppings
  • Regular with mushrooms
  • Regular with onions
  • Regular with mushrooms and onions
  • Tuscan without toppings
  • Tuscan with mushrooms

Problem 1.2.1 Solution

(a) An outcome specifies whether the fax is high (h), medium (m), or low (l) speed, and whether the fax has two (t) pages or four (f ) pages. The sample space is

S = {ht, hf, mt, mf, lt, lf }. (1)

Problem 1.2.4 Solution

The sample space, written in month/day form, is

S = {1/1, 1/2, ... , 12/31}. (1)

The event H defined by the event of a July birthday is described by the following 31 sample points:

H = {7/1, 7/2, ... , 7/31}. (2)

Problem 1.2.5 Solution

Of course, there are many answers to this problem. Here are four event spaces.

  1. We can divide students into engineers or non-engineers. Let A 1 equal the set of engineering students and A 2 the non-engineers. The pair {A 1 , A 2 } is an event space.
  2. We can also separate students by GPA. Let Bi denote the subset of students with GPAs G satisfying i − 1 ≤ G < i. At Rutgers, {B 1 , B 2 ,... , B 5 } is an event space. Note that B 5 is the set of all students with perfect 4.0 GPAs. Of course, other schools use different scales for GPA.
  3. We can also divide the students by age. Let Ci denote the subset of students of age i in years. At most universities, {C10, C11, ... , C100} would be an event space. Since a university may have prodigies either under 10 or over 100, we note that {C0, C1, ...} is always an event space.
  4. Lastly, we can categorize students by attendance. Let D 0 denote the number of students who have missed zero lectures and let D 1 denote all other students. Although it is likely that D 0 is an empty set, {D 0 , D 1 } is a well defined event space.

Problem 1.2.6 Solution

Let R 1 and R 2 denote the measured resistances. The pair (R 1 , R 2 ) is an outcome of the experiment. Some event spaces include

  1. If we need to check that neither resistance is too high, an event space is A 1 = {R 1 < 100 , R 2 < 100 } , A 2 = {either R 1 ≥ 100 or R 2 ≥ 100 }. (1)
  2. If we need to check whether the first resistance exceeds the second resistance, an event space is B 1 = {R 1 > R 2 } B 2 = {R 1 ≤ R 2 }. (2)
  3. If we need to check whether each resistance doesn’t fall below a minimum value (in this case 50 ohms for R 1 and 100 ohms for R 2 ), an event space is C 1 = {R 1 < 50 , R 2 < 100 } , C 2 = {R 1 < 50 , R 2 ≥ 100 } , (3) C 3 = {R 1 ≥ 50 , R 2 < 100 } , C 4 = {R 1 ≥ 50 , R 2 ≥ 100 }. (4)
  4. If we want to check whether the resistors in parallel are within an acceptable range of 90 to 110 ohms, an event space is

D1 = {(1/R1 + 1/R2)^−1 < 90}, D2 = {90 ≤ (1/R1 + 1/R2)^−1 ≤ 110}, D3 = {110 < (1/R1 + 1/R2)^−1}. (5)

Problem 1.3.1 Solution

The sample space of the experiment is

S = {LF, BF, LW, BW }. (1)

From the problem statement, we know that P [LF ] = 0.5, P [BF ] = 0.2 and P [BW ] = 0.2. This implies P [LW ] = 1 − 0. 5 − 0. 2 − 0 .2 = 0.1. The questions can be answered using Theorem 1.5.

(a) The probability that a program is slow is

P [W ] = P [LW ] + P [BW ] = 0.1 + 0.2 = 0. 3. (2)

(b) The probability that a program is big is

P [B] = P [BF ] + P [BW ] = 0.2 + 0.2 = 0. 4. (3)

(c) The probability that a program is slow or big is

P [W ∪ B] = P [W ] + P [B] − P [BW ] = 0.3 + 0. 4 − 0 .2 = 0. 5. (4)
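The bookkeeping above can be checked with a short Python sketch (not part of the original solution; the dictionary simply encodes the four outcome probabilities from the problem statement):

```python
# Outcome probabilities for Problem 1.3.1 (F = fast, W = slow, B = big, L = little).
p = {"LF": 0.5, "BF": 0.2, "LW": 0.1, "BW": 0.2}

def prob(event):
    # Theorem 1.5: an event's probability is the sum over the outcomes it contains.
    return sum(p[o] for o in event)

P_W = prob({"LW", "BW"})          # slow
P_B = prob({"BF", "BW"})          # big
P_W_or_B = P_W + P_B - p["BW"]    # inclusion-exclusion; W ∩ B = {BW}
```

Each of the three answers (0.3, 0.4, 0.5) falls out of the same outcome table.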

Problem 1.3.2 Solution

A sample outcome indicates whether the cell phone is handheld (H) or mobile (M ) and whether the speed is fast (F ) or slow (W ). The sample space is

S = {HF, HW, M F, M W }. (1)

The problem statement tells us that P [HF ] = 0.2, P [M W ] = 0.1 and P [F ] = 0.5. We can use these facts to find the probabilities of the other outcomes. In particular,

P [F ] = P [HF ] + P [M F ]. (2)

This implies P [M F ] = P [F ] − P [HF ] = 0. 5 − 0 .2 = 0. 3. (3)

Also, since the probabilities must sum to 1,

P [HW ] = 1 − P [HF ] − P [M F ] − P [M W ] = 1 − 0. 2 − 0. 3 − 0 .1 = 0. 4. (4)

Now that we have found the probabilities of the outcomes, finding any other probability is easy.

(a) The probability a cell phone is slow is

P [W ] = P [HW ] + P [M W ] = 0.4 + 0.1 = 0. 5. (5)

(b) The probability that a cell hpone is mobile and fast is P [M F ] = 0.3.

(c) The probability that a cell phone is handheld is

P [H] = P [HF ] + P [HW ] = 0.2 + 0.4 = 0. 6. (6)

Problem 1.4.2 Solution

(a) From the given probability distribution of billed minutes, M , the probability that a call is billed for more than 3 minutes is

P [L] = 1 − P [3 or fewer billed minutes] (1)
     = 1 − P [B1] − P [B2] − P [B3] (2)
     = 1 − α − α(1 − α) − α(1 − α)^2 (3)
     = (1 − α)^3 = 0.57. (4)

(b) The probability that a call will be billed for 9 minutes or less is

P [9 minutes or less] = ∑_{i=1}^{9} α(1 − α)^(i−1) = 1 − (1 − α)^9 = 1 − (0.57)^3. (5)
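As a numerical sanity check (a sketch, not part of the original solution), α can be recovered from part (a) and the geometric sum of part (b) verified against its closed form:

```python
# Part (a): (1 - alpha)^3 = 0.57, so alpha = 1 - 0.57^(1/3).
alpha = 1 - 0.57 ** (1 / 3)

# Part (b): sum_{i=1}^{9} alpha * (1 - alpha)^(i-1), and the closed form 1 - (1-alpha)^9.
p9_sum = sum(alpha * (1 - alpha) ** (i - 1) for i in range(1, 10))
p9_closed = 1 - 0.57 ** 3   # since (1 - alpha)^9 = ((1 - alpha)^3)^3 = 0.57^3
```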

Problem 1.4.3 Solution

The first generation consists of two plants each with genotype yg or gy. They are crossed to produce the following second generation genotypes, S = {yy, yg, gy, gg}. Each genotype is just as likely as any other so the probability of each genotype is consequently 1/4. A pea plant has yellow seeds if it possesses at least one dominant y gene. The set of pea plants with yellow seeds is

Y = {yy, yg, gy}. (1)

So the probability of a pea plant with yellow seeds is

P [Y ] = P [yy] + P [yg] + P [gy] = 3/ 4. (2)

Problem 1.4.4 Solution

Each statement is a consequence of part 4 of Theorem 1.4.

(a) Since A ⊂ A ∪ B, P [A] ≤ P [A ∪ B].

(b) Since B ⊂ A ∪ B, P [B] ≤ P [A ∪ B].

(c) Since A ∩ B ⊂ A, P [A ∩ B] ≤ P [A].

(d) Since A ∩ B ⊂ B, P [A ∩ B] ≤ P [B].

Problem 1.4.5 Solution

Specifically, we will use Theorem 1.7(c) which states that for any events A and B,

P [A ∪ B] = P [A] + P [B] − P [A ∩ B]. (1)

To prove the union bound by induction, we first prove the theorem for the case of n = 2 events. In this case, by Theorem 1.7(c),

P [A 1 ∪ A 2 ] = P [A 1 ] + P [A 2 ] − P [A 1 ∩ A 2 ]. (2)

By the first axiom of probability, P [A 1 ∩ A 2 ] ≥ 0. Thus,

P [A 1 ∪ A 2 ] ≤ P [A 1 ] + P [A 2 ]. (3)

which proves the union bound for the case n = 2. Now we make our induction hypothesis that the union-bound holds for any collection of n − 1 subsets. In this case, given subsets A 1 ,... , An, we define A = A 1 ∪ A 2 ∪ · · · ∪ An− 1 , B = An. (4)

By our induction hypothesis,

P [A] = P [A 1 ∪ A 2 ∪ · · · ∪ An− 1 ] ≤ P [A 1 ] + · · · + P [An− 1 ]. (5)

This permits us to write

P [A 1 ∪ · · · ∪ An] = P [A ∪ B] (6) ≤ P [A] + P [B] (by the union bound for n = 2) (7) = P [A 1 ∪ · · · ∪ An− 1 ] + P [An] (8) ≤ P [A 1 ] + · · · P [An− 1 ] + P [An] (9)

which completes the inductive proof.
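The union bound can also be illustrated on a tiny finite example (a sketch under my own choice of sample space and events, not taken from the text): on a uniform six-point space, P of the union is at most the sum of the individual probabilities.

```python
from fractions import Fraction

# Uniform probability measure on S = {1, ..., 6}.
def P(E):
    return Fraction(len(set(E)), 6)

events = [{1, 2, 3}, {2, 3, 4}, {3, 4, 5}]

lhs = P(set().union(*events))      # P[A1 ∪ A2 ∪ A3] = P[{1,...,5}] = 5/6
rhs = sum(P(A) for A in events)    # P[A1] + P[A2] + P[A3] = 3/2
```

The inequality is strict here because the events overlap; the bound is tight only for mutually exclusive events.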

Problem 1.4.6 Solution

(a) For convenience, let pi = P [F Hi] and qi = P [V Hi]. Using this shorthand, the six unknowns p 0 , p 1 , p 2 , q 0 , q 1 , q 2 fill the table as

     H0   H1   H2
F    p0   p1   p2
V    q0   q1   q2     (1)

However, we are given a number of facts:

p 0 + q 0 = 1/ 3 , p 1 + q 1 = 1/ 3 , (2) p 2 + q 2 = 1/ 3 , p 0 + p 1 + p 2 = 5/ 12. (3)

Other facts, such as q0 + q1 + q2 = 7/12, can be derived from these facts. Thus, we have four equations and six unknowns; choosing p0 and p1 will specify the other unknowns. Unfortunately, arbitrary choices for either p0 or p1 will lead to negative values for the other probabilities. In terms of p0 and p1, the other unknowns are

q 0 = 1/ 3 − p 0 , p 2 = 5/ 12 − (p 0 + p 1 ), (4) q 1 = 1/ 3 − p 1 , q 2 = p 0 + p 1 − 1 / 12. (5)

Because the probabilities must be nonnegative, we see that

0 ≤ p 0 ≤ 1 / 3 , (6) 0 ≤ p 1 ≤ 1 / 3 , (7) 1 / 12 ≤ p 0 + p 1 ≤ 5 / 12. (8)
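The constraint region (6)–(8) can be probed numerically (a sketch; the particular test values of p0 and p1 are my own, not from the text):

```python
from fractions import Fraction as F

def fill(p0, p1):
    # Equations (4)-(5): the remaining unknowns in terms of p0 and p1.
    q0 = F(1, 3) - p0
    q1 = F(1, 3) - p1
    p2 = F(5, 12) - (p0 + p1)
    q2 = p0 + p1 - F(1, 12)
    return [p0, p1, p2, q0, q1, q2]

good = fill(F(1, 6), F(1, 6))   # inside the region (6)-(8): all nonnegative
bad = fill(F(0), F(0))          # violates 1/12 <= p0 + p1: q2 goes negative
```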

Problem 1.4.8 Solution

Following the hint, we define the set of events {Ai | i = 1, 2, ...} such that for i = 1, ..., m, Ai = Bi and for i > m, Ai = φ. By construction, ∪_{i=1}^m Bi = ∪_{i=1}^∞ Ai. Axiom 3 then implies

P [∪_{i=1}^m Bi] = P [∪_{i=1}^∞ Ai] = ∑_{i=1}^{∞} P [Ai]. (1)

For i > m, P [Ai] = P [φ] = 0, yielding the claim P [∪_{i=1}^m Bi] = ∑_{i=1}^{m} P [Ai] = ∑_{i=1}^{m} P [Bi]. Note that the fact that P [φ] = 0 follows from Axioms 1 and 2.

This problem is more challenging if you just use Axiom 3. We start by observing

P [∪_{i=1}^m Bi] = ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai]. (2)

Now, we use Axiom 3 again on the countably infinite sequence Am, Am+1, ... to write

∑_{i=m}^{∞} P [Ai] = P [Am ∪ Am+1 ∪ · · ·] = P [Bm]. (3)

Thus, we have used just Axiom 3 to prove Theorem 1.4: P [∪_{i=1}^m Bi] = ∑_{i=1}^{m} P [Bi].

Problem 1.4.9 Solution

Each claim in Theorem 1.7 requires a proof from which we can check which axioms are used. However, the problem is somewhat hard because there may still be a simpler proof that uses fewer axioms. Still, the proof of each part will need Theorem 1.4, which we now prove. For the mutually exclusive events B1, ..., Bm, let Ai = Bi for i = 1, ..., m and let Ai = φ for i > m. In that case, by Axiom 3,

P [B1 ∪ B2 ∪ · · · ∪ Bm] = P [A1 ∪ A2 ∪ · · ·] (1)
                        = ∑_{i=1}^{m−1} P [Ai] + ∑_{i=m}^{∞} P [Ai] (2)
                        = ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai]. (3)

Now, we use Axiom 3 again on Am, Am+1, ... to write

∑_{i=m}^{∞} P [Ai] = P [Am ∪ Am+1 ∪ · · ·] = P [Bm]. (4)

Thus, we have used just Axiom 3 to prove Theorem 1.4:

P [B1 ∪ B2 ∪ · · · ∪ Bm] = ∑_{i=1}^{m} P [Bi]. (5)

(a) To show P [φ] = 0, let B 1 = S and let B 2 = φ. Thus by Theorem 1.4,

P [S] = P [B 1 ∪ B 2 ] = P [B 1 ] + P [B 2 ] = P [S] + P [φ]. (6)

Thus, P [φ] = 0. Note that this proof uses only Theorem 1.4 which uses only Axiom 3.

(b) Using Theorem 1.4 with B1 = A and B2 = Ac, we have

P [S] = P [A ∪ Ac] = P [A] + P [Ac]. (7)

Since Axiom 2 says P [S] = 1, P [Ac] = 1 − P [A]. This proof uses Axioms 2 and 3.

(c) By Theorem 1.2, we can write both A and B as unions of disjoint events:

A = (AB) ∪ (ABc), B = (AB) ∪ (AcB). (8)

Now we apply Theorem 1.4 to write

P [A] = P [AB] + P [ABc], P [B] = P [AB] + P [AcB]. (9)

We can rewrite these facts as

P [ABc] = P [A] − P [AB], P [AcB] = P [B] − P [AB]. (10)

Note that so far we have used only Axiom 3. Finally, we observe that A ∪ B can be written as the union of mutually exclusive events

A ∪ B = (AB) ∪ (ABc) ∪ (AcB). (11)

Once again using Theorem 1.4, we have

P [A ∪ B] = P [AB] + P [ABc] + P [AcB]. (12)

Substituting the results of Equation (10) into Equation (12) yields

P [A ∪ B] = P [AB] + P [A] − P [AB] + P [B] − P [AB], (13)

which completes the proof. Note that this claim required only Axiom 3.

(d) Observe that since A ⊂ B, we can write B as the disjoint union B = A ∪ (AcB). By Theorem 1.4 (which uses Axiom 3),

P [B] = P [A] + P [AcB]. (14)

By Axiom 1, P [AcB] ≥ 0, which implies P [A] ≤ P [B]. This proof uses Axioms 1 and 3.

Problem 1.5.1 Solution

Each question requests a conditional probability.

(a) Note that the probability a call is brief is

P [B] = P [H0B] + P [H1B] + P [H2B] = 0.6. (1)

The probability a brief call will have no handoffs is

P [H0|B] = P [H0B] / P [B]. (2)

(b) The probability of one handoff is P [H1] = P [H1B] + P [H1L] = 0.2. The probability that a call with one handoff will be long is

P [L|H1] = P [H1L] / P [H1]. (3)

(c) The probability a call is long is P [L] = 1 − P [B] = 0.4. The probability that a long call will have one or more handoffs is

P [H1 ∪ H2|L] = P [H1L ∪ H2L] / P [L] = (P [H1L] + P [H2L]) / P [L]. (4)

Problem 1.5.5 Solution

The sample outcomes can be written ijk where the first card drawn is i, the second is j and the third is k. The sample space is

S = {234, 243, 324, 342, 423, 432}, (1)

and each of the six outcomes has probability 1/6. The events E1, E2, E3, O1, O2, O3 are

E1 = {234, 243, 423, 432}, O1 = {324, 342}, (2)
E2 = {243, 324, 342, 423}, O2 = {234, 432}, (3)
E3 = {234, 324, 342, 432}, O3 = {243, 423}. (4)

(a) The conditional probability the second card is even given that the first card is even is

P [E2|E1] = P [E2E1] / P [E1] = P [{243, 423}] / P [{234, 243, 423, 432}] = (2/6)/(4/6) = 1/2. (5)

(b) The conditional probability the first card is even given that the second card is even is

P [E1|E2] = P [E1E2] / P [E2] = P [{243, 423}] / P [{243, 324, 342, 423}] = (2/6)/(4/6) = 1/2. (6)

(c) The probability the first two cards are even given the third card is even is

P [E1E2|E3] = P [E1E2E3] / P [E3] = 0, (7)

since no outcome has three even cards.

(d) The conditional probability the second card is even given that the first card is odd is

P [E2|O1] = P [O1E2] / P [O1] = P [O1] / P [O1] = 1, (8)

since O1 ⊂ E2.

(e) The conditional probability the second card is odd given that the first card is odd is

P [O2|O1] = P [O1O2] / P [O1] = 0, (9)

since O1 ∩ O2 = φ: after an odd first card, only even cards remain.
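The six-outcome enumeration lends itself to a brute-force check (a Python sketch; the predicate helpers `E(k)` and `O(k)` are my own shorthand, not notation from the text):

```python
from itertools import permutations
from fractions import Fraction

# The six equally likely draws of cards 2, 3, 4 without replacement.
outcomes = list(permutations([2, 3, 4]))

def P(pred):
    return Fraction(sum(1 for o in outcomes if pred(o)), len(outcomes))

def cond(pred, given):
    # P[pred | given] = P[pred and given] / P[given]
    return P(lambda o: pred(o) and given(o)) / P(given)

E = lambda k: (lambda o: o[k] % 2 == 0)   # card k+1 is even
O = lambda k: (lambda o: o[k] % 2 == 1)   # card k+1 is odd
```

For instance, `cond(E(1), E(0))` computes P [E2|E1].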

Problem 1.5.6 Solution

The problem statement yields the obvious facts that P [L] = 0.16 and P [H] = 0.10. The words “10% of the ticks that had either Lyme disease or HGE carried both diseases” can be written as

P [LH|L ∪ H] = 0.10. (1)

(a) Since LH ⊂ L ∪ H,

P [LH|L ∪ H] = P [LH ∩ (L ∪ H)] / P [L ∪ H] = P [LH] / P [L ∪ H]. (2)

Thus,

P [LH] = 0.10 P [L ∪ H] = 0.10 (P [L] + P [H] − P [LH]). (3)

Since P [L] = 0.16 and P [H] = 0.10, solving for P [LH] gives

P [LH] = 0.10 (0.16 + 0.10) / 1.1 = 0.0236. (4)

(b) The conditional probability that a tick has HGE given that it has Lyme disease is

P [H|L] = P [LH] / P [L] = 0.0236/0.16 = 0.1475. (5)
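The little linear equation in part (a) can be solved and checked in a few lines (a sketch, not part of the original solution):

```python
# Part (a): P[LH] = 0.10 * (P[L] + P[H] - P[LH])  =>  1.1 * P[LH] = 0.10 * (P[L] + P[H]).
P_L, P_H = 0.16, 0.10

P_LH = 0.10 * (P_L + P_H) / 1.10

# Part (b): conditional probability of HGE given Lyme disease.
P_H_given_L = P_LH / P_L
```

The unrounded values are P [LH] ≈ 0.02364 and P [H|L] ≈ 0.1477; the text's 0.1475 comes from first rounding P [LH] to 0.0236.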

Problem 1.6.1 Solution

This problem asks whether A and B can be independent events yet satisfy A = B. By definition, events A and B are independent if and only if P [AB] = P [A]P [B]. We can see that if A = B, that is they are the same set, then

P [AB] = P [AA] = P [A] = P [B]. (1)

Thus, for A and B to be the same set and also independent,

P [A] = P [AB] = P [A] P [B] = (P [A])^2. (2)

There are two ways that this requirement can be satisfied:

  • P [A] = 1 implying A = B = S.
  • P [A] = 0 implying A = B = φ.

Problem 1.6.2 Solution

In the Venn diagram (regions A and B with intersection AB), assume the sample space has area 1 corresponding to probability 1. As drawn, both A and B have area 1/4 so that P [A] = P [B] = 1/4. Moreover, the intersection AB has area 1/16 and covers 1/4 of A and 1/4 of B. That is, A and B are independent since

P [AB] = P [A] P [B]. (1)

Problem 1.6.3 Solution

(a) Since A and B are disjoint, P [A ∩ B] = 0. Since P [A ∩ B] = 0,

P [A ∪ B] = P [A] + P [B] − P [A ∩ B] = 3/ 8. (1)

A Venn diagram should convince you that A ⊂ Bc so that A ∩ Bc = A. This implies

P [A ∩ Bc] = P [A] = 1/4. (2)

It also follows that P [A ∪ Bc] = P [Bc] = 1 − 1/8 = 7/8.

(b) Events A and B are dependent since P [AB] = 0 ≠ P [A]P [B].

Problem 1.6.5 Solution

For a sample space S = { 1 , 2 , 3 , 4 } with equiprobable outcomes, consider the events

A 1 = { 1 , 2 } A 2 = { 2 , 3 } A 3 = { 3 , 1 }. (1)

Each event Ai has probability 1/2. Moreover, each pair of events is independent since

P [A 1 A 2 ] = P [A 2 A 3 ] = P [A 3 A 1 ] = 1/ 4. (2)

However, the three events A 1 , A 2 , A 3 are not independent since

P [A 1 A 2 A 3 ] = 0 ≠ P [A 1 ] P [A 2 ] P [A 3 ] = 1/8. (3)
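This classic counterexample is easy to verify by enumeration (a sketch, not part of the original solution):

```python
from fractions import Fraction
from itertools import combinations

# Equiprobable sample space S = {1, 2, 3, 4} and the three events of the solution.
S = {1, 2, 3, 4}

def P(E):
    return Fraction(len(E), len(S))

A = [{1, 2}, {2, 3}, {3, 1}]

# Every pair of events is independent...
pairwise = all(P(X & Y) == P(X) * P(Y) for X, Y in combinations(A, 2))
# ...but the triple intersection is empty, so mutual independence fails.
mutual = P(A[0] & A[1] & A[2]) == P(A[0]) * P(A[1]) * P(A[2])
```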

Problem 1.6.6 Solution

There are 16 distinct equally likely outcomes for the second generation of pea plants based on a first generation of {rwyg, rwgy, wryg, wrgy}. They are listed below

rryy rryg rrgy rrgg rwyy rwyg rwgy rwgg wryy wryg wrgy wrgg wwyy wwyg wwgy wwgg

A plant has yellow seeds, that is, event Y occurs, if a plant has at least one dominant y gene. Except for the four outcomes with a pair of recessive g genes, the remaining 12 outcomes have yellow seeds. From the above, we see that

P [Y ] = 12/16 = 3/4 (2)

and P [R] = 12/16 = 3/ 4. (3) To find the conditional probabilities P [R|Y ] and P [Y |R], we first must find P [RY ]. Note that RY , the event that a plant has rounded yellow seeds, is the set of outcomes

RY = {rryy, rryg, rrgy, rwyy, rwyg, rwgy, wryy, wryg, wrgy}. (4)

Since P [RY ] = 9/16,

P [Y |R] = P [RY ] / P [R] = (9/16)/(3/4) = 3/4, (5)

and

P [R|Y ] = P [RY ] / P [Y ] = (9/16)/(3/4) = 3/4. (6)

Thus P [R|Y ] = P [R] and P [Y |R] = P [Y ] and R and Y are independent events. There are four visibly different pea plants, corresponding to whether the peas are round (R) or not (Rc), or yellow (Y ) or not (Y c). These four visible events have probabilities

P [RY ] = 9/ 16 P [RY c] = 3/ 16 , (7) P [RcY ] = 3/ 16 P [RcY c] = 1/ 16. (8)

Problem 1.6.7 Solution

(a) For any events A and B, we can write the law of total probability in the form of

P [A] = P [AB] + P [ABc]. (1)

Since A and B are independent, P [AB] = P [A]P [B]. This implies

P [ABc] = P [A] − P [A] P [B] = P [A] (1 − P [B]) = P [A] P [Bc]. (2)

Thus A and Bc are independent.

(b) Proving that Ac and B are independent is not really necessary. Since A and B are arbitrary labels, it is really the same claim as in part (a). That is, simply reversing the labels of A and B proves the claim. Alternatively, one can construct exactly the same proof as in part (a) with the labels A and B reversed.

(c) To prove that Ac and Bc are independent, we apply the result of part (a) to the sets A and Bc. Since we know from part (a) that A and Bc are independent, part (b) says that Ac and Bc are independent.

Problem 1.6.8 Solution

In the Venn diagram (regions A, B, and C with intersections AB, AC, BC, and ABC), assume the sample space has area 1 corresponding to probability 1. As drawn, A, B, and C each have area 1/2 and thus probability 1/2, each pairwise intersection has area 1/4 (so each pair of events is independent), and the three-way intersection ABC has probability 1/8. Thus A, B, and C are mutually independent since

P [ABC] = P [A] P [B] P [C]. (1)

Problem 1.6.9 Solution

In the Venn diagram (regions A, B, and C with pairwise intersections AB, AC, and BC), assume the sample space has area 1 corresponding to probability 1. As drawn, A, B, and C each have area 1/3 and thus probability 1/3. The three-way intersection ABC has zero probability, implying A, B, and C are not mutually independent since

P [ABC] = 0 ≠ P [A] P [B] P [C]. (1)

However, AB, BC, and AC each has area 1/9. As a result, each pair of events is independent since P [AB] = P [A] P [B] , P [BC] = P [B] P [C] , P [AC] = P [A] P [C]. (2)





The tree for this experiment is: the first free throw is good (G1) with probability 1/2 and bad (B1) with probability 1/2; after a make, the second free throw is good (G2) with probability 3/4 and bad (B2) with probability 1/4, while after a miss, G2 has probability 1/4 and B2 has probability 3/4. The resulting outcome probabilities are

  • G1G2: 3/8
  • G1B2: 1/8
  • B1G2: 1/8
  • B1B2: 3/8

The game goes into overtime if exactly one free throw is made. This event has probability

P [O] = P [G 1 B 2 ] + P [B 1 G 2 ] = 1/8 + 1/8 = 1/ 4. (1)
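The leaf probabilities come from multiplying branch probabilities along each path of the tree, which a short sketch makes explicit (not part of the original solution):

```python
from fractions import Fraction as F

# Branch probabilities: first free throw good w.p. 1/2; second good w.p. 3/4
# after a make and 1/4 after a miss.
p_G1 = F(1, 2)
p_G2_given_G1 = F(3, 4)
p_G2_given_B1 = F(1, 4)

leaves = {
    "G1G2": p_G1 * p_G2_given_G1,
    "G1B2": p_G1 * (1 - p_G2_given_G1),
    "B1G2": (1 - p_G1) * p_G2_given_B1,
    "B1B2": (1 - p_G1) * (1 - p_G2_given_B1),
}

# Overtime: exactly one made free throw.
P_O = leaves["G1B2"] + leaves["B1G2"]
```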

Problem 1.7.4 Solution

The tree for this experiment is



The tree for this experiment is: event A occurs with probability 1/2 and event B with probability 1/2; given A, H occurs with probability 1/4 and T with probability 3/4, while given B, H has probability 3/4 and T has probability 1/4. The resulting outcome probabilities are

  • AH: 1/8
  • AT: 3/8
  • BH: 3/8
  • BT: 1/8

The probability that you guess correctly is

P [C] = P [AT ] + P [BH] = 3/8 + 3/8 = 3/ 4. (1)

Problem 1.7.5 Solution

P [−|H] is the probability that a person who has HIV tests negative for the disease. This is referred to as a false-negative result. The case where a person who does not have HIV tests positive for the disease is called a false-positive result and has probability P [+|Hc]. Since the test is correct 99% of the time, P [−|H] = P [+|Hc] = 0.01. (1)

Now the probability that a person who has tested positive for HIV actually has the disease is

P [H|+] = P [+, H] / P [+] = P [+, H] / (P [+, H] + P [+, Hc]). (2)

We can use Bayes' formula to evaluate these joint probabilities:

P [H|+] = P [+|H] P [H] / (P [+|H] P [H] + P [+|Hc] P [Hc]). (3)

Thus, even though the test is correct 99% of the time, the probability that a random person who tests positive actually has HIV is less than 0.02. The reason this probability is so low is that the a priori probability that a person has HIV is very small.
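Bayes' formula in Equation (3) can be sketched numerically. The prior P [H] is not stated in this excerpt (only that it is "very small"), so the value 0.0002 below is an illustrative assumption chosen to land below 0.02:

```python
# Posterior P[H|+] via Bayes' rule. prior_H = 0.0002 is an assumed, illustrative
# prior; the 0.99/0.01 test accuracies come from the problem statement.
def posterior(prior_H, p_pos_given_H=0.99, p_pos_given_Hc=0.01):
    num = p_pos_given_H * prior_H
    return num / (num + p_pos_given_Hc * (1 - prior_H))

p = posterior(0.0002)
```

Raising the prior raises the posterior sharply, which is exactly the point the solution makes.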

Problem 1.7.6 Solution

Let Ai and Di indicate whether the ith photodetector is acceptable or defective.



The tree for this experiment is: A1 occurs with probability 3/5 and D1 with probability 2/5; given A1, A2 occurs with probability 4/5 and D2 with probability 1/5, while given D1, A2 occurs with probability 2/5 and D2 with probability 3/5. The resulting outcome probabilities are

  • A1A2: 12/25
  • A1D2: 3/25
  • D1A2: 4/25
  • D1D2: 6/25

(a) We wish to find the probability P [E 1 ] that exactly one photodetector is acceptable. From the tree, we have

P [E 1 ] = P [A 1 D 2 ] + P [D 1 A 2 ] = 3/25 + 4/25 = 7/ 25. (1)

(b) The probability that both photodetectors are defective is P [D 1 D 2 ] = 6/25.

Problem 1.7.7 Solution

The tree for this experiment is





The tree for this experiment is: coin A is picked first (A1) with probability 1/2 and coin B first (B1) with probability 1/2. If coin A is flipped first, H1 occurs with probability 1/4 and T1 with probability 3/4, and the second flip uses coin B, so H2 has probability 3/4 and T2 has probability 1/4. If coin B is flipped first, H1 has probability 3/4 and T1 has probability 1/4, and the second flip uses coin A, so H2 has probability 1/4 and T2 has probability 3/4. The resulting outcome probabilities are

  • A1H1H2: 3/32
  • A1H1T2: 1/32
  • A1T1H2: 9/32
  • A1T1T2: 3/32
  • B1H1H2: 3/32
  • B1H1T2: 9/32
  • B1T1H2: 1/32
  • B1T1T2: 3/32

The event H 1 H 2 that heads occurs on both flips has probability

P [H 1 H 2 ] = P [A 1 H 1 H 2 ] + P [B 1 H 1 H 2 ] = 6/ 32. (1)

The probability of H 1 is

P [H 1 ] = P [A 1 H 1 H 2 ] + P [A 1 H 1 T 2 ] + P [B 1 H 1 H 2 ] + P [B 1 H 1 T 2 ] = 1/ 2. (2)

Similarly,

P [H 2 ] = P [A 1 H 1 H 2 ] + P [A 1 T 1 H 2 ] + P [B 1 H 1 H 2 ] + P [B 1 T 1 H 2 ] = 1/ 2. (3)

Thus P [H 1 H 2 ] = 3/16 ≠ 1/4 = P [H 1 ]P [H 2 ], implying H 1 and H 2 are not independent. This result should not be surprising since if the first flip is heads, it is likely that coin B was picked first. In this case, the second flip is less likely to be heads since it becomes more likely that the second coin flipped was coin A.
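The dependence of H1 and H2 can be confirmed by summing leaf probabilities of the tree (a sketch, not part of the original solution; the `leaf` helper is my own):

```python
from fractions import Fraction as F

# Per-coin heads probabilities: coin A is heads w.p. 1/4, coin B w.p. 3/4.
p_heads = {"A": F(1, 4), "B": F(3, 4)}

def leaf(first, f1, f2):
    # Pick a coin (1/2 each), flip it, then flip the other coin.
    second = "B" if first == "A" else "A"
    p1 = p_heads[first] if f1 == "H" else 1 - p_heads[first]
    p2 = p_heads[second] if f2 == "H" else 1 - p_heads[second]
    return F(1, 2) * p1 * p2

P_H1H2 = sum(leaf(c, "H", "H") for c in "AB")
P_H1 = sum(leaf(c, "H", f2) for c in "AB" for f2 in "HT")
P_H2 = sum(leaf(c, f1, "H") for c in "AB" for f1 in "HT")
```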