David Goodman, Probability and Stochastic Processes: Problem Solutions
Solutions are not yet available for the following problems: 12.1.7, 12.1.8, 12.5.8, 12.5.9, and 12.11.5 – 12.11.9.
If you volunteer a solution for one of those problems, we’ll be happy to include it... and, of course, “your wildest dreams will come true.”
Based on the Venn diagram
[Figure: Venn diagram of the pizza events; circles M and O overlap, and T overlaps M but is disjoint from O]
the answers are fairly straightforward:
(a) Since T ∩ M ≠ φ, T and M are not mutually exclusive.
(b) Every pizza is either Regular (R) or Tuscan (T). Hence R ∪ T = S, so R and T are collectively exhaustive. It is then also (trivially) true that R ∪ T ∪ M = S; that is, R, T and M are also collectively exhaustive.
(c) From the Venn diagram, T and O are mutually exclusive. In words, Tuscan pizzas never have onions; equivalently, pizzas with onions are never Tuscan. As an aside, "Tuscan" is a fake pizza designation; one shouldn't conclude that people from Tuscany actually dislike onions.
(d) From the Venn diagram, M ∩ T and O are mutually exclusive. Thus Gerlanda’s doesn’t make Tuscan pizza with mushrooms and onions.
(e) Yes. In terms of the Venn diagram, these pizzas are in the set (T ∪ M ∪ O)c.
Based on the Venn diagram
[Figure: the same Venn diagram as above]
the complete Gerlanda's pizza menu is
• Regular without toppings
• Regular with mushrooms
• Regular with onions
• Regular with mushrooms and onions
• Tuscan without toppings
• Tuscan with mushrooms
(a) An outcome specifies whether the fax is high (h), medium (m), or low (l) speed, and whether the fax has two (t) pages or four (f) pages. The sample space is
S = {ht, hf, mt, mf, lt, lf}. (1)
The sample space is
S = {1/1, 1/2, . . . , 12/31}, (1)
where the outcome m/d denotes a birthday on day d of month m. The event H, that the birthday falls in July, consists of the following 31 sample points:
H = {7/1, 7/2, . . . , 7/31}. (2)
Of course, there are many answers to this problem. Here are four event spaces.
Let R1 and R2 denote the measured resistances. The pair (R1, R2) is an outcome of the experiment. Some event spaces include
The sample space of the experiment is
S = {LF, BF, LW, BW}. (1)
From the problem statement, we know that P [LF] = 0.5, P [BF] = 0.2 and P [BW] = 0.2. This implies P [LW] = 1 − 0.5 − 0.2 − 0.2 = 0.1. The questions can be answered using Theorem 1.5.
(a) The probability that a program is slow is
P [W] = P [LW] + P [BW] = 0.1 + 0.2 = 0.3. (2)
(b) The probability that a program is big is
P [B] = P [BF] + P [BW] = 0.2 + 0.2 = 0.4. (3)
(c) The probability that a program is slow or big is
P [W ∪ B] = P [W] + P [B] − P [BW] = 0.3 + 0.4 − 0.2 = 0.5. (4)
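Because every quantity here is a sum of outcome probabilities, the arithmetic is easy to verify by enumeration. Below is a minimal Python sketch (not part of the original solution); the outcome labels mirror the ones above.

```python
from fractions import Fraction as F

# Outcome probabilities from the solution: L/B = little/big, F/W = fast/slow.
P = {"LF": F(1, 2), "BF": F(1, 5), "LW": F(1, 10), "BW": F(1, 5)}

P_W = sum(p for o, p in P.items() if o[1] == "W")                   # slow
P_B = sum(p for o, p in P.items() if o[0] == "B")                   # big
P_WuB = sum(p for o, p in P.items() if o[1] == "W" or o[0] == "B")  # slow or big

print(P_W, P_B, P_WuB)  # 3/10 2/5 1/2, matching (2)-(4)
```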
A sample outcome indicates whether the cell phone is handheld (H) or mobile (M) and whether the speed is fast (F) or slow (W). The sample space is
S = {HF, HW, MF, MW}. (1)
The problem statement tells us that P [HF] = 0.2, P [MW] = 0.1 and P [F] = 0.5. We can use these facts to find the probabilities of the other outcomes. In particular,
P [F] = P [HF] + P [MF]. (2)
This implies
P [MF] = P [F] − P [HF] = 0.5 − 0.2 = 0.3. (3)
Also, since the probabilities must sum to 1,
P [HW] = 1 − P [HF] − P [MF] − P [MW] = 1 − 0.2 − 0.3 − 0.1 = 0.4. (4)
Now that we have found the probabilities of the outcomes, finding any other probability is easy.
(a) The probability a cell phone is slow is
P [W] = P [HW] + P [MW] = 0.4 + 0.1 = 0.5. (5)
(b) The probability that a cell phone is mobile and fast is P [MF] = 0.3.
(c) The probability that a cell phone is handheld is
P [H] = P [HF] + P [HW] = 0.2 + 0.4 = 0.6. (6)
(a) From the given probability distribution of billed minutes, M , the probability that a call is billed for more than 3 minutes is
P [L] = 1 − P [3 or fewer billed minutes] (1)
= 1 − P [B1] − P [B2] − P [B3] (2)
= 1 − α − α(1 − α) − α(1 − α)^2 (3)
= (1 − α)^3 = 0.57. (4)
(b) The probability that a call will be billed for 9 minutes or less is
P [9 minutes or less] = ∑_{i=1}^{9} α(1 − α)^{i−1} = 1 − (1 − α)^9 = 1 − (0.57)^3. (5)
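Both parts reduce to arithmetic with the geometric probabilities α(1 − α)^{i−1}. A short Python check (a sketch, not from the text): part (a)'s (1 − α)^3 = 0.57 pins down α, and summing the first nine terms should reproduce 1 − (0.57)^3.

```python
# From part (a): (1 - alpha)^3 = 0.57 determines alpha.
alpha = 1 - 0.57 ** (1 / 3)

# Part (b): sum the geometric pmf alpha*(1-alpha)^(i-1) over i = 1..9.
p_9_or_less = sum(alpha * (1 - alpha) ** (i - 1) for i in range(1, 10))

print(p_9_or_less, 1 - 0.57 ** 3)  # both ~0.8148
```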
The first generation consists of two plants, each with genotype yg or gy. They are crossed to produce the following second-generation genotypes, S = {yy, yg, gy, gg}. Each genotype is just as likely as any other, so each has probability 1/4. A pea plant has yellow seeds if it possesses at least one dominant y gene. The set of pea plants with yellow seeds is
Y = {yy, yg, gy}. (1)
So the probability of a pea plant with yellow seeds is
P [Y ] = P [yy] + P [yg] + P [gy] = 3/4. (2)
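The 3/4 can also be obtained by brute-force enumeration of the cross. A minimal Python sketch (illustration only):

```python
from itertools import product

# Each parent of the yg x gy cross passes one gene, so the four
# second-generation genotypes yy, yg, gy, gg are equally likely.
genotypes = ["".join(pair) for pair in product("yg", "yg")]
yellow = [g for g in genotypes if "y" in g]  # at least one dominant y gene

print(len(yellow) / len(genotypes))  # 0.75 = P[Y]
```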
Each statement is a consequence of part 4 of Theorem 1.4.
(a) Since A ⊂ A ∪ B, P [A] ≤ P [A ∪ B].
(b) Since B ⊂ A ∪ B, P [B] ≤ P [A ∪ B].
(c) Since A ∩ B ⊂ A, P [A ∩ B] ≤ P [A].
(d) Since A ∩ B ⊂ B, P [A ∩ B] ≤ P [B].
Specifically, we will use Theorem 1.7(c) which states that for any events A and B,
P [A ∪ B] = P [A] + P [B] − P [A ∩ B]. (1)
To prove the union bound by induction, we first prove the theorem for the case of n = 2 events. In this case, by Theorem 1.7(c),
P [A1 ∪ A2] = P [A1] + P [A2] − P [A1 ∩ A2]. (2)
By the first axiom of probability, P [A1 ∩ A2] ≥ 0. Thus,
P [A1 ∪ A2] ≤ P [A1] + P [A2], (3)
which proves the union bound for the case n = 2. Now we make our induction hypothesis that the union bound holds for any collection of n − 1 subsets. In this case, given subsets A1, . . . , An, we define
A = A1 ∪ A2 ∪ · · · ∪ An−1, B = An. (4)
By our induction hypothesis,
P [A] = P [A1 ∪ A2 ∪ · · · ∪ An−1] ≤ P [A1] + · · · + P [An−1]. (5)
This permits us to write
P [A1 ∪ · · · ∪ An] = P [A ∪ B] (6)
≤ P [A] + P [B] (by the union bound for n = 2) (7)
= P [A1 ∪ · · · ∪ An−1] + P [An] (8)
≤ P [A1] + · · · + P [An−1] + P [An], (9)
which completes the inductive proof.
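The bound is easy to illustrate numerically. The sketch below (events and sample space chosen arbitrarily for illustration) compares P [A1 ∪ A2 ∪ A3] against the sum of the individual probabilities on a small equiprobable space.

```python
from fractions import Fraction as F

# An arbitrary 6-point equiprobable sample space and three events.
S = set(range(6))
events = [{0, 1, 2}, {1, 3}, {2, 3, 4}]

def prob(A):
    return F(len(A), len(S))

lhs = prob(set().union(*events))    # P[A1 u A2 u A3]
rhs = sum(prob(A) for A in events)  # P[A1] + P[A2] + P[A3]

print(lhs, rhs, lhs <= rhs)  # 5/6 4/3 True: the union bound holds
```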
(a) For convenience, let pi = P [F Hi] and qi = P [V Hi]. Using this shorthand, the six unknowns p0, p1, p2, q0, q1, q2 fill the table as

       H0   H1   H2
  F    p0   p1   p2        (1)
  V    q0   q1   q2
However, we are given a number of facts:
p0 + q0 = 1/3, p1 + q1 = 1/3, (2)
p2 + q2 = 1/3, p0 + p1 + p2 = 5/12. (3)
Other facts, such as q0 + q1 + q2 = 7/12, can be derived from these facts. Thus, we have four equations and six unknowns, so choosing p0 and p1 will specify the other unknowns. Unfortunately, arbitrary choices for either p0 or p1 will lead to negative values for the other probabilities. In terms of p0 and p1, the other unknowns are
q0 = 1/3 − p0, p2 = 5/12 − (p0 + p1), (4)
q1 = 1/3 − p1, q2 = p0 + p1 − 1/12. (5)
Because the probabilities must be nonnegative, we see that
0 ≤ p0 ≤ 1/3, (6)
0 ≤ p1 ≤ 1/3, (7)
1/12 ≤ p0 + p1 ≤ 5/12. (8)
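Constraints (6)-(8) can be double-checked by brute force: scan a grid of (p0, p1) pairs and confirm that all six table entries are nonnegative exactly when the constraints hold. A Python sketch (grid resolution chosen arbitrarily):

```python
from fractions import Fraction as F

def table_ok(p0, p1):
    # Fill in the remaining unknowns via (4)-(5) and test nonnegativity.
    q0, q1 = F(1, 3) - p0, F(1, 3) - p1
    p2 = F(5, 12) - (p0 + p1)
    q2 = p0 + p1 - F(1, 12)
    return all(x >= 0 for x in (p0, p1, p2, q0, q1, q2))

def in_region(p0, p1):
    # Constraints (6)-(8).
    return (0 <= p0 <= F(1, 3) and 0 <= p1 <= F(1, 3)
            and F(1, 12) <= p0 + p1 <= F(5, 12))

grid = [F(k, 48) for k in range(25)]  # 0, 1/48, ..., 1/2
assert all(table_ok(a, b) == in_region(a, b) for a in grid for b in grid)
print("constraints (6)-(8) verified on the grid")
```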
Following the hint, we define the set of events {Ai | i = 1, 2, . . .} such that Ai = Bi for i = 1, . . . , m, and Ai = φ for i > m. By construction, ∪_{i=1}^{m} Bi = ∪_{i=1}^{∞} Ai. Axiom 3 then implies
P [∪_{i=1}^{m} Bi] = P [∪_{i=1}^{∞} Ai] = ∑_{i=1}^{∞} P [Ai]. (1)
For i > m, P [Ai] = P [φ] = 0, yielding the claim P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m} P [Ai] = ∑_{i=1}^{m} P [Bi]. Note that the fact that P [φ] = 0 follows from Axioms 1 and 2. This problem is more challenging if you just use Axiom 3. We start by observing
P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai]. (2)
Now, we use Axiom 3 again on the countably infinite sequence Am, Am+1, . . . to write
∑_{i=m}^{∞} P [Ai] = P [Am ∪ Am+1 ∪ · · ·] = P [Bm]. (3)
Thus, we have used just Axiom 3 to prove Theorem 1.4: P [∪_{i=1}^{m} Bi] = ∑_{i=1}^{m} P [Bi].
Each claim in Theorem 1.7 requires a proof from which we can check which axioms are used. However, the problem is somewhat hard because there may still be a simpler proof that uses fewer axioms. Still, the proof of each part will need Theorem 1.4, which we now prove. For the mutually exclusive events B1, . . . , Bm, let Ai = Bi for i = 1, . . . , m and let Ai = φ for i > m. In that case, by Axiom 3,
P [B1 ∪ B2 ∪ · · · ∪ Bm] = P [A1 ∪ A2 ∪ · · ·] (1)
= ∑_{i=1}^{m−1} P [Ai] + ∑_{i=m}^{∞} P [Ai] (2)
= ∑_{i=1}^{m−1} P [Bi] + ∑_{i=m}^{∞} P [Ai]. (3)
Now, we use Axiom 3 again on Am, Am+1, . . . to write
∑_{i=m}^{∞} P [Ai] = P [Am ∪ Am+1 ∪ · · ·] = P [Bm]. (4)
Thus, we have used just Axiom 3 to prove Theorem 1.4:
P [B1 ∪ B2 ∪ · · · ∪ Bm] = ∑_{i=1}^{m} P [Bi]. (5)
(a) To show P [φ] = 0, let B1 = S and let B2 = φ. Thus by Theorem 1.4,
P [S] = P [B1 ∪ B2] = P [B1] + P [B2] = P [S] + P [φ]. (6)
Thus, P [φ] = 0. Note that this proof uses only Theorem 1.4, which uses only Axiom 3.
(b) Using Theorem 1.4 with B1 = A and B2 = Ac, we have
P [S] = P [A ∪ Ac] = P [A] + P [Ac]. (7)
Since Axiom 2 says P [S] = 1, it follows that P [Ac] = 1 − P [A]. This proof uses Axioms 2 and 3.
(c) By Theorem 1.2, we can write both A and B as unions of disjoint events:
A = (AB) ∪ (ABc), B = (AB) ∪ (AcB). (8)
Now we apply Theorem 1.4 to write
P [A] = P [AB] + P [ABc], P [B] = P [AB] + P [AcB]. (9)
We can rewrite these facts as
P [ABc] = P [A] − P [AB], P [AcB] = P [B] − P [AB]. (10)
Note that so far we have used only Axiom 3. Finally, we observe that A ∪ B can be written as the union of mutually exclusive events
A ∪ B = (AB) ∪ (ABc) ∪ (AcB). (11)
Once again using Theorem 1.4, we have
P [A ∪ B] = P [AB] + P [ABc] + P [AcB]. (12)
Substituting the results of Equation (10) into Equation (12) yields
P [A ∪ B] = P [AB] + P [A] − P [AB] + P [B] − P [AB], (13)
which completes the proof. Note that this claim required only Axiom 3.
(d) Observe that since A ⊂ B, we can write B as the disjoint union B = A ∪ (AcB). By Theorem 1.4 (which uses Axiom 3),
P [B] = P [A] + P [AcB]. (14)
By Axiom 1, P [AcB] ≥ 0, which implies P [A] ≤ P [B]. This proof uses Axioms 1 and 3.
Each question requests a conditional probability.
(a) Note that the probability a call is brief is
P [B] = P [H0B] + P [H1B] + P [H2B] = 0.6. (1)
The probability a brief call will have no handoffs is
P [H0|B] = P [H0B] / P [B] = P [H0B] / 0.6. (2)
(b) The probability of one handoff is P [H1] = P [H1B] + P [H1L] = 0.2. The probability that a call with one handoff will be long is
P [L|H1] = P [H1L] / P [H1] = P [H1L] / 0.2. (3)
(c) The probability a call is long is P [L] = 1 − P [B] = 0.4. The probability that a long call will have one or more handoffs is
P [H1 ∪ H2|L] = (P [H1L] + P [H2L]) / P [L]. (4)
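The same three calculations can be scripted. The joint probabilities in the sketch below are assumed for illustration only (they are consistent with P [B] = 0.6 and P [H1] = 0.2 above, but the actual values come from the table in the problem statement):

```python
from fractions import Fraction as F

# ASSUMED joint table P[Hi, duration] -- illustrative values only.
P = {("H0", "B"): F(2, 5), ("H1", "B"): F(1, 10), ("H2", "B"): F(1, 10),
     ("H0", "L"): F(1, 10), ("H1", "L"): F(1, 10), ("H2", "L"): F(1, 5)}

P_B = sum(p for (h, d), p in P.items() if d == "B")
P_H1 = sum(p for (h, d), p in P.items() if h == "H1")
P_L = 1 - P_B

print(P["H0", "B"] / P_B)                   # (a) P[H0|B]
print(P["H1", "L"] / P_H1)                  # (b) P[L|H1]
print((P["H1", "L"] + P["H2", "L"]) / P_L)  # (c) P[H1 u H2 | L]
```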
The sample outcomes can be written ijk where the first card drawn is i, the second is j and the third is k. The sample space is
S = {234, 243, 324, 342, 423, 432}, (1)
and each of the six outcomes has probability 1/6. The events E1, E2, E3, O1, O2, O3 are
E1 = {234, 243, 423, 432}, O1 = {324, 342}, (2)
E2 = {243, 324, 342, 423}, O2 = {234, 432}, (3)
E3 = {234, 324, 342, 432}, O3 = {243, 423}. (4)
(a) The conditional probability the second card is even given that the first card is even is
P [E2|E1] = P [E1E2] / P [E1] = (2/6) / (4/6) = 1/2. (5)
(b) The conditional probability the first card is even given that the second card is even is
P [E1|E2] = P [E1E2] / P [E2] = (2/6) / (4/6) = 1/2. (6)
(c) The probability the first two cards are even given the third card is even is
P [E1E2|E3] = P [E1E2E3] / P [E3] = 0 / (4/6) = 0. (7)
(d) The conditional probability the second card is even given that the first card is odd is
P [E2|O1] = P [O1E2] / P [O1] = (2/6) / (2/6) = 1. (8)
(e) The conditional probability the second card is odd given that the first card is odd is
P [O2|O1] = P [O1O2] / P [O1] = 0 / (2/6) = 0. (9)
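Since the sample space has only six equiprobable points, all five answers can be confirmed by enumerating permutations. A Python sketch:

```python
from itertools import permutations

# The six equally likely orderings of cards 2, 3, 4.
S = list(permutations([2, 3, 4]))

def P(pred):
    return sum(1 for o in S if pred(o)) / len(S)

def given(pred, cond):
    # P[pred | cond] by counting outcomes.
    return P(lambda o: pred(o) and cond(o)) / P(cond)

E = lambda i: (lambda o: o[i] % 2 == 0)  # i-th card (0-indexed) is even
O = lambda i: (lambda o: o[i] % 2 == 1)  # i-th card is odd

print(given(E(1), E(0)))                           # (a) 0.5
print(given(E(0), E(1)))                           # (b) 0.5
print(given(lambda o: E(0)(o) and E(1)(o), E(2)))  # (c) 0.0
print(given(E(1), O(0)))                           # (d) 1.0
print(given(O(1), O(0)))                           # (e) 0.0
```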
The problem statement yields the obvious facts that P [L] = 0.16 and P [H] = 0.10. The words “10% of the ticks that had either Lyme disease or HGE carried both diseases” can be written as
P [LH|L ∪ H] = 0.10. (1)
(a) Since LH ⊂ L ∪ H,
P [LH|L ∪ H] = P [LH] / P [L ∪ H] = 0.10. (2)
Thus,
P [LH] = 0.10 P [L ∪ H] = 0.10 (P [L] + P [H] − P [LH]). (3)
Since P [L] = 0.16 and P [H] = 0.10,
P [LH] = 0.10 (0.16 + 0.10) / 1.10 = 0.0236. (4)
(b) The conditional probability that a tick has HGE given that it has Lyme disease is
P [H|L] = P [LH] / P [L] = 0.0236/0.16 = 0.1475. (5)
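The algebra in (3)-(5) amounts to solving a single linear equation; a short check (illustration only):

```python
# Solve P[LH] = 0.10 * (P[L] + P[H] - P[LH]) for P[LH].
PL, PH = 0.16, 0.10
PLH = 0.10 * (PL + PH) / 1.10

print(PLH, PLH / PL)  # ~0.0236 and ~0.1477, matching (4)-(5) up to rounding
```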
This problem asks whether A and B can be independent events yet satisfy A = B. By definition, events A and B are independent if and only if P [AB] = P [A] P [B]. We can see that if A = B, that is, they are the same set, then
P [AB] = P [AA] = P [A] = P [B]. (1)
Thus, for A and B to be the same set and also independent,
P [A] = P [AB] = P [A] P [B] = (P [A])^2. (2)
There are two ways that this requirement can be satisfied: either P [A] = 1, in which case A = B = S, or P [A] = 0, in which case A = B = φ.
In the Venn diagram, assume the sample space has area 1, corresponding to probability 1. As drawn, both A and B have area 1/4, so that P [A] = P [B] = 1/4. Moreover, the intersection AB has area 1/16 and covers 1/4 of A and 1/4 of B. That is, A and B are independent since
P [AB] = P [A] P [B]. (1)
(a) Since A and B are disjoint, P [A ∩ B] = 0. Thus
P [A ∪ B] = P [A] + P [B] − P [A ∩ B] = 3/8. (1)
A Venn diagram should convince you that A ⊂ Bc so that A ∩ Bc = A. This implies
P [A ∩ Bc] = P [A] = 1/4. (2)
It also follows that P [A ∪ Bc] = P [Bc] = 1 − 1/8 = 7/8.
(b) Events A and B are dependent since P [AB] = 0 ≠ P [A] P [B].
For a sample space S = {1, 2, 3, 4} with equiprobable outcomes, consider the events
A1 = {1, 2}, A2 = {2, 3}, A3 = {3, 1}. (1)
Each event Ai has probability 1/2. Moreover, each pair of events is independent since
P [A1A2] = P [A2A3] = P [A3A1] = 1/4. (2)
However, the three events A1, A2, A3 are not independent since
P [A1A2A3] = 0 ≠ 1/8 = P [A1] P [A2] P [A3]. (3)
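This counterexample is easy to verify mechanically. A sketch:

```python
from fractions import Fraction as F

# Pairwise-but-not-mutually independent events on S = {1, 2, 3, 4}.
S = {1, 2, 3, 4}
A = [{1, 2}, {2, 3}, {3, 1}]

def P(E):
    return F(len(E), len(S))

# Every pair is independent: P[AiAj] = 1/4 = (1/2)(1/2).
for i, j in [(0, 1), (1, 2), (2, 0)]:
    assert P(A[i] & A[j]) == P(A[i]) * P(A[j])

print(P(A[0] & A[1] & A[2]), P(A[0]) * P(A[1]) * P(A[2]))  # 0 vs 1/8
```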
There are 16 distinct equally likely outcomes for the second generation of pea plants based on a first generation of {rwyg, rwgy, wryg, wrgy}. They are listed below
rryy rryg rrgy rrgg
rwyy rwyg rwgy rwgg
wryy wryg wrgy wrgg
wwyy wwyg wwgy wwgg   (1)
A plant has yellow seeds, that is, event Y occurs, if the plant has at least one dominant y gene. Except for the four outcomes with a pair of recessive g genes, the remaining 12 outcomes have yellow seeds. From the above, we see that
P [Y ] = 12/16 = 3/4, (2)
and, by the same count applied to the dominant r gene,
P [R] = 12/16 = 3/4. (3)
To find the conditional probabilities P [R|Y ] and P [Y |R], we first must find P [RY ]. Note that RY, the event that a plant has rounded yellow seeds, is the set of outcomes
RY = {rryy, rryg, rrgy, rwyy, rwyg, rwgy, wryy, wryg, wrgy}. (4)
Since P [RY ] = 9/16,
P [Y |R] = P [RY ] / P [R] = (9/16) / (12/16) = 3/4, (5)
and
P [R|Y ] = P [RY ] / P [Y ] = (9/16) / (12/16) = 3/4. (6)
Thus P [R|Y ] = P [R] and P [Y |R] = P [Y ], so R and Y are independent events. There are four visibly different pea plants, corresponding to whether the peas are round (R) or not (Rc), and yellow (Y) or not (Yc). These four visible events have probabilities
P [RY ] = 9/16, P [RYc] = 3/16, (7)
P [RcY ] = 3/16, P [RcYc] = 1/16. (8)
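The independence of R and Y can likewise be confirmed by enumerating the 16 genotypes. A sketch:

```python
from fractions import Fraction as F
from itertools import product

# The 16 equally likely second-generation genotypes, e.g. 'rwyg'.
plants = ["".join(t) for t in product("rw", "rw", "yg", "yg")]

R = {p for p in plants if "r" in p}  # round: at least one dominant r
Y = {p for p in plants if "y" in p}  # yellow: at least one dominant y

P = lambda E: F(len(E), len(plants))

print(P(R), P(Y), P(R & Y))     # 3/4 3/4 9/16
print(P(R & Y) == P(R) * P(Y))  # True: R and Y are independent
```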
(a) For any events A and B, we can write the law of total probability in the form of
P [A] = P [AB] + P [ABc]. (1)
Since A and B are independent, P [AB] = P [A]P [B]. This implies
P [ABc] = P [A] − P [A] P [B] = P [A] (1 − P [B]) = P [A] P [Bc]. (2)
Thus A and Bc are independent.
(b) Proving that Ac and B are independent is not really necessary. Since A and B are arbitrary labels, it is really the same claim as in part (a). That is, simply reversing the labels of A and B proves the claim. Alternatively, one can construct exactly the same proof as in part (a) with the labels A and B reversed.
(c) To prove that Ac and Bc are independent, we apply the result of part (a) to the sets A and Bc. Since we know from part (a) that A and Bc are independent, part (b) says that Ac and Bc are independent.
[Figure: Venn diagram in which A, B, and C each have area 1/2, each pairwise intersection has area 1/4, and the triple intersection ABC has area 1/8]
In the Venn diagram above, assume the sample space has area 1, corresponding to probability 1. As drawn, A, B, and C each have area 1/2 and thus probability 1/2. Each pairwise intersection has area 1/4 = (1/2)(1/2), so each pair of events is independent, and the three-way intersection ABC has probability 1/8. Thus A, B, and C are mutually independent since
P [ABC] = P [A] P [B] P [C]. (1)
[Figure: Venn diagram in which A, B, and C each have area 1/3, each pairwise intersection (AB, BC, AC) has area 1/9, and the triple intersection ABC is empty]
In the Venn diagram above, assume the sample space has area 1, corresponding to probability 1. As drawn, A, B, and C each have area 1/3 and thus probability 1/3. The three-way intersection ABC has zero probability, implying A, B, and C are not mutually independent since
P [ABC] = 0 ≠ P [A] P [B] P [C]. (1)
However, AB, BC, and AC each have area 1/9. As a result, each pair of events is independent since
P [AB] = P [A] P [B], P [BC] = P [B] P [C], P [AC] = P [A] P [C]. (2)
[Tree diagram: the first free throw is good (G1) or bad (B1) with probability 1/2 each; given G1, the second throw is good (G2) with probability 3/4 and bad (B2) with probability 1/4; given B1, G2 has probability 1/4 and B2 has probability 3/4]
The game goes into overtime if exactly one free throw is made. This event has probability
P [O] = P [G1B2] + P [B1G2] = 1/8 + 1/8 = 1/4. (1)
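Tree probabilities like this one are just products along branches summed over paths. A sketch that recomputes P [O] from the branch probabilities read off the tree above:

```python
from fractions import Fraction as F

# Path probabilities: P[first] * P[second | first], as read off the tree.
paths = {("G1", "G2"): F(1, 2) * F(3, 4), ("G1", "B2"): F(1, 2) * F(1, 4),
         ("B1", "G2"): F(1, 2) * F(1, 4), ("B1", "B2"): F(1, 2) * F(3, 4)}

# Overtime: exactly one of the two free throws is good.
P_O = paths["G1", "B2"] + paths["B1", "G2"]
print(P_O)  # 1/4
```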
The tree for this experiment is
[Tree diagram: coin A or coin B is chosen with probability 1/2 each; coin A shows heads (H) with probability 1/4 and tails (T) with probability 3/4; coin B shows heads with probability 3/4 and tails with probability 1/4]
The probability that you guess correctly is
P [C] = P [AT] + P [BH] = 3/8 + 3/8 = 3/4. (1)
P [−|H] is the probability that a person who has HIV tests negative for the disease. This is referred to as a false-negative result. The case where a person who does not have HIV but tests positive for the disease is called a false-positive result and has probability P [+|Hc]. Since the test is correct 99% of the time,
P [−|H] = P [+|Hc] = 0.01. (1)
Now the probability that a person who has tested positive for HIV actually has the disease is
P [H|+] = P [+, H] / (P [+, H] + P [+, Hc]). (2)
We can use Bayes’ formula to evaluate these joint probabilities.
P [H|+] = P [+|H] P [H] / (P [+|H] P [H] + P [+|Hc] P [Hc]). (3)
Thus, even though the test is correct 99% of the time, the probability that a random person who tests positive actually has HIV is less than 0.02. The reason this probability is so low is that the a priori probability that a person has HIV is very small.
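Numerically, with a prior P [H] taken from the problem statement (the value 0.0002 is assumed here for illustration; it reproduces the "less than 0.02" posterior quoted above), Bayes' rule gives:

```python
# Bayes' rule for the 99%-accurate test.
p_H = 0.0002                    # ASSUMED prior probability of HIV
p_pos_H, p_pos_Hc = 0.99, 0.01  # P[+|H] and P[+|Hc]

posterior = p_pos_H * p_H / (p_pos_H * p_H + p_pos_Hc * (1 - p_H))
print(posterior)  # ~0.0194 < 0.02
```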
Let Ai and Di indicate whether the ith photodetector is acceptable or defective.
[Tree diagram: the first photodetector is acceptable (A1) with probability 3/5 and defective (D1) with probability 2/5; given A1, the second is acceptable (A2) with probability 4/5 and defective (D2) with probability 1/5; given D1, A2 has probability 2/5 and D2 has probability 3/5]
(a) We wish to find the probability P [E1] that exactly one photodetector is acceptable. From the tree, we have
P [E1] = P [A1D2] + P [D1A2] = 3/25 + 4/25 = 7/25. (1)
(b) The probability that both photodetectors are defective is P [D1D2] = 6/25.
The tree for this experiment is
[Tree diagram: coin A1 or B1 is chosen for the first flip with probability 1/2 each; the first flip shows H1 with probability 1/4 for coin A and 3/4 for coin B; the second flip uses the other coin, so H2 has probability 3/4 following A1 and 1/4 following B1]
The event H1H2 that heads occurs on both flips has probability
P [H1H2] = P [A1H1H2] + P [B1H1H2] = 6/32. (1)
The probability of H1 is
P [H1] = P [A1H1H2] + P [A1H1T2] + P [B1H1H2] + P [B1H1T2] = 1/2. (2)
Similarly,
P [H2] = P [A1H1H2] + P [A1T1H2] + P [B1H1H2] + P [B1T1H2] = 1/2. (3)
Thus P [H1H2] = 6/32 = 3/16 ≠ 1/4 = P [H1] P [H2], implying H1 and H2 are not independent. This result should not be surprising: if the first flip is heads, it is likely that coin B was picked first. In this case, the second flip is less likely to be heads since it becomes more likely that the second coin flipped was coin A.
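The dependence is transparent if we enumerate the eight tree paths. A sketch, using the branch probabilities from the tree above:

```python
from fractions import Fraction as F
from itertools import product

pH = {"A": F(1, 4), "B": F(3, 4)}  # P[heads] for each coin

# Pick a first coin (prob 1/2 each); the second flip uses the other coin.
paths = {}
for first in "AB":
    second = "B" if first == "A" else "A"
    for f1, f2 in product("HT", "HT"):
        p1 = pH[first] if f1 == "H" else 1 - pH[first]
        p2 = pH[second] if f2 == "H" else 1 - pH[second]
        paths[first, f1, f2] = F(1, 2) * p1 * p2

P_H1H2 = sum(p for (c, f1, f2), p in paths.items() if f1 == "H" and f2 == "H")
P_H1 = sum(p for (c, f1, f2), p in paths.items() if f1 == "H")
P_H2 = sum(p for (c, f1, f2), p in paths.items() if f2 == "H")

print(P_H1H2, P_H1 * P_H2)  # 3/16 vs 1/4: H1 and H2 are dependent
```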