




EE 5375/7375 Random Processes September 2, 2003
Homework #1 Solutions
Problem 1. textbook problem 2. In how many ways can 10 students occupy 10 desks? 12 desks?
Imagine the sequence of students selecting desks. When there are 10 desks, the first student has 10 choices, the second student 9 choices, and so on. Therefore 10 students can fill 10 desks in 10! = 3,628,800 ways. If there are 12 desks, the first student has 12 choices, the second student has 11 choices, and so on. Therefore 10 students can fill 12 desks in 12 × 11 × · · · × 3 = 239,500,800 ways.
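These counts can be checked quickly in Matlab (the same tool used in Problem 15):
factorial(10)      % 10! = 3628800
prod(12:-1:3)      % 12*11*...*3 = 239500800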
Problem 2. textbook problem 2. You win a lottery if you correctly predict the numbers of 6 balls drawn from an urn containing balls numbered 1, 2,..., 49, without replacement and without regard to ordering. What is the probability of winning if you buy one ticket?
There are (49 choose 6) = 13,983,816 ways of choosing 6 balls randomly out of 49 balls. Each of these ways is equally likely, so the chance of winning the lottery with one ticket is 1/13,983,816 ≈ 7.2 × 10^−8.
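As a quick Matlab check:
nchoosek(49, 6)        % number of possible 6-ball draws = 13983816
1/nchoosek(49, 6)      % probability of winning with one ticket, about 7.2e-8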
Problem 3. textbook problem 2. In each lot of 100 items, 2 items are tested, and the lot is rejected if either of the tested items is found defective. (a) Find the probability of accepting a lot with 5 defective items. Repeat for 10 defective items. (b) Recompute the probabilities in part (a) if 3 items are tested, and a lot is accepted when at most 1 of the 3 tested items is found defective.
(a) Define A1 as the event that the first item is non-defective and A2 as the event that the second item is non-defective.
P(lot accepted) = P(A1 ∩ A2) = P(A2|A1)P(A1)
With 5 defective items and 95 non-defective items in the lot, P(A1) = 95/100. Given that the first item is non-defective, there are only 94 non-defective items among the 99 items left in the lot for the second test. Hence
P(A2|A1) = 94/99
Combining,
P(lot accepted) = (94/99)(95/100) ≈ 0.902
If there are 10 defective and 90 non-defective items in the lot, similar reasoning leads to
P(lot accepted) = P(A1 ∩ A2) = P(A2|A1)P(A1) = (89/99)(90/100) ≈ 0.809
(b) Define the additional event A3 as the event that the third item is non-defective. If a lot is accepted when at most 1 of the 3 tested items is defective, then the lot is accepted in any of 4 events: (A1 ∩ A2 ∩ A3) or (A1^c ∩ A2 ∩ A3) or (A1 ∩ A2^c ∩ A3) or (A1 ∩ A2 ∩ A3^c).
P(lot accepted) = P(A1 ∩ A2 ∩ A3) + P(A1^c ∩ A2 ∩ A3) + P(A1 ∩ A2^c ∩ A3) + P(A1 ∩ A2 ∩ A3^c)
We can use conditional probabilities to write
P(A1 ∩ A2 ∩ A3) = P(A3|A1 ∩ A2)P(A2|A1)P(A1)
Similarly,
P(A1^c ∩ A2 ∩ A3) = P(A3|A1^c ∩ A2)P(A2|A1^c)P(A1^c)
P(A1 ∩ A2^c ∩ A3) = P(A3|A1 ∩ A2^c)P(A2^c|A1)P(A1)
P(A1 ∩ A2 ∩ A3^c) = P(A3^c|A1 ∩ A2)P(A2|A1)P(A1)
With 5 defective items and 95 non-defective items in the lot,
P(A1) = 95/100, P(A2|A1) = 94/99, P(A3|A1 ∩ A2) = 93/98
Filling in the other probabilities in a similar way, each of the three events with exactly one defective tested item has probability (5)(95)(94)/((100)(99)(98)), so
P(lot accepted) = (95)(94)(93)/((100)(99)(98)) + 3 × (5)(95)(94)/((100)(99)(98)) ≈ 0.856 + 0.138 ≈ 0.994
If there are 10 defective and 90 non-defective items in the lot,
P(lot accepted) = (90)(89)(88)/((100)(99)(98)) + 3 × (10)(90)(89)/((100)(99)(98)) ≈ 0.727 + 0.248 ≈ 0.974
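Both parts can be verified with a minimal Matlab check using the counting arguments above:
% Part (a): accept iff both tested items are non-defective
(95/100)*(94/99)                         % 5 defective:  about 0.902
(90/100)*(89/99)                         % 10 defective: about 0.809
% Part (b): accept iff at most 1 of the 3 tested items is defective
(95*94*93 + 3*5*95*94)/(100*99*98)       % 5 defective:  about 0.994
(90*89*88 + 3*10*90*89)/(100*99*98)      % 10 defective: about 0.974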
Problem 4. textbook problem 2. A nonsymmetric binary communications channel is shown in Fig. P2.2. Assume the inputs are equiprobable. (a) Find the probability that the output is 0. (b) Find the probability that the input was 0 given that the output is a 1. Find the probability that the input is 1 given that the output is a 1. Which input is more probable?
(a) We can use the theorem on total probability to write
P(Y = 0) = P(Y = 0|X = 0)P(X = 0) + P(Y = 0|X = 1)P(X = 1) = (1 − ε1)(1/2) + ε2(1/2) = (1 − ε1 + ε2)/2
(b) We can use Bayes' rule:
P(X = 0|Y = 1) = P(Y = 1|X = 0)P(X = 0)/P(Y = 1) = (1/2)ε1 / [1 − (1/2)(1 − ε1) − (1/2)ε2] = ε1/(1 + ε1 − ε2)
P(X = 1|Y = 1) = P(Y = 1|X = 1)P(X = 1)/P(Y = 1) = (1/2)(1 − ε2) / [1 − (1/2)(1 − ε1) − (1/2)ε2] = (1 − ε2)/(1 + ε1 − ε2)
Input 1 is the more probable input given that the output is 1 whenever 1 − ε2 > ε1, i.e., whenever ε1 + ε2 < 1.
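A quick numerical check in Matlab, using assumed crossover probabilities ε1 = 0.1 and ε2 = 0.2 (chosen only for illustration):
e1 = 0.1; e2 = 0.2;                    % assumed crossover probabilities
pY0 = 0.5*(1 - e1) + 0.5*e2            % P(Y = 0) = (1 - e1 + e2)/2
pX0_Y1 = 0.5*e1 / (1 - pY0)            % P(X = 0 | Y = 1) = e1/(1 + e1 - e2)
pX1_Y1 = 0.5*(1 - e2) / (1 - pY0)      % P(X = 1 | Y = 1) = (1 - e2)/(1 + e1 - e2)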
Suppose that the information sequence is produced by a sequence of independent Bernoulli trials with P(one) = P(success) = p. (a) Find the probability of runlength k in the m = 3 case. (b) Find the probability of runlength k for general m.
(a) A runlength of k corresponds to k zeros followed by a one.
P(k = 0) = p
P(k = 1) = (1 − p)p
P(k = 2) = (1 − p)^2 p
P(k = 3) = 1 − P(k = 0) − P(k = 1) − P(k = 2) = (1 − p)^3
(b) For general m,
P(k) = (1 − p)^k p   for 0 ≤ k < m
P(m) = 1 − Σ_{k=0}^{m−1} P(k) = 1 − Σ_{k=0}^{m−1} (1 − p)^k p = 1 − p [1 − (1 − p)^m] / [1 − (1 − p)] = (1 − p)^m
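A short Matlab sketch, using assumed values p = 0.3 and m = 3 (illustration only), confirms that the pmf sums to one and that the last mass equals (1 − p)^m:
p = 0.3; m = 3;                    % assumed values for illustration
pk = (1-p).^(0:m-1) * p;           % P(k) for 0 <= k < m
pm = (1-p)^m;                      % P(m), the truncated tail
sum([pk pm])                       % should equal 1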
Problem 9. textbook problem 2. Suppose that in Example 2.40, computer A sends each message to computer B simultaneously over two unreliable telephone lines. Computer B can detect when errors have occurred in either line. Let the probability of message transmission error in line 1 and line 2 be q1 and q2, respectively. Computer B requests retransmissions until it receives an error-free message on either line. (a) Find the probability that more than k transmissions are required. (b) Find the probability that in the last transmission, the message on line 2 is received free of errors.
(a) The probability that both messages on the two lines are errored is q1q2, so the probability that at least one message on the two lines is error-free is (1 − q1q2). The probability that exactly j transmissions are required is the probability of j − 1 errored transmissions followed by a transmission with at least one error-free message:
P(j transmissions) = (q1q2)^(j−1) (1 − q1q2)
For geometric probabilities, we have the general result that
P(more than k transmissions) = (q1q2)^k
(b) Suppose m transmissions are required; then the conditional probability that line 2 was error-free is
P(line 2 error-free | m transmissions) = P(line 2 error-free and m transmissions) / P(m transmissions) = (q1q2)^(m−1)(1 − q2) / [(q1q2)^(m−1)(1 − q1q2)] = (1 − q2)/(1 − q1q2)
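A small Matlab simulation, with assumed error probabilities q1 = 0.2 and q2 = 0.3 (illustration only), can be compared against both formulas:
q1 = 0.2; q2 = 0.3; k = 3; N = 1e5;
ntx = zeros(N,1); line2ok = zeros(N,1);
for i = 1:N
    j = 0; both_errored = true;
    while both_errored
        j = j + 1;
        e1 = rand < q1; e2 = rand < q2;      % error indicators on line 1, line 2
        both_errored = e1 && e2;             % retransmit only if both lines errored
    end
    ntx(i) = j; line2ok(i) = ~e2;
end
[mean(ntx > k), (q1*q2)^k]                   % P(more than k transmissions)
[mean(line2ok), (1-q2)/(1-q1*q2)]            % P(line 2 error-free on last transmission)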
Problem 10. textbook problem 3.
An information source produces symbols at random from a 5-letter alphabet: S = {a, b, c, d, e}. The probabilities of the symbols are
p(a) = , p(b) = , p(c) = , p(d) = p(e) =
A data compression system encodes the letters into binary strings as follows:
a → 1 b → 01 c → 001 d → 0001 e → 0000
Let the random variable Y be equal to the length of the binary string output by the system. Specify the sample space of Y , SY , and the probabilities of its values.
Clearly, Y can take values from the discrete set SY = {1, 2, 3, 4}.
P(Y = 1) = P({a}) = p(a)
P(Y = 2) = P({b}) = p(b)
P(Y = 3) = P({c}) = p(c)
P(Y = 4) = P({d, e}) = p(d) + p(e)
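A Matlab sketch using assumed symbol probabilities (p(a) = 1/2, p(b) = 1/4, p(c) = 1/8, p(d) = p(e) = 1/16, an assumption chosen only to illustrate the computation) shows how the pmf of Y follows from the encoding:
ps   = [1/2 1/4 1/8 1/16 1/16];   % assumed p(a), p(b), p(c), p(d), p(e)
lens = [1 2 3 4 4];               % codeword lengths for a, b, c, d, e
pY = zeros(1,4);
for y = 1:4
    pY(y) = sum(ps(lens == y));   % P(Y = y) sums the symbols encoded with y bits
end
pY                                % pmf of Y on SY = {1, 2, 3, 4}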
Problem 11. textbook problem 3. Let X be a binomial random variable that results from the performance of n Bernoulli trials with probability of success p. (a) Suppose that X = 1. Find the probability that the single event occurred in the kth Bernoulli trial. (b) Suppose that X = 2. Find the probability that the two events occurred in the jth and kth Bernoulli trials, where j < k. (c) In light of your answers to parts (a) and (b), in what sense are successes distributed “completely at random” over the n Bernoulli trials?
(a) Given that X = 1, there are n possibilities for where the single success occurred: in the 1st trial, the 2nd trial, and so on up to the nth trial. Each of these possibilities is equally likely. Therefore the conditional probability that the single success occurred in the kth trial is 1/n.
(b) There are (n choose 2) possibilities that include exactly 2 successes. Each of these is equally likely. Therefore the conditional probability that the two successes occurred in the jth and kth trials is 1/(n choose 2).
(c) Given X = k, i.e., k successes, there are (n choose k) possible arrangements of the successes over the n trials, and each is equally likely, with conditional probability 1/(n choose k). In other words, once the number of successes is known, every placement of their locations among the n trials is equally likely, so the successes are spread “completely at random” over the trials.
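A Monte Carlo sketch in Matlab, with assumed n = 5 and p = 0.3 (illustration only), checks that the single success location is uniform given X = 1:
n = 5; p = 0.3; N = 1e5;
trials = rand(N, n) < p;                 % N runs of n Bernoulli trials
one = sum(trials, 2) == 1;               % runs with exactly one success
[~, loc] = max(trials(one, :), [], 2);   % position of the single success in each such run
histc(loc, 1:n)' / nnz(one)              % each entry should be close to 1/n = 0.2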
Problem 12. textbook problem 3.33 (optional for EE 5375) Let X be a binomial random variable. (a) Show that
pk/pk−1 = (n − k + 1)p / [k(1 − p)] = 1 + [(n + 1)p − k] / [k(1 − p)]
(b) Show that part (a) implies that (1) P(X = k) is maximum at kmax = [(n + 1)p], where [x] denotes the largest integer that is smaller than or equal to x; and (2) when (n + 1)p is an integer, then the maximum is achieved at kmax and kmax − 1.
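A numerical check of the ratio and the location of the maximum in Matlab, with assumed n = 10 and p = 0.3 (illustration only):
n = 10; p = 0.3;
pk = arrayfun(@(k) nchoosek(n,k) * p^k * (1-p)^(n-k), 0:n);   % binomial pmf, k = 0..n
k = 1:n;
ratio = pk(2:end) ./ pk(1:end-1);              % p_k / p_{k-1}
formula = (n - k + 1)*p ./ (k*(1 - p));        % expression in part (a)
max(abs(ratio - formula))                      % should be essentially zero
[~, imax] = max(pk); imax - 1                  % maximizing k; compare with floor((n+1)*p) = 3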
Problem 15. Matlab (optional for EE 5375) Log into one of the SEAS computers and start Matlab (a general purpose mathematical analysis software package). After Matlab starts up, it will present a prompt where you can type input (followed by a carriage return). To get on-line information about any particular command, type “help command.” At the prompt, try to get information about a uniform random number generator rand by typing:
help rand
Rand is used to generate numbers uniformly distributed between 0 and 1. Next, try to generate 100 uniform numbers by typing:
x=rand(100,1);
The arguments (100,1) tell rand to generate a vector of 100 random numbers. These are assigned to the vector x (the semicolon tells Matlab not to display the numbers on the screen). Look at the histogram of the numbers by typing:
hist(x,10)
This histogram should appear to be roughly flat because the numbers should be uniformly distributed. If you want to see what the function hist does, type:
help hist
Next, get information about the function binornd by typing:
help binornd
Binornd is a function to generate binomially distributed random numbers. Generate some binomial(5,0.5) numbers by typing:
x=binornd(5,0.5,500,1);
hist(x,10)
Notice the shape of the histogram. Generate some binomial(50,0.5) numbers and then some binomial(500,0.5) numbers by typing:
x=binornd(50,0.5,500,1);
hist(x,10)
x=binornd(500,0.5,500,1);
hist(x,10)
Does the last histogram look roughly bell shaped?
The binomial distribution should look approximately Gaussian (bell shaped) for large n by the DeMoivre-Laplace theorem.
Fig. P2.2: nonsymmetric binary channel. Input 0 produces output 0 with probability 1 − ε1 and output 1 with probability ε1; input 1 produces output 1 with probability 1 − ε2 and output 0 with probability ε2.