Quick Review of Linearity of Expectation
COMP 3711H - HKUST
Version of 24/11/2014
M. J. Golin


Linearity of Expectation is one of the simplest and most useful tools used in the analysis of randomized algorithms. In its easiest form it just says that, if X, Y are any two random variables (not necessarily independent), then E(X + Y) = E(X) + E(Y). The iterated version is that if X_1, X_2, ..., X_n are any random variables, then

E(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} E(X_i).

Example: Let Z be the value seen when rolling two dice. Z = X_1 + X_2 where X_i is the value seen when rolling single die i = 1, 2. It's easy to calculate that

E(X_i) = ∑_{j=1}^{6} j · Pr(X_i = j) = ∑_{j=1}^{6} j · (1/6) = 7/2.

Then

E(Z) = E(X_1 + X_2) = E(X_1) + E(X_2) = 7/2 + 7/2 = 7.
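The calculation above is easy to confirm numerically. Below is a minimal Monte Carlo sketch in Python; the function name `average_two_dice` is illustrative, not from the notes:

```python
import random

def average_two_dice(trials=100_000, seed=0):
    """Monte Carlo estimate of E(Z) for Z = X_1 + X_2, two fair dice."""
    rng = random.Random(seed)
    total = sum(rng.randint(1, 6) + rng.randint(1, 6) for _ in range(trials))
    return total / trials

# Linearity gives E(Z) = 7/2 + 7/2 = 7; the estimate should be close.
print(average_two_dice())
```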

When flipping n coins, what is the expected number of heads? Z = ∑_{i=1}^{n} X_i, where X_i = 1 if coin i is a head and 0 if it is a tail. Set Pr(X_i = 1) = p_i and Pr(X_i = 0) = 1 − p_i. Then X_i is a Bernoulli random variable with probability p_i.

Note that E(X_i) = 1 · Pr(X_i = 1) = p_i, so

E(Z) = ∑_{i=1}^{n} E(X_i) = ∑_{i=1}^{n} Pr(X_i = 1) = ∑_{i=1}^{n} p_i.

Examples:

p_i = p (all coins the same) ⇒ E(Z) = pn

p_i = 1/i ⇒ E(Z) = ∑_{i=1}^{n} p_i = ∑_{i=1}^{n} 1/i = H_n ∼ ln n
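Both the exact formula E(Z) = ∑ p_i and the p_i = 1/i special case can be checked with a short simulation. A sketch (function names are illustrative):

```python
import random

def expected_heads(ps):
    """Exact E(Z) by linearity: the sum of the individual head probabilities."""
    return sum(ps)

def simulate_heads(ps, trials=100_000, seed=0):
    """Monte Carlo estimate of the expected number of heads."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        total += sum(1 for p in ps if rng.random() < p)
    return total / trials

n = 50
ps = [1 / i for i in range(1, n + 1)]  # p_i = 1/i
exact = expected_heads(ps)             # H_50, roughly ln(50) + 0.577
print(exact, simulate_heads(ps))       # the two values should nearly agree
```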

Suppose you are flipping n coins, each with p_i = 1/2, i.e., fair coins. How many times does the pattern HHH appear?
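The notes pose this as an exercise; linearity handles it even though overlapping occurrences are not independent. Writing I_i for the indicator that flips i, i+1, i+2 are all heads gives E(I_i) = (1/2)^3 = 1/8 for each of the n − 2 starting positions, so the expected count is (n − 2)/8. A simulation sketch (names illustrative):

```python
import random

def count_hhh(flips):
    """Count (possibly overlapping) occurrences of the pattern HHH."""
    return sum(1 for i in range(len(flips) - 2)
               if flips[i] and flips[i + 1] and flips[i + 2])

def simulate_hhh(n, trials=200_000, seed=0):
    """Monte Carlo estimate of the expected number of HHH occurrences."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        flips = [rng.random() < 0.5 for _ in range(n)]
        total += count_hhh(flips)
    return total / trials

n = 20
# Linearity: one indicator per starting position, each with mean 1/8,
# so E = (n - 2)/8 despite the indicators being dependent.
print((n - 2) / 8, simulate_hhh(n))
```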

Suppose an algorithm's input is a permutation of n numbers. Let x_1, x_2, ..., x_n be the input in its given order.

x_i is a left-to-right maximum if it is bigger than x_1, x_2, ..., x_{i−1}. For example, the l.t.r. maxima of 5 4 7 8 1 6 3 2 are 5, 7, and 8; those of 1 3 5 7 2 4 6 8 are 1, 3, 5, 7, and 8.

Some algorithms' run times depend upon Z, the number of l.t.r. maxima. Assuming all n! permutations are equally likely, how can we find E(Z)?

Write Z = ∑_{i=1}^{n} X_i, where X_i = 1 iff x_i is a l.t.r. maximum and 0 otherwise.

One way of generating a random permutation is to first choose the first i items uniformly at random among all (n choose i) possible subsets, then pick one of the i! possible orderings of those items uniformly at random as x_1, ..., x_i, and finally randomly order the remaining items as x_{i+1}, ..., x_n.

Under this process, the probability that x_i is a l.t.r. maximum is the probability that it is the largest of the first i items, which is 1/i. So X_i is a Bernoulli random variable with p_i = 1/i, and by the coin-flipping calculation above, E(Z) = ∑_{i=1}^{n} 1/i = H_n ∼ ln n.
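The E(Z) = H_n conclusion is easy to check empirically by shuffling and counting. A sketch (function names are illustrative):

```python
import random

def ltr_maxima(perm):
    """Count left-to-right maxima: items larger than everything before them."""
    count, best = 0, float("-inf")
    for x in perm:
        if x > best:
            count, best = count + 1, x
    return count

def avg_ltr_maxima(n, trials=100_000, seed=0):
    """Average number of l.t.r. maxima over random permutations of n items."""
    rng = random.Random(seed)
    perm = list(range(n))
    total = 0
    for _ in range(trials):
        rng.shuffle(perm)
        total += ltr_maxima(perm)
    return total / trials

n = 30
H_n = sum(1 / i for i in range(1, n + 1))
print(H_n, avg_ltr_maxima(n))  # the two values should nearly agree
```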

Make a minor change from the previous page. Suppose we flip n coins, where the ith coin is Heads with probability p_i = 1/i. If the ith coin is Heads, you run another random process Y_i that tells you how much money you receive; all you know is that E(Y_i) = i. If the ith coin is Tails, you get nothing. What is the expected amount you receive?

Let Z be the total amount, so Z = ∑_{i=1}^{n} X_i Y_i, where X_i = 1 if the ith coin is Heads and 0 otherwise. Then E(X_i) = p_i and E(Y_i) = i, and because X_i and Y_i are independent,

E(X_i Y_i) = E(X_i) E(Y_i) = (1/i) · i = 1.

So

E(Z) = E(∑_{i=1}^{n} X_i Y_i) = ∑_{i=1}^{n} E(X_i Y_i) = ∑_{i=1}^{n} 1 = n.
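The E(Z) = n conclusion can also be simulated. The notes only specify E(Y_i) = i, not the distribution of Y_i, so the uniform choice below is an assumption made purely for illustration; any distribution with mean i gives the same answer:

```python
import random

def simulate_payout(n, trials=100_000, seed=0):
    """Monte Carlo estimate of E(Z) for Z = sum of X_i * Y_i, where
    Pr(X_i = 1) = 1/i and E(Y_i) = i.  Y_i is modeled (an assumption,
    not from the notes) as uniform on [0, 2i], which has mean i."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        z = 0.0
        for i in range(1, n + 1):
            if rng.random() < 1 / i:        # X_i = 1 with probability 1/i
                z += rng.uniform(0, 2 * i)  # Y_i, mean i
        total += z
    return total / trials

n = 10
print(n, simulate_payout(n))  # linearity predicts E(Z) = n
```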