
Independence of Events in Probability Theory, Study notes of Statistics

These notes introduce the concept of independent events in probability theory and provide examples to illustrate it. Independent events are those for which knowing the occurrence or non-occurrence of one event does not affect the probability of the other. The notes also discuss the distinction between independence and mutual exclusivity, and work through examples of systems of components, calculating probabilities using independence assumptions.


Events A and B are independent if knowing whether A occurred does not change the probability of B. Mathematically, we can say this in two equivalent ways:

P(B|A) = P(B)
P(A and B) = P(B ∩ A) = P(B) × P(A).

It is important to distinguish independence from being mutually exclusive, which would say B ∩ A is empty (cannot happen).

Example. Deal 2 cards from a deck.
A = first card is an Ace
C = second card is an Ace
P(C|A) = 3/51
P(C) = 4/52 (last class).
Since 3/51 ≠ 4/52, A and C are dependent.
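A quick way to see this numerically (not part of the original notes) is to simulate many two-card deals; this minimal Python sketch estimates P(C) and P(C|A) by counting:

```python
import random

# Estimate P(C) and P(C|A) by simulating two-card deals from a 52-card
# deck; cards are 0..51, rank is card % 13, and rank 0 stands for Ace.
random.seed(0)
trials = 200_000
n_A = n_C = n_A_and_C = 0
for _ in range(trials):
    first, second = random.sample(range(52), 2)  # deal 2 distinct cards
    A = first % 13 == 0       # first card is an Ace
    C = second % 13 == 0      # second card is an Ace
    n_A += A
    n_C += C
    n_A_and_C += A and C

print("P(C)   ~", n_C / trials)     # about 4/52 ~ 0.077
print("P(C|A) ~", n_A_and_C / n_A)  # about 3/51 ~ 0.059
```

The conditional estimate settles near 3/51 ≈ 0.059 while the unconditional one stays near 4/52 ≈ 0.077, matching the dependence claimed above.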

Example. Throw 2 dice.
A = first die lands 1
B = second die shows a larger number than the first die
C = both dice show the same number
P(B|A) = 5/6
P(B) = 15/36 by counting
so A and B are dependent.
P(C|A) = 1/6
P(C) = 6/36 = 1/6
so A and C are independent.
Note 1: here B and C are mutually exclusive.
Note 2: writing B′ = "second die shows a smaller number than the first die", we have
P(B′) = P(B) by symmetry
P(B ∪ B′) = P(C^c) = 1 − P(C) = 5/6
and since B and B′ are mutuallyexclusive, P(B) = (1/2) × 5/6 = 5/12, giving a "non-counting" argument that P(B) = 5/12.
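Because there are only 36 equally likely outcomes, the claims above can be checked exactly by enumeration; a small sketch (mine, not from the notes):

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two dice and check the
# probabilities claimed above exactly, using Fractions.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    hits = sum(1 for o in outcomes if event(o))
    return Fraction(hits, len(outcomes))

A = lambda o: o[0] == 1    # first die lands 1
B = lambda o: o[1] > o[0]  # second die larger than first
C = lambda o: o[0] == o[1] # both dice show the same number

P_A, P_B, P_C = prob(A), prob(B), prob(C)
P_AB = prob(lambda o: A(o) and B(o))
P_AC = prob(lambda o: A(o) and C(o))

print("P(B|A) =", P_AB / P_A, " P(B) =", P_B)  # 5/6 vs 5/12: dependent
print("P(C|A) =", P_AC / P_A, " P(C) =", P_C)  # 1/6 vs 1/6: independent
```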

(silly) Example. Throw 2 dice. If the sum is at least 7 I show you the dice; if not, I don't.
A = I show you the first die lands 1
B = I show you the second die lands 1
P(A) = 1/36, P(B) = 1/36, P(A ∩ B) = 0
(for A, the first die must land 1 with the sum at least 7, forcing the second die to be 6; hence the single outcome (1, 6))
so A and B are dependent.
Conceptual point. This illustrates a subtle point: being told by a truthful person that "A happened" is not (for probability/statistics purposes) exactly the same as "knowing A happened". [car accident example]
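The same enumeration idea verifies this example; note the events are defined to include the condition that the dice are shown (sum at least 7). A small sketch (not from the notes):

```python
from itertools import product

# Check the "show you the dice" example by enumeration: the events can
# only occur when the sum is at least 7, since otherwise nothing is shown.
outcomes = list(product(range(1, 7), repeat=2))
A = [(i, j) for i, j in outcomes if i + j >= 7 and i == 1]  # shown, first die is 1
B = [(i, j) for i, j in outcomes if i + j >= 7 and j == 1]  # shown, second die is 1

print(len(A), "/ 36")   # 1/36: only (1, 6)
print(len(B), "/ 36")   # 1/36: only (6, 1)
print(set(A) & set(B))  # empty, so P(A and B) = 0: dependent
```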

Systems of components. We will show logic diagrams: the system works if there is some path left-to-right which passes only through working components. Assume components work/fail independently, with

P(Ci works) = pi, P(Ci fails) = 1 − pi.

Note that in practice the independence assumption is usually unrealistic. Math question: calculate P(system works) in terms of the numbers pi and the network structure.
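For the two simplest structures the independence assumption gives closed forms: a series system works only if every component works, and a parallel system works if at least one does. A minimal sketch (my illustration, with made-up reliabilities pi):

```python
from math import prod

# Under independence: a series system works iff every component works;
# a parallel system works iff at least one component works.
def series(ps):
    return prod(ps)

def parallel(ps):
    return 1 - prod(1 - p for p in ps)

ps = [0.9, 0.8, 0.95]             # hypothetical component reliabilities
print("series:  ", series(ps))    # 0.9 * 0.8 * 0.95 = 0.684
print("parallel:", parallel(ps))  # 1 - 0.1 * 0.2 * 0.05 = 0.999
```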

More complicated example: [picture on board] We could write out all 16 combinations; instead let's condition on whether or not C1 works:

P(system works) = P(system works | C1 works) P(C1 works)
               + P(system works | C1 fails) P(C1 fails)

[continue on board]
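The board picture is not reproduced here, so as a stand-in, here is a sketch of the same conditioning technique on the classic five-component bridge network (an assumed network, chosen for illustration only): top path C2–C3, bottom path C4–C5, with C1 linking the midpoints. Given C1 works, the midpoints merge and the system needs (C2 or C4) and (C3 or C5); given C1 fails, it needs a complete path, (C2 and C3) or (C4 and C5).

```python
# Conditioning on the crossover component C1 in a hypothetical bridge
# network: top path C2-C3, bottom path C4-C5, C1 linking the midpoints.
# This illustrates the technique, not the network from the board.

def p_or(a, b):
    # P(at least one of two independent "works" events) by complements.
    return 1 - (1 - a) * (1 - b)

def bridge(p1, p2, p3, p4, p5):
    # If C1 works, the midpoints merge: need (C2 or C4) and (C3 or C5).
    works_given_c1 = p_or(p2, p4) * p_or(p3, p5)
    # If C1 fails, need a full path: (C2 and C3) or (C4 and C5).
    fails_given_c1 = p_or(p2 * p3, p4 * p5)
    return works_given_c1 * p1 + fails_given_c1 * (1 - p1)

print(bridge(0.9, 0.9, 0.9, 0.9, 0.9))  # ~0.978 with all p_i = 0.9
```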

Example: Deal 4 cards. What is the chance we get exactly one Spade? Write S for Spade and N for non-Spade:

event   1st  2nd  3rd  4th
F1       S    N    N    N
F2       N    S    N    N
F3       N    N    S    N
F4       N    N    N    S

[board: repeated conditioning]

P(F1) = 13/52 × 39/51 × 38/50 × 37/49 ≈ 0.11

By symmetry, P(F1) = P(F2) = P(F3) = P(F4), and the Fi are mutually exclusive, so

P(exactly one Spade) = P(F1 or F2 or F3 or F4) = P(F1) + P(F2) + P(F3) + P(F4) = 4 × P(F1) ≈ 44%.
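The arithmetic can be checked exactly (my sketch; the fractions come from repeated conditioning as above):

```python
from fractions import Fraction

# Exact probability of exactly one Spade in 4 dealt cards, by repeated
# conditioning: a Spade in one fixed position, non-Spades in the rest.
p_f1 = Fraction(13, 52) * Fraction(39, 51) * Fraction(38, 50) * Fraction(37, 49)
answer = 4 * p_f1                  # F1..F4 are mutually exclusive

print(answer, "~", float(answer))  # 9139/20825 ~ 0.4388, about 44%
```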

Bayes rule: updating probabilities as new information is acquired.

(silly) Example. There are 2 coins: one is fair, with P(Heads) = 1/2; one is biased, with P(Heads) = 9/10. Pick one coin at random. Toss it 3 times. Suppose we get 3 Heads. What then is the chance that the coin we picked is the biased coin?

Abstract set-up: a partition (B1, B2, ...) of "alternate possibilities". We know the prior probabilities P(Bi). Then we observe that some event A happens (the "new information") for which we know P(A|Bi). We want to calculate the posterior probabilities P(Bi|A).

Bayes formula:

P(Bi|A) = P(A|Bi) P(Bi) / [ P(A|B1) P(B1) + P(A|B2) P(B2) + ... ]
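Applying the formula to the two-coin example (the arithmetic is left to the board in the notes; this sketch fills it in, with A = "3 Heads in 3 tosses" and prior 1/2 on each coin):

```python
from fractions import Fraction

# Bayes formula for the two-coin example: prior 1/2 on each coin,
# A = "3 Heads in 3 independent tosses".
prior_fair, prior_biased = Fraction(1, 2), Fraction(1, 2)
lik_fair   = Fraction(1, 2) ** 3   # P(A | fair coin)   = 1/8
lik_biased = Fraction(9, 10) ** 3  # P(A | biased coin) = 729/1000

posterior_biased = (lik_biased * prior_biased) / (
    lik_fair * prior_fair + lik_biased * prior_biased
)
print(posterior_biased, "~", float(posterior_biased))  # 729/854 ~ 0.854
```

So after seeing 3 Heads, the chance we hold the biased coin rises from the prior 1/2 to about 0.85.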

[example above on board]