
Probability Theory: Understanding Probabilities, Sample Spaces, and Events in Statistics

An introduction to probability theory, explaining the concept of probability, sample spaces, and Keynes' distinction between probability and uncertainty. It covers the rules for probabilities and methods for determining the probability of an event, and introduces the classical method, counting rules, tree diagrams, and factorials, as well as combinations and permutations.



Econ 5

Introduction to Statistics

Asatar Bair, Ph.D.
Department of Economics
City College of San Francisco
abair@ccsf.edu

Lectures on Chapter 4

Probability Theory

Probability is the chance or likelihood of an event occurring; a probability is always between 0 and 1 inclusive; probabilities of 0 or 1 represent certainty; all numbers in between are uncertain, but an event with a probability of 0.9 is more likely to occur than an event with a probability of 0.1.

Sample Space

The sample space is the set of all the possible outcomes of a given experiment or situation; therefore there are defined limits to our uncertainty; we know what the range of possibilities is; if we do not have this information, we cannot calculate a probability.

Probability vs. Uncertainty

The British economist J.M. Keynes makes the distinction between probability and uncertainty in his 1921 essay, “A Treatise on Probability”. Keynes argues uncertainty is when a probability cannot be calculated; further, he argues this situation is typical of most important business decisions regarding investment.

Optional: J.M. Keynes (1936), The General Theory of Employment, Interest, and Money: “The outstanding fact is the extreme precariousness of the basis of knowledge on which our estimates of prospective yield have to be made. Our knowledge of the factors which will govern the yield of an investment some years hence is usually very slight and often negligible. If we speak frankly, we have to admit that our basis of knowledge for estimating the yield ten years hence of a railway, a copper mine, a textile factory, the goodwill of a patent medicine, an Atlantic liner, a building in the City of London amounts to little and sometimes to nothing; or even five years hence.”

Sample Space

The sample space for rolling a die is written: S = {1, 2, 3, 4, 5, 6}

Rules for probabilities

The probability of an event E is written P(E). There are two requirements for a probability: it must be between 0 and 1 inclusive; and the probabilities of all the outcomes in the sample space must sum to 1.

Determining the probability of an event

There are 3 methods of determining probability: the classical method: we assume all events are equally likely, so for n events in the sample space, the probability of a given event would be 1/n; the relative frequency method: we use the relative frequency of the event, calculated as f/n; and the subjective method: how likely one believes an event to be (again, on a scale of 0 to 1).
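As a concrete illustration (a sketch, not part of the original slides), the classical and relative frequency methods can be compared in Python for the roll of a fair die:

```python
import random
from fractions import Fraction

# Classical method: all outcomes in the sample space are assumed equally
# likely, so each face of a fair die has probability 1/n.
sample_space = [1, 2, 3, 4, 5, 6]
p_classical = Fraction(1, len(sample_space))
print("Classical P(roll a 3):", p_classical)        # 1/6

# Relative frequency method: estimate the probability as f/n, the fraction
# of n trials in which the event occurred.
random.seed(0)
n = 100_000
f = sum(1 for _ in range(n) if random.choice(sample_space) == 3)
print("Relative frequency P(roll a 3):", f / n)      # close to 1/6 ≈ 0.167
```

With a large number of trials, the relative frequency estimate settles near the classical value of 1/6.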

Mathematics: Factorials

Factorials are very helpful in probability theory; a factorial is a successive multiplication; the symbol used is “!”; e.g. 4! = 4*3*2*1 = 24; factorials increase in size very quickly; by convention, 0! = 1.

Combinations

The formula for combinations is used in a situation where the order is not important:

C(N, n) = N! / (n!(N - n)!)

e.g. The Ohio lottery uses 6 numbers from a group of 47. How many combinations are there?

C(47, 6) = 47! / (6!(47 - 6)!) = 47! / (6! * 41!) = (47*46*45*44*43*42) / (6*5*4*3*2*1) = 10,737,573

Permutations

The formula for permutations is used in a situation where the order is important; notice that the formula is similar to the one for combinations:

P(N, n) = N! / (N - n)!

e.g. you’re a thief trying to break a combination lock (it should be called a “permutation lock”); the lock has 5 digits that go from 0 to 9. How many permutations are there?

P(10, 5) = 10! / (10 - 5)! = 10! / 5! = 10*9*8*7*6 = 30,240

(Both counts are checked in the short Python sketch below.)

Events and complements

If P(A) is the probability of event A, we define P(Ac) as the probability that A does not occur, or the complement of A; by definition, P(A) + P(Ac) = 1, therefore 1 - P(Ac) = P(A) and 1 - P(A) = P(Ac).

Venn Diagram

A diagram that represents events and their complements; the area represents the probability; the total area is the sample space, P(S) = 1. [Diagram: event A and its complement Ac within the sample space S.]

The diagram can be used to describe many events. [Diagram: two events A and B within the same sample space.]
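To check the two counting examples above, a short Python sketch (assuming Python 3.8+, which provides math.comb and math.perm):

```python
import math

# Factorials grow very quickly; by convention 0! = 1.
print(math.factorial(4))    # 24
print(math.factorial(0))    # 1

# Ohio lottery: choose 6 numbers from a group of 47, order does not matter.
print(math.comb(47, 6))     # 10737573

# "Combination" lock: 5 digits drawn from 10, order matters; math.perm
# counts orderings without repetition, matching the formula N!/(N - n)!.
print(math.perm(10, 5))     # 30240
```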

Mutual Exclusivity

If two events have no sample points in common, they are mutually exclusive; this implies P(A ∩ B) = 0; in this case, P(A ∪ B) = P(A) + P(B). [Diagram: two non-overlapping events A and B in the sample space.]

Conditional probability

Events often depend on whether or not some other event has occurred. The probability of A given B is a conditional probability; it is written: P(A | B)

We can use a joint probability table to help us understand and calculate the conditional probability; the book uses the example of possible discrimination in promotion at a police department (pp. 162-163), defining M = event that an officer is a man; W = event that an officer is a woman; A = officer is promoted; Ac = officer is not promoted.

Promotion and gender of officers (counts)

                    Men    Women   Total
Promoted (A)        288     36      324
Not promoted (Ac)   672    204      876
Total               960    240     1200

Joint probability table (each count divided by 1200; the interior cells are joint probabilities, the row and column totals are marginal probabilities)

                    Men         Women       Total
Promoted (A)        288/1200    36/1200     324/1200
Not promoted (Ac)   672/1200    204/1200    876/1200
Total               960/1200    240/1200    1200/1200

In decimal form:

                    Men     Women   Total
Promoted (A)        0.24    0.03    0.27
Not promoted (Ac)   0.56    0.17    0.73
Total               0.80    0.20    1.00

In probability notation:

                    Men         Women       Total
Promoted (A)        P(M ∩ A)    P(W ∩ A)    P(A)
Not promoted (Ac)   P(M ∩ Ac)   P(W ∩ Ac)   P(Ac)
Total               P(M)        P(W)        1.00

Calculating conditional probability

We want the probability of being promoted given that the officer is a man, so we look at the number of male officers promoted (288) relative to the total number of male officers (960), which gives us: P(A | M) = 288/960 = 0.30
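The joint probability table and the conditional probability P(A | M) can be reproduced from the raw counts; a minimal Python sketch (the dictionary layout is just illustrative):

```python
# Counts from the promotion example above.
counts = {
    ("M", "A"): 288, ("W", "A"): 36,     # promoted
    ("M", "Ac"): 672, ("W", "Ac"): 204,  # not promoted
}
total = sum(counts.values())                      # 1200

# Joint probabilities: each cell divided by the grand total.
joint = {cell: n / total for cell, n in counts.items()}
print(joint[("M", "A")])                          # 0.24

# Marginal probabilities: sum the joint probabilities over one variable.
p_M = joint[("M", "A")] + joint[("M", "Ac")]      # 0.80
p_A = joint[("M", "A")] + joint[("W", "A")]       # 0.27

# Conditional probability of promotion given that the officer is a man:
# P(A | M) = P(M and A) / P(M) = 0.24 / 0.80 = 0.30
print(round(joint[("M", "A")] / p_M, 2))          # 0.3
```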

Law of Conditional Probability

P(A | B) = P(A ∩ B) / P(B)

Multiplication Law

P(A ∩ B) = P(B)*P(A | B)   or   P(A ∩ B) = P(A)*P(B | A)

The multiplication law is derived from the law of conditional probability.

Independent events

Events are independent if: P(A | B) = P(A) or P(B | A) = P(B)
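Using the marginal and joint probabilities from the table above, the multiplication law and the independence condition can be checked directly (a sketch, not from the slides):

```python
# Marginal and joint probabilities taken from the joint probability table.
p_A, p_M = 0.27, 0.80
p_M_and_A = 0.24

# Law of conditional probability: P(A | M) = P(M and A) / P(M)
p_A_given_M = p_M_and_A / p_M
print(round(p_A_given_M, 2))            # 0.3

# Multiplication law: P(M and A) = P(M) * P(A | M)
print(round(p_M * p_A_given_M, 2))      # 0.24

# Independence would require P(A | M) = P(A); here 0.30 != 0.27, so
# promotion and gender are not independent in these data.
print(abs(p_A_given_M - p_A) < 1e-9)    # False
```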

Bayes’ Theorem

[Portrait: Reverend Thomas Bayes]

The theorem has to do with how new information may be used to update our probabilities; this creates two categories: prior probabilities, which we form based on our old knowledge; and posterior probabilities, which we form after incorporating the new information. Not all mathematicians and statisticians subscribe to the Bayesian approach; it has created different camps, Bayesians and non-Bayesians.

Probability revision

Prior probabilities + new information → (apply Bayes’ Theorem) → posterior probabilities

Example: Part Suppliers

Say a firm receives parts from 2 different suppliers. A1 is the event that the part is from supplier 1 and A2 is the event that the part is from supplier 2. P(A1) = 0.65 and P(A2) = 0.35.

Tree Diagram

Step 1: the supplier (A1 or A2); Step 2: the condition of the part (G = good, B = bad). The experimental outcomes are (A1, G), (A1, B), (A2, G), (A2, B).
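The four experimental outcomes of the tree can be enumerated mechanically; a tiny sketch:

```python
from itertools import product

# Step 1: the supplier (A1 or A2); Step 2: the part is good (G) or bad (B).
outcomes = list(product(["A1", "A2"], ["G", "B"]))
print(outcomes)   # [('A1', 'G'), ('A1', 'B'), ('A2', 'G'), ('A2', 'B')]
```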

Bayes’ Theorem (2 event case)

P(A1 | B) = P(A1)*P(B | A1) / [P(A1)*P(B | A1) + P(A2)*P(B | A2)]

P(A2 | B) = P(A2)*P(B | A2) / [P(A1)*P(B | A1) + P(A2)*P(B | A2)]

Applying Bayes’ Theorem:

P(A1 | B) = P(A1)*P(B | A1) / [P(A1)*P(B | A1) + P(A2)*P(B | A2)]
          = (0.65)(0.02) / [(0.65)(0.02) + (0.35)(0.05)]
          = 0.0130 / 0.0305 = 0.4262

Applying Bayes’ Theorem:

P(A2 | B) = P(A2)*P(B | A2) / [P(A1)*P(B | A1) + P(A2)*P(B | A2)]
          = (0.35)(0.05) / [(0.65)(0.02) + (0.35)(0.05)]
          = 0.0175 / 0.0305 = 0.5738
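The same posterior probabilities can be computed in a few lines of Python (a sketch using the numbers from the example):

```python
# Priors for the two suppliers and the probability that a part is bad (B)
# given each supplier.
p_A1, p_A2 = 0.65, 0.35
p_B_given_A1, p_B_given_A2 = 0.02, 0.05

# Total probability that a randomly chosen part is bad.
p_B = p_A1 * p_B_given_A1 + p_A2 * p_B_given_A2
print(round(p_B, 4))                          # 0.0305

# Posterior probabilities via Bayes' Theorem.
print(round(p_A1 * p_B_given_A1 / p_B, 4))    # 0.4262
print(round(p_A2 * p_B_given_A2 / p_B, 4))    # 0.5738
```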

Result: an updated probability

We started out thinking the probability that a part came from supplier 1 was 0.65; but bad parts are 2.5 times more likely to come from supplier 2 (0.05 vs. 0.02), so once we know a part is bad, we can update the probability that it came from supplier 1 down to 0.4262, i.e. P(A1 | B) = 0.4262.

An easy way to apply Bayes’ Theorem is to use a table.

Event Ai   Prior P(Ai)   Conditional P(B | Ai)   Joint P(Ai ∩ B)   Posterior P(Ai | B)
A1         0.65          0.02                    0.0130            0.0130/0.0305 = 0.4262
A2         0.35          0.05                    0.0175            0.0175/0.0305 = 0.5738
Total      1.00                                  P(B) = 0.0305     1.0000
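The tabular procedure generalizes to any number of events; a minimal sketch (the function name bayes_table is just illustrative):

```python
def bayes_table(priors, conditionals):
    """Given priors P(Ai) and conditionals P(B | Ai), return the joint
    probabilities P(Ai and B), the total P(B), and the posteriors P(Ai | B)."""
    joints = [p * c for p, c in zip(priors, conditionals)]
    p_b = sum(joints)                        # total probability of B
    posteriors = [j / p_b for j in joints]
    return joints, p_b, posteriors

joints, p_b, posteriors = bayes_table([0.65, 0.35], [0.02, 0.05])
print([round(j, 4) for j in joints])         # [0.013, 0.0175]
print(round(p_b, 4))                         # 0.0305
print([round(p, 4) for p in posteriors])     # [0.4262, 0.5738]
```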