Probability Tutorial
Probability is one of the most widely used mathematical tools, employed not only in characterizing the
statistics of experimental results, but in economic predictions, weather forecasts, sports analyses, medical
diagnoses, and political prognostication. The basic idea of assigning an event a number from 0 to 1 (or
0% to 100%), with 0 representing no chance of an event’s occurrence and 1 representing its absolute
certainty, is a numerical realization of basic intuitions about outcomes being likely or unlikely, decisions
being risky or safe, actions being bold or cautious.
However, as one of the most widely used mathematical tools, probability is also one of the most widely
misused. Failures in understanding the rules and assumptions underlying the assignment and manipulation
of probabilities have led to unjustified scientific claims, loss of life, and financial ruin. That probability
appears to dovetail so neatly with human intuition belies the fact that people’s intuitions are easily misled
into assigning probabilities that do not reflect frequencies of real events.
Conditions for Assigning Probabilities
Probability is properly employed when one can enumerate a set of all possible mutually exclusive
outcomes of some event. A coin is flipped; the outcome is heads or tails. The weather at noon tomorrow
on campus is sunny or rainy or neither sunny nor rainy. A six-sided die is rolled, and the result is 1, 3, 5,
or an even number. The “mutually exclusive” condition means that a result can fall into only one of the
enumerated categories – one cannot roll a die and get both a 3 and an even number. The “all possible”
condition means that a result must fall into one of the enumerated categories – there are no meteorological
conditions that are not sunny, rainy, or neither sunny nor rainy. Note that there are always many sets of
possibilities one can choose to categorize an event; one could classify the outcomes of a die roll as a
prime number or not a prime number, or the weather could be classified as snowing or not snowing.
Once an appropriate set has been enumerated, each outcome, say A, B, or C, can be assigned a probability
p(A), p(B), p(C), where each probability, p, is a number from 0 to 1. Given that the enumeration has listed
all possible mutually exclusive outcomes, the sum of the probabilities equals 1: p(A) + p(B) + p(C) = 1.
This equality is one of the fundamental axioms of probability:
The sum of the individual probabilities for a set of all possible mutually exclusive outcomes of an event
is 1.
While this mathematical axiom will prove useful, the individual probabilities, such as p(A), p(B), and
p(C), can only be specified by appealing to physical considerations.
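As a small added illustration (not part of the original tutorial), the normalization axiom can be checked numerically for any enumeration of outcomes. The outcome labels and probability values below are made-up examples; only the requirement that the probabilities of a complete set of mutually exclusive outcomes sum to 1 comes from the text.

```python
# Minimal sketch: check that an enumeration of all possible, mutually
# exclusive outcomes has probabilities summing to 1.
# The outcome labels and probability values are illustrative assumptions.
import math

weather = {"sunny": 0.30, "rainy": 0.20, "neither": 0.50}
die = {face: 1 / 6 for face in range(1, 7)}

for name, outcomes in [("weather", weather), ("die", die)]:
    total = sum(outcomes.values())
    # math.isclose guards against floating-point rounding (e.g. six 1/6's).
    assert math.isclose(total, 1.0), f"{name} probabilities sum to {total}, not 1"
    print(f"{name}: probabilities sum to {total:.6f}")
```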

Assigning Probabilities to Events
The consideration that has been used most successfully in assigning probabilities to events is the identification of a physical symmetry. The bilateral symmetry of a coin means that it has no preference for landing on one side or the other. Similar considerations apply to the rotational symmetry of a die*, and to the fact that facedown playing cards all look the same. The advantage of identifying a physical symmetry is that one can use it to enumerate a set of mutually exclusive outcomes for an event that should each have the same probability of occurring; they are “equiprobable”. A rolled six-sided die should have the same probability, p, for each of its sides rotating to the top. Since p(1) + p(2) + p(3) + p(4) + p(5) + p(6) = 6p = 1, we can calculate that p = 1/6. Finding a set of physically equiprobable outcomes, and using the fact that their probabilities sum to 1, is a reliable way to figure out the probability of the individual outcomes.
Of course, it is rare that we are fortunate enough to be able to identify a physical symmetry for classifying events. More often the physical considerations that enter into the probability of an event are too complicated to permit identifying a set of equiprobable outcomes. In these cases, the most common method for assigning probabilities is to look at the frequency of events. What is the chance of an earthquake of 5.0 or greater on the Richter scale occurring in New York City next year? Historically, only 2 such earthquakes have occurred in the 300 years or so of seismic record keeping, so the probability should be around 1/150. A common goal of scientific experiments is determining how often some event occurs. We could have determined that there is a 1/6 chance of a die rolling a 3 by repeatedly rolling a die and seeing how frequently 3 turns up; this process is a manifestation of scientific induction.
It is critical to realize that the method of assigning probabilities based on observed occurrence rates rests on the assumption of a temporal symmetry, namely that the past conditions that produced events with certain frequencies persist in the present and future. Given that seismic conditions change very slowly, using the frequency of earthquakes in the area for the past few hundred years to infer the probabilities of earthquakes today is a reasonable assumption with which to begin. However, physical conditions change; in the geological past, specific faults would have been considerably more active. An exceptionally common mistake in assigning probabilities is to use the frequency of past events as a guide for present probabilities when the conditions that produced the past frequency have changed.†

* Symmetries can be misleading: a loaded die is one with external rotational symmetry but an internally asymmetric density that causes certain sides to rotate to the top more frequently.
† In the middle of the last decade, large sums of money were lent to many people for the purpose of purchasing homes in the United States. Much of the money was lent under the assumption that no more than 10% of the loans would fail to be repaid, because historically, the default rate for U.S. housing loans had never exceeded 10%. Many lenders failed to realize that the criteria for qualifying for these loans had been relaxed to an unprecedented degree, and in some cases the default rate exceeded 50%. The false assumption of temporal symmetry with regard to the frequency of housing loan defaults was a central precipitator of a global economic crisis.
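A short simulation can make the frequency method concrete. The Python sketch below is an added illustration, not part of the original tutorial: it estimates p(3) for a fair six-sided die by counting how often a 3 turns up over many simulated rolls and compares the result with the symmetry-based value 1/6. The number of rolls is an arbitrary choice.

```python
# Sketch: estimate p(rolling a 3) by observed frequency and compare it
# with the symmetry-based assignment p = 1/6.
import random

n_rolls = 100_000  # assumed sample size for the frequency estimate
rolls = [random.randint(1, 6) for _ in range(n_rolls)]
freq_estimate = rolls.count(3) / n_rolls

print(f"frequency estimate of p(3): {freq_estimate:.4f}")
print(f"symmetry-based value:       {1 / 6:.4f}")
```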

The probability of two independent events both occurring, such as the first roll of a die turning up a 3, p(A), and the next roll turning up a 5, p(B), is the product of the two individual probabilities. To see this, consider that there are 6 equiprobable outcomes of the first roll and 6 equiprobable outcomes of the second roll, for a total of 36 equiprobable two-roll combinations, only one of which is a 3 followed by a 5: p(A & B) = p(A)p(B), or 1/36 = (1/6)(1/6). While it is possible to list all equiprobable two-dice rolls, this quickly becomes impractical as the number of independent events increases. But just as the independent probabilities for each event multiply, so does the number of equiprobable outcomes; e.g., for ten rolls there are 6^10 equiprobable outcomes, and the probability of rolling, say, 3, 5, 3, 5, 3, 5, 3, 5, 3, 5 is 1/6^10.
To determine the probability of either of two independent events occurring, we use the fact that (A or B) and ((not A) & (not B)) form a complete set of mutually exclusive outcomes, i.e. either A or B happens or neither of them happens, so that
p(A or B) + p((not A) & (not B)) = 1, or p(A or B) = 1 – p((not A) & (not B)). (Equation 1)
Since (not A) and (not B) are, like A and B, independent events, we also know
p((not A) & (not B)) = p(not A)p(not B). (Equation 2)
Finally, since A and (not A) are a complete set of mutually exclusive outcomes (as are B and (not B)), we have
p(A) + p(not A) = 1 and p(B) + p(not B) = 1. (Equation 3)
Combining Equations 1, 2, and 3, we find:
p(A or B) = 1 – p((not A) & (not B)) = 1 – p(not A)p(not B) = 1 – (1 – p(A))(1 – p(B))
This relation is somewhat complicated, but is frequently invoked in probability calculations. Consider, for example, equipment protected by three safety valves, each of which has a 1% chance of failing. If the failures were independent, one would calculate that there is a (0.01)(0.01)(0.01) = one-in-a-million chance that none of them work in an emergency. In reality it is likely that the conditions that cause one valve to fail, say a humid environment that rusts a valve shut, cause all three valves to fail, meaning the failure probabilities are not independent and the possibility of an explosion in an emergency is closer to 1% than one-in-a-million.
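To make the two-dice argument concrete, here is a small added sketch (an illustration, not from the tutorial) that enumerates all 36 equiprobable two-roll combinations, checks that exactly one is a 3 followed by a 5, and verifies the p(A or B) identity derived from Equations 1, 2, and 3 for A = "first roll is a 3" and B = "second roll is a 5".

```python
# Sketch: enumerate the 36 equiprobable two-roll outcomes and verify
# p(A & B) = p(A)p(B) and p(A or B) = 1 - (1 - p(A))(1 - p(B)).
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all 36 two-roll combinations
p_each = 1 / len(outcomes)                       # each combination is equiprobable

p_A = sum(p_each for (r1, r2) in outcomes if r1 == 3)                    # 1/6
p_B = sum(p_each for (r1, r2) in outcomes if r2 == 5)                    # 1/6
p_A_and_B = sum(p_each for (r1, r2) in outcomes if r1 == 3 and r2 == 5)  # 1/36
p_A_or_B = sum(p_each for (r1, r2) in outcomes if r1 == 3 or r2 == 5)    # 11/36

print(p_A_and_B, p_A * p_B)                  # both equal 1/36
print(p_A_or_B, 1 - (1 - p_A) * (1 - p_B))   # both equal 11/36
```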

Summarizing the rules for independent events: if A and B are independent, i.e. p(A|B) = p(A), then
p(A & B) = p(A)p(B) and p(A or B) = 1 – (1 – p(A))(1 – p(B)).
Again, these formulas can be extended to cases of three or more independent events.
The Probability of Rare Events
Our intuitive notions of the likelihood of events become less reliable when considering the probability of events that are very rare. Are you more likely to get struck by lightning or be bitten by a rabid animal? Both may be life-changing (perhaps life-ending) rare events, but which should you be more concerned about? If you were a public official, would you invest in preventing lightning strikes or rabid animal bites? To quantify the probability of rare events we use the same methods of physical symmetry or past frequency. However, quite often we have evaluated the probability of an event happening per day or per year, but want to know the chance of it happening over a longer period of time. On a random day your chance of being struck by lightning is approximately 3.5 × 10^-9, but what is the probability of being struck at least once in your lifetime?
Let us perform the calculation of the probability of a more cheerful rare event, the chance of winning the lottery. From physical symmetry considerations of how lottery numbers are chosen, one can calculate exactly that the chances of a $1 ticket winning the New York Mega Millions jackpot are 1 in 175,711,536, or p(lottery) = 5.7 × 10^-9. Playing once, it seems highly unlikely you will win, but suppose you play the twice-weekly game every week for the next 50 years, a total of around 5,000 attempts. Since each event, i.e. each lottery drawing, is independent, we can use the rule for independent probabilities to calculate the probability of winning at least once:
p(winning the lottery in 5,000 attempts) = 1 – (1 – p(lottery))(1 – p(lottery))(1 – p(lottery))... = 1 – (1 – p(lottery))^5,000 = 1 – (1 – 5.7 × 10^-9)^5,000 = 2.8 × 10^-5, or about a 1 in 35,000 chance.
Subtraction of small numbers and raising numbers to large powers are both cumbersome,§ but there is a useful approximation for calculating the probability of rare events. If instead of using the exact formula we had simply multiplied the number of attempts by the probability of winning on one attempt, we would have computed 5,000 × 5.7 × 10^-9 ≈ 2.9 × 10^-5, essentially the same answer.
§ Tip: don't round off your answer until you reach the end of the calculation. The first step of the above calculation is 1 – 5.7 × 10^-9 = 0.9999999943. If we had rounded this up to 1, the final answer for the probability would have been 1 – 1^5,000 = 0. Although 2.8 × 10^-5 is very small, it is distinctly different from zero.
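The exact calculation and the rare-event approximation can be compared directly. The sketch below is an added illustration that reuses the per-drawing probability and number of attempts quoted in the text; nothing else is assumed.

```python
# Sketch: probability of winning at least once in n independent attempts.
# Exact:        1 - (1 - p)^n
# Approximation: n * p   (reasonable when n * p << 1)

p_lottery = 5.7e-9   # per-drawing win probability quoted in the text
n_attempts = 5_000   # twice a week for about 50 years, as in the text

exact = 1 - (1 - p_lottery) ** n_attempts
approx = n_attempts * p_lottery

print(f"exact:       {exact:.3e}")    # about 2.8e-05, i.e. roughly 1 in 35,000
print(f"approximate: {approx:.3e}")   # about 2.9e-05, essentially the same
```

The approximation works here because n × p is tiny; for events that are not rare (n × p approaching or exceeding 1), only the exact formula gives a sensible probability.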