





An introduction to point estimates and confidence intervals for estimating population parameters such as mean, standard deviation, and proportion. It covers the concepts of point estimates, unbiased estimators, and interval estimates using examples. The document also discusses the concept of consistent estimators and the relationship between confidence levels and confidence intervals.
Lesson 24: Point Estimates of Parameters
How do we estimate population parameters such as a population mean, standard
deviation (SD), variance, or proportion?
Here is the overview from Lesson 1:
[Diagram: A Statistical Experiment — probability, descriptive statistics, and inferential statistics link the population and the sample.]
• Population [of interest], size N elements (or members). All adult Americans? All registered voters in California?
• Sample, size n elements (or members). For a poll? A scientific study? Are we testing products for quality control?
We now enter the realm of inferential statistics, the science of interpreting data.
What can a sample tell us about the population from which it is drawn?
If a single value is used to estimate a population parameter, we call that value a
point estimate for that parameter.
Let’s say we draw a sample from a population.
                      Mean   SD   Variance
Population (size N)    μ      σ      σ²
Sample (size n)        x̄      s      s²
We use sample statistics to estimate population parameters.
• We use the sample mean, x̄, as a point estimate for the population mean, μ.
• We use the sample standard deviation, s, as a point estimate for the population standard deviation, σ.
• We use the sample variance, s², as a point estimate for the population variance, σ².
Example 1 (Point Estimates)
A large lecture class takes a test. Five of the tests are randomly selected and
graded. The grader reports that x̄ = 75 points, s = 15 points, and s² = 225 square points.
a) What is a point estimate for the mean score for the entire class?
b) What is a point estimate for the standard deviation of the scores for the entire class?
c) What is a point estimate for the variance of the scores for the entire class?
§ Solution
a) What is a point estimate for the mean score for the entire class?
The point estimate is the sample mean, x̄ = 75 points.
b) What is a point estimate for the standard deviation of the scores for the entire class?
The point estimate is the sample standard deviation, s = 15 points.
c) What is a point estimate for the variance of the scores for the entire class?
The point estimate is the sample variance, s² = 225 square points.
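As a quick numerical sketch, the Python (NumPy) snippet below computes these three point estimates from a hypothetical sample of five scores; the scores are made up for illustration and are not the graded tests from the example.

```python
import numpy as np

# Hypothetical sample of five test scores (not the actual graded tests).
scores = np.array([60.0, 70.0, 75.0, 80.0, 90.0])

x_bar = scores.mean()       # point estimate for the population mean mu
s = scores.std(ddof=1)      # sample SD (ddof=1 divides by n - 1)
s_sq = scores.var(ddof=1)   # sample variance, computed with the n - 1 divisor

print(f"x-bar = {x_bar:.1f} points")
print(f"s     = {s:.2f} points")
print(f"s^2   = {s_sq:.2f} square points")
```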
A sample statistic is an unbiased estimator for a population parameter when the
expected value of the sample statistic is the value of the population parameter.
A key reason why we use the sample mean as a point estimate for the population
mean is that the sample mean is an unbiased estimator for the population mean:
E(X̄) = μ.
Imagine all possible samples of a given size drawn from the
class … and all of their sample means (values of X̄). The average of those
sample means would be the population mean, μ.
A biased estimator, by contrast, would tend to systematically overestimate or
underestimate the population parameter.
The sample variance is an unbiased estimator for the population variance.
The sample standard deviation is not an unbiased estimator for the population
standard deviation , but it is still good enough to use as a point estimate.
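A minimal simulation sketch (Python with NumPy; the population values, sample size, and number of trials are arbitrary choices) illustrates these claims: averaging many sample means recovers μ, averaging many sample variances (with the n − 1 divisor) recovers σ², while the average sample standard deviation falls slightly below σ.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, trials = 75.0, 15.0, 5, 200_000

# Draw many samples of size n from a normal population.
samples = rng.normal(mu, sigma, size=(trials, n))

mean_of_means = samples.mean(axis=1).mean()          # approx mu (unbiased)
mean_of_vars  = samples.var(axis=1, ddof=1).mean()   # approx sigma^2 (unbiased)
mean_of_sds   = samples.std(axis=1, ddof=1).mean()   # slightly below sigma (biased)

print(f"average x-bar: {mean_of_means:.2f}  (mu = {mu})")
print(f"average s^2:   {mean_of_vars:.1f}  (sigma^2 = {sigma**2})")
print(f"average s:     {mean_of_sds:.2f}  (sigma = {sigma})")
```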
• p denotes a population proportion. It is also the success probability per trial in a Bin(n, p) distribution.
• p̂ denotes a sample proportion. It is an unbiased estimator of p.
                      Proportion
Population (size N)      p
Sample (size n)          p̂
Calculating a Sample Proportion
A success is a property that we are interested in.
If n is a sample size , and if x is the number of successes in the sample,
then the sample proportion of successes is given by:
p̂ = x / n
If n is the number of trials in a binomial experiment , and if
x is the number of “successful” trials , then our point estimate for p ,
the success probability per trial in a Bin(n, p) distribution, is given by:
p̂ = x / n
This is the sample proportion of successes among the n trials.
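For instance, a short sketch of the computation in Python (the counts below are hypothetical, not from the text):

```python
# Hypothetical poll: x successes observed in a sample of n people (or n trials).
x = 412          # number of successes in the sample
n = 1000         # sample size, or number of binomial trials

p_hat = x / n    # sample proportion: point estimate for p
print(f"p-hat = {p_hat:.3f}")
```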
Lesson 25: Confidence Interval Estimates of Parameters
A sample often does not represent the population (or a probability) perfectly, and
this leads to sampling error.
A point estimate, typically a sample statistic such as the sample mean x̄, may or
may not equal the true value of the population parameter that is being estimated.
Example 1 (Sampling Error)
• The population mean score on a class exam could be 75 points,
while the sample mean for five randomly selected exams from that class
could be 74 points. That is, μ = 75 points, while x̄ = 74 points.
• If a coin is flipped repeatedly and p is the
probability that the coin comes up heads, then p = 0.5 for a fair coin, while
the sample proportion of heads in those flips could be p̂ = 0.51.
To account for sampling error, we will provide a range of values, called an interval
estimate for a population parameter, such as a population mean.
Example 2 (Interval Estimate for a Population Mean)
A lecture class takes an exam. We want an interval estimate for μ , the
population mean of exam scores in the class. A random sample of five
exams is selected and graded. Our interval estimate for μ could be the
interval (70 points, 78 points).
The point estimate x̄ is the midpoint of this interval.
It is the average of the lower limit and the upper limit:
x̄ = [(lower limit) + (upper limit)] / 2 = (70 + 78) / 2 = 74 points
Observe that the interval is symmetric about the point estimate
(here, x̄ = 74 points). This will not always be the case; we will use
asymmetric intervals when estimating standard deviations and variances.
If an interval estimate is symmetric about the point estimate, then we may write
the interval in terms of the point estimate and the margin of error.
Margin of Error of a Symmetric Interval Estimate
• The margin of error, E, is the distance between the
point estimate and either limit of the interval.
• E indicates how far the point estimate may plausibly lie from
the true value of the population parameter being estimated.
Example 3 (Margin of Error; Revisiting Example 2)
In Example 2, we estimated the population mean of exam scores in a class.
Our point estimate was the sample mean x̄ = 74 points. Our interval
estimate (70 points, 78 points) was symmetric about the point estimate.
There are three ways to calculate the margin of error E:
E = (upper limit) − x̄ = 78 − 74 = 4 points
E = x̄ − (lower limit) = 74 − 70 = 4 points
E = [(upper limit) − (lower limit)] / 2 = (78 − 70) / 2 = 4 points
The interval estimate (70 points, 78 points) for the population mean μ
can be written in terms of the sample mean x̄ and the margin of error E:
μ = x̄ ± E
μ = 74 ± 4 (in points)
We could informally say that we believe that the population mean is about
74 points, give or take 4 points.
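The arithmetic above translates directly into a few lines of Python; a minimal sketch using the limits from Example 2:

```python
lower, upper = 70.0, 78.0     # interval estimate, in points

x_bar = (lower + upper) / 2   # point estimate: midpoint of the interval
E1 = upper - x_bar            # margin of error, method 1
E2 = x_bar - lower            # method 2
E3 = (upper - lower) / 2      # method 3

print(f"x-bar = {x_bar} points; E = {E1} = {E2} = {E3} points")
print(f"mu is estimated as {x_bar} ± {E1} points")
```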
We believe that an interval estimate for a population parameter (such as the
population mean μ ) is likely to contain the value of that parameter.
If we attach a confidence level to the interval estimate, then we have a
confidence interval (“CI”) estimate for the parameter.
A confidence level is a probability that is often expressed as a percent.
- It is the probability that the confidence interval contains the true value of
the parameter.
- A 95% confidence level is the most common choice; it is often the
assumed confidence level in published studies and news reports.
Example 4 (Confidence Levels and Confidence Intervals)
Let μ be the population mean I.Q. score of American adults.
A psychology professor wants to estimate μ. The professor analyzes
a sample of American adult I.Q.s. The sample mean is x̄ = 101 points,
and a 95% confidence interval (CI) for μ is (98 points, 104 points).
Interpret this confidence interval (CI).
§ Solution
We are 95% confident that this interval contains the population mean I.Q.
score of American adults.
Strictly speaking, we should not say that there is a “95%
chance” that μ “falls in the confidence interval.” See the next page …
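The phrase “95% confident” can be unpacked with a simulation: if we repeatedly draw samples and build an interval from each one, about 95% of those intervals should contain μ. The sketch below (Python with NumPy/SciPy) uses a simple large-sample interval with σ assumed known; that particular interval formula, and all of the numbers, are assumptions made only for illustration and are not developed in this lesson.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
mu, sigma, n, trials = 100.0, 15.0, 40, 20_000
z = norm.ppf(0.975)                  # about 1.96 for a 95% confidence level

samples = rng.normal(mu, sigma, size=(trials, n))
x_bars = samples.mean(axis=1)
E = z * sigma / np.sqrt(n)           # margin of error (sigma assumed known)

covered = (x_bars - E <= mu) & (mu <= x_bars + E)
print(f"fraction of intervals containing mu: {covered.mean():.3f}")  # close to 0.95
```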
Lesson 26: z, t, and χ² Distributions
PART A: REVIEW OF STANDARD NORMAL ( z ) DISTRIBUTIONS
The standard normal ( z ) distribution is the normal distribution with mean 0 and
standard deviation 1 :
Z ~ N(μ = 0, σ = 1)
Properties of Standard Normal ( z ) Distributions
• The total area under the density curve
is 1. This is true for all probability density curves.
• The standard deviation, 1, is the distance between the mean, 0, and
either inflection point (“IP”); “IP”s are points where the density curve
changes from concave up (curving upward) to concave down (curving
downward), or vice-versa.
[Figures: a general normal distribution and the standard normal distribution.]
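Both properties can be checked numerically; here is a minimal sketch in Python with SciPy (the finite-difference step h is an arbitrary choice for approximating the second derivative):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Total area under the z density curve is 1.
area, _ = quad(norm.pdf, -np.inf, np.inf)
print(f"total area ≈ {area:.6f}")

# The density changes concavity at z = ±1, so each inflection point sits
# one standard deviation from the mean.
h = 1e-4
second_deriv = lambda z: (norm.pdf(z + h) - 2 * norm.pdf(z) + norm.pdf(z - h)) / h**2
print(second_deriv(0.9) < 0 < second_deriv(1.1))   # True: concave down before 1, up after
```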
PART B: t DISTRIBUTIONS
t distributions, also called Student’s t distributions, are similar to the standard
normal ( z ) distribution.
Similarities Between t and z Distributions
• Both are bell-shaped density curves, symmetric about 0, with total area 1.
Differences Between t and z Distributions
• A t density curve has thicker (heavier) tails than the
“ z curve.”
• The standard deviation of a t distribution is greater than 1, so values along
the horizontal axis are not as
meaningful as for a z distribution. (See Footnote 1.)
Degrees of Freedom (df) for a t Distribution
• There are different t distributions corresponding to different numbers
of degrees of freedom (df).
• When working with a sample of size n, we will use
the t distribution on (n − 1) df.
• As the number of df increases, the t distribution looks more like the z
distribution. The tails get thinner. The standard deviation σ decreases
and approaches 1, which is the standard deviation of the z distribution.
In the limit, the t distribution approaches
the z distribution.
[Figure: t density curves for several df, plotted together with the z curve.]
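A quick numerical sketch (Python with SciPy) of how t distributions tighten toward the z distribution as the df grow; the particular df values and the cutoff of 2 are arbitrary choices for illustration:

```python
from scipy.stats import t, norm

for df in (3, 5, 30, 1000):
    tail = t.sf(2, df)   # tail probability beyond 2: thicker t tails give larger values
    sd = t.std(df)       # standard deviation of the t distribution, approaching 1
    print(f"df = {df:5d}: P(T > 2) = {tail:.4f}, SD = {sd:.3f}")

print(f"z:          P(Z > 2) = {norm.sf(2):.4f}, SD = {norm.std():.3f}")
```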
PART C: χ² DISTRIBUTIONS
χ² distributions, written out as “chi-square distributions”, have both similarities
and key differences compared to the z and t distributions.
Similarities that χ² Distributions Have with z and t Distributions
• The number of degrees of freedom (df) for a χ² distribution has
some similarities to the number of df for a t distribution.
Degrees of Freedom (df) for a χ² Distribution
• There are different χ² distributions corresponding to different
numbers of degrees of freedom (df).
• When working with a sample of size n, we will use
the χ² distribution on (n − 1) df.
• As the number of df increases, the χ² distribution looks more like a
normal distribution, though not the standard normal z distribution
(which the t distributions could resemble).
• For a large number of df, a χ² distribution is often approximated
by a normal distribution, though not the z distribution.
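A brief sketch (Python with SciPy) comparing χ² distributions on increasing df with an approximating normal distribution; the approximating normal used here (mean df, standard deviation √(2·df)) and the chosen cutoffs are standard illustrative assumptions, not results from this lesson:

```python
from scipy.stats import chi2, norm

for df in (5, 50, 500):
    approx = norm(loc=df, scale=(2 * df) ** 0.5)   # normal with matching mean and SD
    cutoff = df + 2 * (2 * df) ** 0.5              # two "SDs" above the mean
    # Upper-tail probabilities agree more closely as df grows.
    print(f"df = {df:3d}: chi2 tail = {chi2.sf(cutoff, df):.4f}, "
          f"normal tail = {approx.sf(cutoff):.4f}")
```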