Transcript Chap006

Dr. Ka-fu Wong
ECON1003
Analysis of Economic Data
Ka-fu Wong © 2003
Chapter Six
Sampling Methods and the Central Limit
Theorem
GOALS
 Explain why a sample is the only feasible way to learn about a population.
 Describe methods to select a sample.
 Define and construct a sampling distribution of the sample mean.
 Explain the central limit theorem.
 Use the Central Limit Theorem to find probabilities of selecting possible sample means from a specified population.
Why Sample the Population?
 The physical impossibility of checking all
items in the population.
 The cost of studying all the items in a
population.
 The sample results are usually adequate.
 Contacting the whole population would often
be time-consuming.
 The destructive nature of certain tests.
Probability Sampling
 A probability sample is a sample selected such
that each item or person in the population being
studied has a known likelihood of being included
in the sample.
Methods of Probability Sampling
 Simple Random Sample: A sample
formulated so that each item or person in
the population has the same chance of
being included.
 Systematic Random Sampling: The items or
individuals of the population are arranged in
some order. A random starting point is
selected and then every kth member of the
population is selected for the sample.
Methods of Probability Sampling
 Stratified Random Sampling: A population is
first divided into subgroups, called strata, and
a sample is selected from each stratum.
 Cluster Sampling: A population is first divided
into primary units then samples are selected
from the primary units.
Potential problems with the sampling method of
“Sampling Straws”
 Choice of sampling method is important.
 An exercise of “Sampling Straws” experiments will
illustrate that some sampling method can produce a
biased estimate of the population parameters.
 The bag contains a total of 12 straws: 4 are 4 inches long, 4 are 2 inches long, and 4 are 1 inch long.
 The population mean length is 2.33 inches (= 4×(1+2+4)/12).
 Randomly draw 4 straws one by one with replacement.
 Compute the sample mean.
 Across repeated experiments, the average of the sample means is generally larger than 2.33.
Methods of Probability Sampling
 “Sampling Straws” experiments
 The bag contains a total of 12 straws: 4 are 4 inches long, 4 are 2 inches long, and 4 are 1 inch long.
 The population mean length is 2.33 inches (= 4×(1+2+4)/12).
 Randomly draw 4 straws one by one with replacement.
 Compute the sample mean.
 Across repeated experiments, the average of the sample means is generally larger than 2.33.
 The sampling scheme is biased because the longer straws have a higher chance of being drawn, even when the draw is “blind” (say, drawing the first straw you touch).
 The draw may also fail to be random because we can feel the length of a straw before pulling it out.
Methods of Probability Sampling
 “Sampling Straws” experiments
 The bag contains a total of 12 straws: 4 are 4 inches long, 4 are 2 inches long, and 4 are 1 inch long.
 The population mean length is 2.33 inches (= 4×(1+2+4)/12).
 Randomly draw 4 straws one by one with replacement.
 Compute the sample mean.
 Across repeated experiments, the average of the sample means is generally larger than 2.33.
 Alternative sampling scheme:
 Label the straws 1 to 12.
 Label 12 identical balls 1 to 12.
 Draw four balls with replacement.
 Measure the corresponding straws and compute the
sample mean.
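A minimal simulation sketch of the two schemes (hypothetical code; the biased "grab" is modeled, as an assumption, by drawing each straw with probability proportional to its length):

import random

# Population of straw lengths: four 4-inch, four 2-inch, four 1-inch straws.
straws = [4] * 4 + [2] * 4 + [1] * 4      # population mean = 28/12 = 2.33

def biased_sample_mean(n=4):
    # "Grab a straw" draw: assume the chance of touching a straw is
    # proportional to its length, so longer straws are over-sampled.
    draws = random.choices(straws, weights=straws, k=n)
    return sum(draws) / n

def label_sample_mean(n=4):
    # Alternative scheme: draw labels with equal probability, then
    # measure the corresponding straws.
    draws = random.choices(straws, k=n)
    return sum(draws) / n

trials = 10_000
print(sum(biased_sample_mean() for _ in range(trials)) / trials)  # well above 2.33 (about 3.0)
print(sum(label_sample_mean() for _ in range(trials)) / trials)   # close to 2.33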
Methods of Probability Sampling
 In a nonprobability sample, whether an observation is included in the sample is based on the judgment of the person selecting the sample.
 The sampling error is the difference between a
sample statistic and its corresponding population
parameter.
 Sampling error is almost always nonzero.
Sampling Distribution of the Sample
Means
 The sampling distribution of the sample mean
is a probability distribution consisting of all
possible sample means of a given sample size
selected from a population.
EXAMPLE 1
 The law firm of Hoya and Associates has five partners. At
their weekly partners meeting each reported the number of
hours they billed clients for their services last week.
Partner          Hours
1. Dunn          22
2. Hardy         26
3. Kiers         30
4. Malinowski    26
5. Tillman       22
 The population mean is 25.2 hours:
μ = (22 + 26 + 30 + 26 + 22) / 5 = 25.2
Example 1
 If two partners are selected randomly, how many different
samples are possible?
This is the combination of 5 objects taken 2 at a time. That
is:
₅C₂ = 5! / [2! (5 − 2)!] = 10
There are a total of 10 different samples.
Example 1
continued
Partners    Total    Mean
1,2         48       24
1,3         52       26
1,4         48       24
1,5         44       22
2,3         56       28
2,4         52       26
2,5         48       24
3,4         56       28
3,5         52       26
4,5         48       24
EXAMPLE 1
continued
 Organize the sample means into a sampling
distribution.
Sample Mean    Frequency    Relative Frequency (probability)
22             1            1/10
24             4            4/10
26             3            3/10
28             2            2/10
 The mean of the sample means is 25.2 hours:
[22(1) + 24(4) + 26(3) + 28(2)] / 10 = 25.2
The mean of the sample means is exactly equal to the population mean.
Example 1
 Population variance
= [ (22−25.2)² + (26−25.2)² + (30−25.2)² + (26−25.2)² + (22−25.2)² ] / 5 = 8.96
 Variance of the sample means
= [ (1)(22−25.2)² + (4)(24−25.2)² + (3)(26−25.2)² + (2)(28−25.2)² ] / (1+4+3+2) = 3.36
 The variance of the sample means is smaller than the population variance:
3.36 / 8.96 = 0.375 < 1
Note that this is like sampling without replacement: with the finite-population correction, the variance of the sample mean is (σ²/n)(N−n)/(N−1) = (8.96/2)(3/4) = 3.36.
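A short sketch (hypothetical code) that enumerates the 10 possible samples and reproduces these numbers:

from itertools import combinations
from statistics import mean, pvariance

# Weekly billed hours of the five partners (Example 1).
hours = [22, 26, 30, 26, 22]
print(mean(hours), pvariance(hours))                  # 25.2  8.96

# All C(5, 2) = 10 samples of size 2, drawn without replacement.
sample_means = [mean(pair) for pair in combinations(hours, 2)]
print(len(sample_means))                              # 10
print(mean(sample_means), pvariance(sample_means))    # 25.2  3.36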
Example
 Suppose we had a uniformly distributed population containing equal proportions (hence equally probable instances) of (0,1,2,3,4). If you were to draw a very large number of random samples from this population, each of size n=2, the possible combinations of drawn values and the sums are:
Sum    Combinations
0      0,0
1      0,1  1,0
2      1,1  2,0  0,2
3      1,2  2,1  3,0  0,3
4      1,3  3,1  2,2  4,0  0,4
5      1,4  4,1  3,2  2,3
6      3,3  4,2  2,4
7      3,4  4,3
8      4,4
Note that this is sampling with replacement.
Example
 Population mean = mean of the sample means
Population mean = (0+1+2+3+4) / 5 = 2
Mean of the sample means = [ (1)(0) + (2)(0.5) + … + (1)(4) ] / 25 = 2
 Variance of the sample means = population variance / sample size
Population variance = [ (0−2)² + (1−2)² + (2−2)² + (3−2)² + (4−2)² ] / 5 = 2
Variance of the sample means = [ (1)(0−2)² + (2)(0.5−2)² + … + (1)(4−2)² ] / 25 = 1

Mean    Combinations
0.0     0,0
0.5     0,1  1,0
1.0     1,1  2,0  0,2
1.5     1,2  2,1  3,0  0,3
2.0     1,3  3,1  2,2  4,0  0,4
2.5     1,4  4,1  3,2  2,3
3.0     3,3  4,2  2,4
3.5     3,4  4,3
4.0     4,4
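A brief enumeration sketch (hypothetical code) that verifies the mean and variance of the 25 equally likely sample means:

from itertools import product
from statistics import mean, pvariance
from collections import Counter

population = [0, 1, 2, 3, 4]

# All 5 x 5 = 25 ordered pairs: two draws with replacement.
sample_means = [(a + b) / 2 for a, b in product(population, repeat=2)]

print(Counter(sample_means))   # frequencies 1,2,3,4,5,4,3,2,1 for means 0.0, 0.5, ..., 4.0
print(mean(sample_means))      # 2.0 = population mean
print(pvariance(sample_means)) # 1.0 = population variance (2) / sample size (2)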
Probability Histograms
 In a probability histogram, the area of each bar represents the chance of the corresponding value occurring as a result of the random (chance) process.
 Empirical histograms (from observed data) for a process converge to the probability histogram.
Examples of empirical histograms
 Roll a fair die: 50, 200 times
[Histograms of die outcomes from 50 rolls and 200 rolls; horizontal axis: die face 1–6; vertical axis: percent.]
The empirical histogram will approach the probability histogram as the number of draws increases.
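A quick simulation sketch (hypothetical code) of this convergence:

import random
from collections import Counter

def empirical_percentages(n_rolls, seed=0):
    # Roll a fair die n_rolls times and return the percentage of each face.
    rng = random.Random(seed)
    counts = Counter(rng.randint(1, 6) for _ in range(n_rolls))
    return {face: 100 * counts[face] / n_rolls for face in range(1, 7)}

print(empirical_percentages(50))       # ragged, far from 16.7% per face
print(empirical_percentages(200))      # closer to 16.7% per face
print(empirical_percentages(100_000))  # very close to the probability histogram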
Empirical histogram #1
Two balls in the bag:
Draw 1 ball 1000 times with replacement. Plot a relative
frequency histogram (empirical probability histogram).
The empirical histogram looks like the population distribution!!!
What is the probability of getting a red ball in any single draw? 0.5
Empirical histogram #2
5 balls in the bag:
Draw 1 ball 1000 times with replacement. Plot a relative
frequency histogram (empirical probability histogram).
The empirical histogram looks like the population distribution!!!
What is the probability of getting a red ball in any single draw? 0.6
Empirical histogram #3
5 balls in the bag:
0   1   2   3   4
Draw 1 ball 1000 times with replacement. Plot a relative
frequency histogram (empirical probability histogram).
The empirical histogram looks like the population distribution!!!
What is the probability of getting a “three” in any single draw? 0.2
What is the expected value (i.e., population mean) of a single draw?
0.2×0 + 0.2×1 + 0.2×2 + 0.2×3 + 0.2×4 = 2
Variance = 0.2×(0−2)² + 0.2×(1−2)² + 0.2×(2−2)² + 0.2×(3−2)² + 0.2×(4−2)² = 2
Empirical histogram #3 continued
5 balls in the bag:
0   1   2   3   4
Draw 2 balls 1000 times with replacement. Compute the sample
mean. Plot a relative frequency histogram (empirical probability
histogram) of the 1000 sample means.
Mean    Combinations
0.0     0,0
0.5     0,1  1,0
1.0     1,1  2,0  0,2
1.5     1,2  2,1  3,0  0,3
2.0     1,3  3,1  2,2  4,0  0,4
2.5     1,4  4,1  3,2  2,3
3.0     3,3  4,2  2,4
3.5     3,4  4,3
4.0     4,4
All combinations are equally likely.
[Relative frequency histogram of the 1000 sample means; horizontal axis: sample mean 0 to 4 in steps of 0.5; vertical axis: relative frequency, 0 to 0.2.]
Empirical histogram #3 continued
5 balls in the bag:
0   1   2   3   4
Draw 2 balls 1000 times with replacement. Compute the sample mean. Plot a relative frequency histogram (empirical probability histogram) of the 1000 sample means.
What is the probability of getting a sample mean of 2.5 in any single draw? 0.16
What is the expected sample mean of a single draw?
0.04×0 + 0.08×0.5 + … + 0.04×4 = 2
Variance of the sample mean = 0.04×(0−2)² + 0.08×(0.5−2)² + … + 0.04×(4−2)² = 1
[Relative frequency histogram of the 1000 sample means; horizontal axis: sample mean 0 to 4 in steps of 0.5; vertical axis: relative frequency, 0 to 0.2.]
Central Limit Theorem #1
5 balls in the bag:
0   1   2   3   4
Draw n (n > 30) balls 1000 times with replacement. Compute the sample mean. Plot a relative frequency histogram (empirical probability histogram) of the 1000 sample means.
The Central Limit Theorem says
1. The empirical histogram looks like a normal density.
2. Expected value (mean of the normal distribution) = mean of the original population = 2.
3. Variance of the sample means = variance of the original population / n = 2/n.
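A minimal simulation sketch (hypothetical code) of this experiment with n = 50:

import random
from statistics import mean, pvariance

population = [0, 1, 2, 3, 4]    # population mean = 2, population variance = 2
n = 50                          # sample size (n > 30)
experiments = 1000

rng = random.Random(0)
sample_means = [mean(rng.choices(population, k=n)) for _ in range(experiments)]

print(mean(sample_means))       # close to 2
print(pvariance(sample_means))  # close to 2 / n = 0.04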
Central Limit Theorem #2
Some unknown number of numbered balls in the bag:
0   1   2   3   4   ?   ?
We know only that the population mean is μ and the variance is σ².
Draw n (n > 30) balls 1000 times with replacement. Compute the sample mean. Plot a relative frequency histogram (empirical probability histogram) of the 1000 sample means.
The Central Limit Theorem says
1. The empirical histogram looks like a normal density.
2. Expected value (mean of the normal distribution) = μ.
3. Variance of the sample means = σ²/n.
Confidence interval #1
Some unknown number of numbered balls in the bag:
0   1   2   3   4   ?   ?
We know only that the population mean is μ and the variance is σ².
The Central Limit Theorem says
1. The empirical histogram looks like a normal density.
2. Expected value (mean of the normal distribution) = μ.
3. Variance of the sample means = σ²/n.
What is the probability that the sample mean of a randomly drawn sample lies between μ − σ/√n and μ + σ/√n? 0.6826
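A short check of the 0.6826 figure (hypothetical code; it is simply P(−1 ≤ Z ≤ 1) for a standard normal Z):

from math import erf, sqrt

def std_normal_cdf(z):
    # CDF of the standard normal distribution.
    return 0.5 * (1 + erf(z / sqrt(2)))

# P(mu - sigma/sqrt(n) <= sample mean <= mu + sigma/sqrt(n)) = P(-1 <= Z <= 1)
print(std_normal_cdf(1) - std_normal_cdf(-1))   # about 0.6827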
Central Limit Theorem
 For a population with mean μ and variance σ², the sampling distribution of the means of all possible samples of size n generated from the population will be approximately normally distributed.
 The mean of the sampling distribution is equal to μ and the variance is equal to σ²/n.
The population distribution: X ~ N(μ, σ²)
The sample mean of n observations: X̄ₙ ~ N(μ, σ²/n)
Central Limit Theorem: Sums
 For a large number of random draws, with replacement, the distribution of the sum approximately follows the normal distribution.
 The mean of the normal distribution is
n × (expected value of one random draw)
 The SD for the sum (the SE) is
√n × σ
 This holds even if the underlying population is not normally distributed.
Central Limit Theorem: Averages
 For a large number of random draws, with replacement, the distribution of the average = (sum)/n approximately follows the normal distribution.
 The mean for this normal distribution is
the expected value of one random draw
 The SD for the average (the SE) is
σ/√n
 This holds even if the underlying population is not normally distributed.
Law of large numbers
 The sample mean converges to the population mean as n gets large.
 For a large number of random draws from any population, with replacement, the distribution of the average = (sum)/n approximately follows the normal distribution.
 The mean for this normal distribution is the expected value of one random draw.
 The SD for the average (the SE) is σ/√n.
 The SD for the average tends to zero as n increases.
 This holds even if the underlying population is not normally distributed.
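A minimal sketch (hypothetical code) of this convergence for the 0–4 ball population:

import random
from statistics import mean

population = [0, 1, 2, 3, 4]    # population mean = 2
rng = random.Random(1)

for n in (10, 100, 10_000, 1_000_000):
    draws = rng.choices(population, k=n)   # n random draws with replacement
    print(n, mean(draws))                  # the sample mean approaches 2 as n grows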
Point Estimates
 Examples of point estimates are the sample
mean, the sample standard deviation, the sample
variance, the sample proportion.
 A point estimate is one value (a single point) that is used to estimate a population parameter.
Independent identically distributed (iid)
 “random draws from any population, with replacement” is
also known as independent identically distributed.
 Independent: the probability of drawing the current
observation does not depend on what has been drawn
previously.
 Identically distributed: the probability distribution of the current draw is the same as that of the draws before it and the draws after it.
The CLT may still hold even when we do not have iid observations.
Central Limit Theorem Simulation
Effect of Sample Size
Regardless of the underlying population, the larger the sample size, the more nearly normally distributed is the population of all possible sample means.
Point Estimates
 If a population follows the normal distribution,
the sampling distribution of the sample mean
will also follow the normal distribution.
 To determine the probability a sample mean falls
within a particular region, use:
z = (X̄ − μ) / (σ/√n)
Point Estimates
 If the population does not follow the normal distribution, but the sample contains at least 30 observations, the sample means will approximately follow the normal distribution.
 To determine the probability a sample mean falls
within a particular region, use:
z = (X̄ − μ) / (s/√n)
Example 2
 Suppose the mean selling price of a gallon of
gasoline in the United States is $1.30. Further,
assume the distribution is positively skewed,
with a standard deviation of $0.28. What is the
probability of selecting a sample of 35 gasoline
stations and finding the sample mean within
$.08 of the population mean ($1.30)?
Example 2
continued
 The first step is to find the z-values
corresponding to $1.22 (=1.30-0.08) and
$1.38 (=1.30+0.08). These are the two
points within $0.08 of the population mean.
z = (X̄ − μ) / (σ/√n) = ($1.38 − $1.30) / ($0.28/√35) = 1.69
z = (X̄ − μ) / (σ/√n) = ($1.22 − $1.30) / ($0.28/√35) = −1.69
Example 2
continued
 Next we determine the probability of a z-value
between -1.69 and 1.69. It is:
P(−1.69 ≤ z ≤ 1.69) = 2(0.4545) = 0.9090
 We would expect about 91 percent of the sample
means to be within $0.08 of the population mean.
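A quick verification sketch (hypothetical code) of the z-value and the probability:

from math import erf, sqrt

def std_normal_cdf(z):
    # CDF of the standard normal distribution.
    return 0.5 * (1 + erf(z / sqrt(2)))

se = 0.28 / sqrt(35)                            # standard error of the sample mean
z = 0.08 / se                                   # about 1.69
print(z)
print(std_normal_cdf(z) - std_normal_cdf(-z))   # about 0.909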
Estimating the percentage of Earth
covered by water
 Experiments:
 Paint a dot on your thumb.
 Catch the globe and tell me whether the dot on your thumb lands on water.
 Estimate the percentage of Earth covered by water by the average over all trials.
 Idea: If we draw many observations with replacement, the sample average will approach the population proportion. Coding water as 1 and land as 0, the sample average is an estimate of the proportion of Earth covered by water.
Truth:
Water covers 71% of the Earth's surface.
e.g., http://pao.cnmoc.navy.mil/educate/neptune/trivia/earth.htm
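A minimal simulation sketch of the globe experiment (hypothetical code, treating each catch as an independent Bernoulli trial with success probability 0.71):

import random

rng = random.Random(2003)
p_water = 0.71                        # true proportion of Earth covered by water

def catch_globe():
    # 1 if the dot lands on water, 0 if it lands on land.
    return 1 if rng.random() < p_water else 0

trials = [catch_globe() for _ in range(1000)]
print(sum(trials) / len(trials))      # sample proportion, close to 0.71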
Sampling Distribution of Sample
Proportion
If a random sample of size n is taken from a population, then the sampling distribution of the sample proportion p̂ is
 Approximately normal, if n is large.
 Has mean μ_p̂ = p.
 Has standard deviation σ_p̂ = √[ p(1 − p) / n ].
Approximately normal because the sample proportion is a simple average of zeros and ones from different trials.
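A short simulation sketch (hypothetical code, using p = 0.71 and n = 100 as assumed values) that checks the mean and standard deviation formulas:

import random
from statistics import mean, stdev
from math import sqrt

p, n, experiments = 0.71, 100, 2000
rng = random.Random(0)

def sample_proportion():
    # Average of n zero/one outcomes = proportion of successes in one sample.
    return sum(1 if rng.random() < p else 0 for _ in range(n)) / n

p_hats = [sample_proportion() for _ in range(experiments)]
print(mean(p_hats))              # close to p = 0.71
print(stdev(p_hats))             # close to sqrt(p * (1 - p) / n)
print(sqrt(p * (1 - p) / n))     # about 0.045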
The Normal Approximation to the
Binomial revisited
 The normal distribution (a continuous distribution)
yields a good approximation of the binomial
distribution (a discrete distribution) for large values of
n.
 The normal probability distribution is generally a good approximation to the binomial probability distribution when nπ and n(1 − π) are both greater than 5.
Recall for the binomial experiment:
 There are only two mutually exclusive outcomes (success or failure) on each trial.
 A binomial distribution results from counting the number of successes.
 Each trial is independent.
 The probability of success is fixed from trial to trial, and the number of trials n is also fixed.
That is, the trials are iid.
The Normal Approximation to the
Binomial revisited
Recoding: Failure as 0 and success as 1.
 x/n is simply the proportion of successes and hence the simple average of the outcomes from the n trials.
 x/n will be approximately normal according to CLT.
 Hence x (=n*x/n) will also be approximately normal
according to CLT.
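A brief comparison sketch (hypothetical code, with n = 100 and p = 0.3 as assumed values; the 0.5 added below is the usual continuity correction):

from math import comb, erf, sqrt

def binomial_cdf(k, n, p):
    # Exact P(X <= k) for a binomial(n, p) random variable.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def normal_cdf(x, mu, sigma):
    # CDF of a normal distribution with mean mu and standard deviation sigma.
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

n, p = 100, 0.3                         # n*p = 30 and n*(1 - p) = 70, both > 5
mu, sigma = n * p, sqrt(n * p * (1 - p))

print(binomial_cdf(35, n, p))           # exact binomial probability P(X <= 35)
print(normal_cdf(35.5, mu, sigma))      # normal approximation, close to the exact value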
Chapter Six
Sampling Methods and the Central Limit
Theorem
- END -