Variance of a Point Estimator
Chapter 7
Point Estimation of
Parameters
Learning Objectives
• Explain the general concepts of estimating the
parameters of a population
• Explain important properties of point estimators
• Know how to construct point estimators using
the method of maximum likelihood
• Understand the central limit theorem
• Explain the important role of the normal
distribution
Statistical Inference
• Used to make decisions or to draw conclusions
about a population
• Utilize the information contained in a sample
from the population
• Divided into two major areas
– parameter estimation
– hypothesis testing
• Use sample data to compute a number, called a
point estimate
Statistic and Sampling Distribution
• Obtain a point estimate of a population
parameter
• The observations are random variables
• Any function of the observations, that is, any
statistic, is also a random variable
• The sample mean and sample variance are statistics
• Every statistic has a probability distribution
• The probability distribution of a statistic is called
its sampling distribution
Definition of the Point Estimate
• Suppose we need to estimate the mean of a single
population by a sample mean
– The population mean, μ, is the unknown parameter
– The estimator of the unknown parameter is the sample mean X̄
– X̄ is a statistic and can take on any value
• It is convenient to have a general symbol
• Symbols used in parameter estimation:
– The unknown population parameter is denoted by θ
– The point estimate of this parameter is denoted by θ̂
– The point estimator is a statistic and is denoted by Θ̂
General Concepts of Point
Estimation
• Unbiased Estimator
– An estimator should be “close” to the true value of the unknown
parameter
– An estimator is unbiased when its expected value is equal to the
parameter of interest:
E(Θ̂) = θ
– The bias, E(Θ̂) − θ, is zero
• Variance of a Point Estimator
– Considering all unbiased estimators, the one with the smallest
variance is called the minimum variance unbiased estimator
(MVUE)
– The MVUE is the estimator most likely to produce an estimate
close to the true value of the parameter of interest
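The MVUE idea can be seen in a small simulation. The sketch below, with made-up parameter values, compares two unbiased estimators of a normal mean, the sample mean and the sample median; the sample mean has the smaller variance, which is why it is the MVUE for a normal population.

```python
# Simulation sketch (illustrative parameters): among unbiased estimators
# of a normal mean, the sample mean has smaller variance than the
# sample median.
import random
import statistics

random.seed(1)
n, reps = 25, 20_000          # sample size and number of repetitions
means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(10, 2) for _ in range(n)]   # N(mu=10, sigma=2)
    means.append(statistics.fmean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.variance(means)      # ~ sigma^2/n = 4/25 = 0.16
var_median = statistics.variance(medians)  # larger than var_mean
print(f"Var(sample mean)   = {var_mean:.4f}")
print(f"Var(sample median) = {var_median:.4f}")
```

Both estimators center on μ = 10, but the spread of the median is noticeably larger, so the mean is the more precise (smaller-variance) unbiased estimator here.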
Standard Error
• A measure of precision is given by the
standard error
• When sampling from a normal distribution with mean μ
and variance σ², the distribution of X̄ is normal with
mean μ and variance σ²/n
• Standard error of X̄:
σ_X̄ = σ/√n
• When σ is not known, we substitute the sample
standard deviation s into the above equation to obtain
the estimated standard error:
σ̂_X̄ = s/√n
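A minimal sketch of the estimated standard error s/√n, using made-up data values:

```python
# Estimated standard error of the sample mean: s / sqrt(n).
# The data values below are made up for illustration.
import math
import statistics

data = [10.2, 9.8, 10.5, 9.9, 10.1, 10.4, 9.7, 10.0]
n = len(data)
s = statistics.stdev(data)    # sample standard deviation
se = s / math.sqrt(n)         # estimated standard error of x-bar
print(f"x-bar = {statistics.fmean(data):.3f}, se = {se:.3f}")
```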
Mean Square Error (MSE)
• Sometimes it is necessary to use a biased estimator
• The mean square error of the estimator can then be used
• The mean square error of an estimator is the expected
squared difference between the estimator and the
unknown parameter:
MSE(Θ̂) = E(Θ̂ − θ)²     (Eq. 7-3)
• If an estimator is an unbiased estimator
– The MSE of the estimator is equal to the variance of the
estimator
– The bias is equal to zero
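The MSE definition can be checked by simulation. This sketch (parameter values are made up for illustration) compares the unbiased sample mean with a hypothetical shrunken estimator 0.9·X̄; for the unbiased estimator the MSE equals its variance, while the biased one picks up a bias² term.

```python
# MSE sketch: MSE(theta_hat) = E[(theta_hat - theta)^2].
# Compare the unbiased sample mean with a hypothetical biased
# estimator 0.9 * x-bar when estimating mu = 5 (illustrative values).
import random
import statistics

random.seed(2)
mu, sigma, n, reps = 5.0, 1.0, 10, 50_000
est_unbiased, est_biased = [], []
for _ in range(reps):
    xbar = statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
    est_unbiased.append(xbar)
    est_biased.append(0.9 * xbar)   # biased: expected value is 0.9*mu

def mse(estimates, true_value):
    # Monte Carlo estimate of E[(estimator - parameter)^2]
    return statistics.fmean((e - true_value) ** 2 for e in estimates)

mse_u = mse(est_unbiased, mu)  # ~ variance sigma^2/n = 0.10, bias = 0
mse_b = mse(est_biased, mu)    # ~ 0.81*sigma^2/n + (0.5)^2 = 0.331
print(mse_u, mse_b)
```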
Relative Efficiency
• Suppose we have two estimators of a
parameter, Θ̂₁ and Θ̂₂, with their corresponding
mean square errors
• The relative efficiency is defined as
MSE(Θ̂₁) / MSE(Θ̂₂)
• If this relative efficiency is less than 1
• Conclude that the first estimator is a more
efficient estimator of the unknown parameter
than the second estimator
• It has the smaller mean square error
Example
• Suppose we have a random sample of
size 2n from a population denoted by X,
with E(X) = μ and V(X) = σ²
• Let
X̄₁ = (X₁ + X₂ + … + X₂ₙ)/(2n)  and  X̄₂ = (X₁ + X₂ + … + Xₙ)/n
be two estimators of μ
• Which is the better estimator of μ? Explain
your choice.
Solution
• The expected values are
E(X̄₁) = E[(X₁ + X₂ + … + X₂ₙ)/(2n)] = (1/(2n)) [E(X₁) + … + E(X₂ₙ)] = (2nμ)/(2n) = μ
E(X̄₂) = E[(X₁ + X₂ + … + Xₙ)/n] = (1/n) [E(X₁) + … + E(Xₙ)] = (nμ)/n = μ
• X̄₁ and X̄₂ are unbiased estimators of μ
• The variances are
V(X̄₁) = σ²/(2n)   and   V(X̄₂) = σ²/n
• Since both estimators are unbiased, compare their MSEs:
MSE(Θ̂₁)/MSE(Θ̂₂) = (σ²/(2n))/(σ²/n) = n/(2n) = 1/2
• Conclude that X̄₁ is the “better” estimator, with the smaller
variance
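The worked example can be verified numerically. This simulation sketch (with made-up values n = 10 and σ = 2) averages all 2n observations for X̄₁ and only the first n for X̄₂; the variance of X̄₁ comes out at about half that of X̄₂, matching the ratio 1/2 derived above.

```python
# Simulation of the example: X1-bar averages all 2n observations,
# X2-bar averages only the first n, so Var(X1-bar) ~ sigma^2/(2n)
# is about half of Var(X2-bar) ~ sigma^2/n. Illustrative parameters.
import random
import statistics

random.seed(3)
n, sigma, reps = 10, 2.0, 40_000
xbar1, xbar2 = [], []
for _ in range(reps):
    sample = [random.gauss(0.0, sigma) for _ in range(2 * n)]
    xbar1.append(statistics.fmean(sample))      # uses all 2n values
    xbar2.append(statistics.fmean(sample[:n]))  # uses only n values

v1 = statistics.variance(xbar1)  # ~ sigma^2/(2n) = 0.2
v2 = statistics.variance(xbar2)  # ~ sigma^2/n   = 0.4
print(f"Var(X1-bar) ~ {v1:.3f}, Var(X2-bar) ~ {v2:.3f}, ratio ~ {v1 / v2:.2f}")
```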
Methods of Point Estimation
• The definitions of unbiasedness and other properties
do not provide any guidance about how good
estimators can be obtained
• Discuss the method of maximum likelihood
• Estimator will be the value of the parameter that
maximizes the probability of occurrence of the
sample values
Definition
• Let X be a random variable with probability
distribution f(x; θ)
– θ is a single unknown parameter
– Let x₁, x₂, …, xₙ be the observed values in a
random sample of size n
– Then the likelihood function of the sample is
L(θ) = f(x₁; θ) · f(x₂; θ) · … · f(xₙ; θ)
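As a concrete sketch of maximizing a likelihood, consider the exponential distribution f(x; λ) = λe^(−λx), whose log-likelihood log L(λ) = n·log λ − λ·Σxᵢ is maximized at λ̂ = n/Σxᵢ. The data values below are made up for illustration, and a coarse grid search is used only to confirm the closed-form maximizer.

```python
# MLE sketch for the exponential distribution f(x; lam) = lam*exp(-lam*x).
# log L(lam) = n*log(lam) - lam*sum(x) is maximized at lam_hat = n/sum(x).
# Data values are made up for illustration.
import math

x = [0.8, 1.3, 0.4, 2.1, 0.9, 1.7]
n = len(x)

def log_likelihood(lam):
    return n * math.log(lam) - lam * sum(x)

lam_hat = n / sum(x)   # closed-form maximum likelihood estimate

# A coarse grid search over lam in (0, 5) agrees with the closed form:
grid = [0.01 * k for k in range(1, 500)]
lam_grid = max(grid, key=log_likelihood)
print(f"lam_hat = {lam_hat:.3f}, grid maximizer = {lam_grid:.2f}")
```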
Sampling Distribution of the Mean
• Sample mean is a statistic
– Random variable that depends on the results
obtained in each particular sample
• It therefore has a probability distribution
• The probability distribution of X̄ is a sampling
distribution
– Called the sampling distribution of the mean
Sampling Distributions of Means
• Determine the sampling distribution of the sample
mean X
• A random sample of size n is taken from a normal
population with mean μ and variance σ²
• Each observation is a normally and independently
distributed random variable with mean μ and
variance σ²
Sampling Distributions of Means (Cont.)
• By the reproductive property of the normal distribution,
X̄ = (X₁ + X₂ + … + Xₙ)/n
• X̄ has a normal distribution with mean
μ_X̄ = (μ + μ + … + μ)/n = μ
• and variance
σ²_X̄ = (σ² + σ² + … + σ²)/n² = σ²/n
Central Limit Theorem
• When sampling from an unknown probability distribution,
the sampling distribution of the sample mean will be
approximately normal with mean μ and variance σ²/n
• The limiting form of the distribution of X̄ is
Z = (X̄ − μ)/(σ/√n)
• Most useful theorems in statistics, called the central limit
theorem
• If n ≥ 30, the normal approximation will be satisfactory
regardless of the shape of the population
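The central limit theorem can be illustrated by drawing samples of size n = 30 from a distinctly non-normal population, here Uniform(0, 1), with μ = 0.5 and σ² = 1/12 (illustrative choices):

```python
# CLT sketch: means of samples of size n = 30 from a (non-normal)
# Uniform(0, 1) population are approximately normal with mean mu = 0.5
# and variance sigma^2/n = (1/12)/30.
import random
import statistics

random.seed(4)
n, reps = 30, 20_000
mu = 0.5            # mean of Uniform(0, 1)
var = 1.0 / 12.0    # variance of Uniform(0, 1)

xbars = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(reps)]

print(statistics.fmean(xbars))     # close to mu = 0.5
print(statistics.variance(xbars))  # close to (1/12)/30

# Standardize as Z = (X-bar - mu)/(sigma/sqrt(n)) and check that about
# 95% of the Z values fall inside (-1.96, 1.96), as the normal predicts:
z = [(xb - mu) / (var / n) ** 0.5 for xb in xbars]
coverage = sum(abs(zi) < 1.96 for zi in z) / reps
print(coverage)
```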
Two Independent Populations
• Consider a case in which we have two independent
populations
– The first population has mean μ₁ and variance σ₁², and the second
population has mean μ₂ and variance σ₂²
– Both populations are normally distributed
– Linear combinations of independent normal random variables
follow a normal distribution
• The sampling distribution of X̄₁ − X̄₂ is normal with mean
μ_{X̄₁ − X̄₂} = μ_{X̄₁} − μ_{X̄₂} = μ₁ − μ₂
• and variance
σ²_{X̄₁ − X̄₂} = σ²_{X̄₁} + σ²_{X̄₂} = σ₁²/n₁ + σ₂²/n₂
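The mean and variance of X̄₁ − X̄₂ can be checked by simulation. All parameter values below are made up for illustration:

```python
# Sketch: the difference X1-bar - X2-bar of means from two independent
# normal populations has mean mu1 - mu2 and variance
# sigma1^2/n1 + sigma2^2/n2. Illustrative parameters.
import random
import statistics

random.seed(5)
mu1, s1, n1 = 10.0, 2.0, 16   # population 1
mu2, s2, n2 = 7.0, 3.0, 9     # population 2
reps = 30_000

diffs = []
for _ in range(reps):
    x1 = statistics.fmean(random.gauss(mu1, s1) for _ in range(n1))
    x2 = statistics.fmean(random.gauss(mu2, s2) for _ in range(n2))
    diffs.append(x1 - x2)

print(statistics.fmean(diffs))     # close to mu1 - mu2 = 3.0
print(statistics.variance(diffs))  # close to 4/16 + 9/9 = 1.25
```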