Title Bout: MLE vs. MoM
Opponents: R.A. Fisher and Karl Pearson
By Lada Kyj and Galen Papkov
Rice University - STAT 600
September 27, 2004
Outline
Short Biographies
Journal Rage
Criteria
Method of Moments
Maximum Likelihood
MLE vs. MoM
Issues
Who’s who?
R.A. Fisher (born February 17, 1890 in London, England)
Superstitious fact
Studied mathematics and astronomy at Cambridge (also interested in biology)
Rejected from enlisting in the military for WWI due to poor eyesight
Introduced concepts such as randomization, likelihood, and ANOVA
Karl Pearson (born March 27, 1857 in London, England)
Attended Cambridge
Had various interests
– Mathematics, physics, metaphysics, law, etc.
Contributed to regression analysis and developed the correlation coefficient and the chi-square test.
Is characterized as using large samples to deduce correlations in the data, whereas Fisher used small samples to determine causes.
Journal Rage
Began in 1917, when Pearson claimed that Fisher had failed to distinguish likelihood from inverse probability in a paper Fisher wrote in 1915
The feud continued for many years
The fire of the feud was fed by a sense of injustice on both sides
– “It would be fair to say that both showed hatred towards the other.”
Criteria of Consistency
A statistic is consistent if, when it is calculated from the population, it is equal to the population parameter.
PROBLEM!
– Many statistics for the same parameter can be consistent, as the example below shows.
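To make the problem concrete, here is an illustration that is not on the original slides: for a normal population, both the sample mean and the sample median converge in probability to the true center, so consistency alone cannot choose between them.

\[
\bar{X}_n \xrightarrow{P} \mu
\qquad\text{and}\qquad
\operatorname{med}(X_1,\dots,X_n) \xrightarrow{P} \mu,
\qquad X_i \sim N(\mu,\sigma^2).
\]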
Criteria of Efficiency
A statistic is efficient if, when derived from a large sample, it tends to a normal distribution with minimum variance.
Relates to estimation accuracy
PROBLEM!
– This criterion is still incomplete, since different methods of calculation may tend to agreement for large samples, but not for finite samples.
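A standard follow-up to the consistency example (our addition, not from the slides): for normal samples both the mean and the median are consistent for \(\mu\), but their large-sample variances differ, so only the mean is efficient:

\[
\operatorname{Var}(\bar{X}_n) \approx \frac{\sigma^2}{n},
\qquad
\operatorname{Var}(\operatorname{med}_n) \approx \frac{\pi\sigma^2}{2n},
\]

giving the median an asymptotic relative efficiency of only \(2/\pi \approx 0.64\).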
Criteria of Sufficiency
A statistic is sufficient when no other statistic that can be calculated from the same sample provides any additional information about the value of the parameter to be estimated.
Relates to “information”
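A textbook illustration of sufficiency (our addition; it uses the factorization theorem that the slides mention later): for \(X_i \sim N(\mu,\sigma^2)\) with \(\sigma^2\) known, the joint density factors through \(\bar{x}\) alone,

\[
f(x_1,\dots,x_n;\mu)
= \underbrace{(2\pi\sigma^2)^{-n/2}
\exp\!\Bigl(-\frac{1}{2\sigma^2}\sum_{i}(x_i-\bar{x})^2\Bigr)}_{h(x)}
\cdot
\underbrace{\exp\!\Bigl(-\frac{n}{2\sigma^2}(\bar{x}-\mu)^2\Bigr)}_{g(\bar{x};\,\mu)},
\]

so the sample mean carries all the information about \(\mu\).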
Method of Moments
Developed by Pearson in 1894
Method: set the sample moments equal to the population moments, \( m_k = \frac{1}{n}\sum_{i=1}^{n} X_i^k = E[X^k] \), and solve for the parameters (see the sketch below).
– \( m_1 = \bar{X} \) (the sample mean) estimates \( E[X] \)
Satisfies consistency
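A minimal sketch of the method (our code, not the presenters'; the gamma model, numpy, and all names here are our choices): match the first two sample moments of a gamma distribution to their theoretical values and solve for the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.gamma(shape=3.0, scale=2.0, size=10_000)  # synthetic sample

# For Gamma(shape k, scale s): E[X] = k*s and Var(X) = k*s^2.
# Equating sample moments to these and solving gives the estimates.
m1 = data.mean()          # first sample moment
v = data.var()            # second central sample moment
k_hat = m1**2 / v         # shape estimate
s_hat = v / m1            # scale estimate

print(f"MoM estimates: shape={k_hat:.3f}, scale={s_hat:.3f}")
```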
Example
The Cauchy distribution is a great example that portrays the limitations of the MoM.
– Its moments do not exist; a few outliers appear to dominate the value of the sample mean.
– Cannot use MoM, so go to MLE.
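A quick numerical demonstration (our addition, assuming numpy): the running mean of Cauchy draws never settles down, because the distribution has no finite moments, while the sample median stays near the true location.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_cauchy(100_000)       # true location 0, scale 1

running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

for k in (100, 10_000, 100_000):
    print(f"n={k:>7}: mean={running_mean[k - 1]:+8.3f}, "
          f"median={np.median(x[:k]):+7.3f}")
# The mean lurches with every extreme draw; the median hugs 0.
```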
Maximum Likelihood Estimation
Developed by Fisher
– Officially called maximum likelihood estimation in 1921
“The likelihood of a parameter is proportional to the probability of the data.”
Method:
– Obtain \( L(x;\theta) \)
– Take the first derivative, set it equal to 0, and solve for \( \theta \).
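A sketch of this recipe (our code; the exponential model and scipy are our choices): the score equation for an exponential rate has the closed form \(\hat\lambda = 1/\bar{x}\), which a numerical maximizer reproduces.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
data = rng.exponential(scale=1 / 2.5, size=5_000)  # true rate 2.5

# log L(lambda) = n*log(lambda) - lambda*sum(x);
# setting the derivative to zero gives lambda_hat = 1/xbar.
closed_form = 1.0 / data.mean()

def neg_log_lik(lam):
    return -(data.size * np.log(lam) - lam * data.sum())

numeric = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0),
                          method="bounded").x

print(f"closed form: {closed_form:.4f}, numeric: {numeric:.4f}")
```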
Conditions for Determining Maximum
First order conditions: \( \frac{\partial \ell(\theta)}{\partial \theta} = 0 \) (the score equations)
Second order conditions: the Hessian \( \frac{\partial^2 \ell(\theta)}{\partial \theta\,\partial \theta^{\mathsf{T}}} \) is negative definite
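A worked check of both conditions (our addition), using the exponential log-likelihood from the sketch above:

\[
\ell(\lambda) = n\log\lambda - \lambda\sum_i x_i,
\qquad
\ell'(\lambda) = \frac{n}{\lambda} - \sum_i x_i = 0
\;\Rightarrow\;
\hat\lambda = \frac{1}{\bar{x}},
\qquad
\ell''(\lambda) = -\frac{n}{\lambda^2} < 0,
\]

so the stationary point is indeed a maximum.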
Maximum Likelihood Estimation (cont.)
Criterion:
– Consistency
– Efficiency
– Sufficiency – poor “proof”!
Asymptotically normal and invariant
Led to the development of the factorization theorem.
Is used as an efficiency yardstick for other estimators, such as the method of moments.
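One standard consequence of invariance (our illustration, not from the slides): if \(\hat\theta\) is the MLE of \(\theta\), then \(g(\hat\theta)\) is the MLE of \(g(\theta)\). For a normal sample, the MLE of \(\sigma\) is simply the square root of the MLE of \(\sigma^2\):

\[
\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})^2
\quad\Longrightarrow\quad
\hat\sigma = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})^2}.
\]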
MLE vs. MoM
MoM is easy to use for normal curves.
– The mean is the best statistic for locating this curve.
MLE:
– Evaluating and maximizing the likelihood function is often challenging
– Difficult to write down a complete statistical model of the joint distribution of the data
– More robust
– Greater efficiency
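A small simulation sketch of the "efficiency yardstick" idea (our code; the gamma model and scipy.stats.gamma.fit are our choices, not the presenters'): estimate a gamma shape parameter repeatedly by both methods and compare the spread of the estimates.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(3)
true_shape, true_scale, n, reps = 3.0, 2.0, 200, 500

mom, mle = [], []
for _ in range(reps):
    x = rng.gamma(true_shape, true_scale, size=n)
    m1, v = x.mean(), x.var()
    mom.append(m1**2 / v)               # moment-matched shape
    a_hat, _, _ = gamma.fit(x, floc=0)  # MLE, location fixed at 0
    mle.append(a_hat)

print(f"MoM shape: mean={np.mean(mom):.3f}, sd={np.std(mom):.3f}")
print(f"MLE shape: mean={np.mean(mle):.3f}, sd={np.std(mle):.3f}")
# The MLE's sd is typically smaller: greater efficiency in practice.
```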
Issues
Neither method is great for exploratory data analysis, since the underlying distribution must be known.
MLE is believed to be sufficient (an acceptable proof has not been derived).
References
http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Fisher.html
http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Pearson.html
Aldrich, John (1997). R. A. Fisher and the Making of Maximum Likelihood 1912-1922. Statistical Science, Vol. 12, No. 3, pp. 162-176.
Fisher, R. A. (1921a). On the mathematical foundations of theoretical statistics. Philosophical Transactions of the Royal Society of London, Series A, Vol. 222, pp. 309-368.