Stats Review Lecture 5 - Limit Theorems 07.25.12


Chebyshev’s Inequality
• Markov’s Inequality (Proposition 2.1): if X is a random variable that takes only nonnegative values, then for any a > 0,
  P(X ≥ a) ≤ E[X]/a.
• Chebyshev’s Inequality (Proposition 2.2): if X has finite mean μ and variance σ², then for any k > 0,
  P(|X − μ| ≥ k) ≤ σ²/k².
• Consider Example 2a.
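To make the bounds concrete, here is a minimal Python sketch (not from the original slides) comparing the tail probability of an Exponential(1) random variable with the Markov and Chebyshev bounds; the distribution and the thresholds are arbitrary illustrative choices.

# Sketch: compare Monte Carlo tail probabilities of an Exponential(1) variable
# (mean 1, variance 1) with the Markov and Chebyshev bounds.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

for a in [2.0, 3.0, 5.0]:
    exact = np.mean(x >= a)               # Monte Carlo estimate of P(X >= a)
    markov = 1.0 / a                      # Markov bound: E[X]/a
    chebyshev = 1.0 / (a - 1.0) ** 2      # Chebyshev bound: Var(X)/k^2 with k = a - 1
    print(f"a={a}: P(X>=a)~{exact:.4f}  Markov {markov:.3f}  Chebyshev {chebyshev:.3f}")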
Convergence in probability
• A sequence of random variables, X1, X2, …, converges in probability to a random variable X if, for every e > 0,
  P(|Xn − X| > e) → 0 as n → ∞,
• or equivalently, P(|Xn − X| ≤ e) → 1 as n → ∞.
The weak law of large numbers
• Theorem 2.1 (the weak law of large numbers): let X1, X2, … be iid random variables with finite mean μ = E[Xi]. Then, for every e > 0,
  P(|(X1 + … + Xn)/n − μ| ≥ e) → 0 as n → ∞.
• Proof: assuming the variance σ² is also finite, the sample mean has mean μ and variance σ²/n, so Chebyshev’s inequality gives
  P(|(X1 + … + Xn)/n − μ| ≥ e) ≤ σ²/(n e²) → 0.
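A small simulation sketch (added for illustration; the Bernoulli(0.5) variables and e = 0.05 are arbitrary choices) showing the deviation probability in the weak law shrinking as n grows.

# Sketch: estimate P(|sample mean - mu| > eps) for increasing n,
# using iid Bernoulli(0.5) draws (mu = 0.5); the probability should shrink.
import numpy as np

rng = np.random.default_rng(1)
mu, eps, trials = 0.5, 0.05, 2000

for n in [10, 100, 1000, 10000]:
    means = rng.binomial(n, mu, size=trials) / n   # sample means of n Bernoulli(0.5) draws
    prob = np.mean(np.abs(means - mu) > eps)       # empirical P(|Xbar - mu| > eps)
    print(f"n={n}: P(|Xbar - mu| > {eps}) ~ {prob:.3f}")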
Almost Sure Convergence
• A sequence of random variables, X1, X2, …, converges almost surely to a random variable X if P(Xn → X as n → ∞) = 1.
The Strong Law of Large Numbers
• Theorem 4.1, p. 400: if X1, X2, … are iid random variables with finite mean μ = E[Xi], then, with probability 1,
  (X1 + … + Xn)/n → μ as n → ∞.
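A loose sketch of what the strong law looks like along a single sample path, assuming Uniform(0,1) draws (an arbitrary choice): the running average of one long sequence settles near μ = 0.5.

# Sketch: the running average of a single sequence of iid Uniform(0,1) draws
# approaches mu = 0.5, illustrating the strong law along one sample path.
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0.0, 1.0, size=100_000)                  # one long sequence of draws
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)   # running averages (X1+...+Xn)/n

for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n={n}: running mean = {running_mean[n - 1]:.4f}")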
Convergence in distribution
• A sequence of random variables, X1, X2, …, converges in distribution to a random variable X if
  FXn(x) → FX(x) as n → ∞
• at all points x where FX(x) is continuous.
• This really says that the CDFs converge.
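A sketch of convergence in distribution for a concrete sequence (not one of the book’s examples): if Mn is the maximum of n iid Uniform(0,1) variables, then n(1 − Mn) converges in distribution to an Exponential(1) variable, so the empirical CDF should approach 1 − e^(−x).

# Sketch: compare the empirical CDF of n*(1 - M_n) with the limiting
# Exponential(1) CDF at a few points x.
import numpy as np

rng = np.random.default_rng(3)
n, trials = 100, 100_000
m = rng.uniform(size=(trials, n)).max(axis=1)   # M_n = max of n Uniform(0,1) draws
y = n * (1.0 - m)                               # n(1 - M_n)

for x_pt in [0.5, 1.0, 2.0]:
    print(f"x={x_pt}: empirical CDF {np.mean(y <= x_pt):.4f}  limit {1.0 - np.exp(-x_pt):.4f}")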
Central Limit Theorem
• Theorem 3.1: for iid random variables X1, X2, … with mean μ and variance σ², the distribution of
  (X1 + … + Xn − nμ) / (σ√n)
  tends to the standard normal as n → ∞.
• Consider Examples 3b and 3c, p. 396
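A simulation sketch of Theorem 3.1 (the Exponential(1) summands and n = 50 are arbitrary choices): the standardized sum should give approximately standard normal CDF values.

# Sketch: standardize sums of n iid Exponential(1) variables and compare a few
# empirical CDF values with the standard normal CDF.
import numpy as np
from math import erf, sqrt

def phi(a):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(a / sqrt(2.0)))

rng = np.random.default_rng(4)
n, trials = 50, 100_000
sums = rng.exponential(1.0, size=(trials, n)).sum(axis=1)   # S_n with mu = sigma = 1
z = (sums - n * 1.0) / sqrt(n * 1.0)                        # (S_n - n*mu) / (sigma*sqrt(n))

for a in [-1.0, 0.0, 1.0, 2.0]:
    print(f"P(Z <= {a}): empirical {np.mean(z <= a):.4f}  normal {phi(a):.4f}")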
Central limit theorem for independent random variables
• Theorem 3.2, p. 399: let X1, X2, … be independent random variables with respective means μi and variances σi². If
(a) the Xi are uniformly bounded, meaning for some M, P(|Xi| < M) = 1 for all i, and
(b) the variances diverge, Σ σi² = ∞,
then the distribution of
  (Σ (Xi − μi)) / √(Σ σi²), with both sums over i = 1, …, n,
tends to the standard normal as n → ∞.
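A sketch of a sequence satisfying the conditions above (the Uniform(0, bi) variables with alternating bounds are an arbitrary choice): the variables are independent but not identically distributed, uniformly bounded, and their variances sum to infinity, so the standardized sum is again approximately standard normal.

# Sketch: X_i ~ Uniform(0, b_i) with b_i alternating between 1 and 2, so the X_i
# are uniformly bounded and the variances sum to infinity; check the standardized sum.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)
n, trials = 200, 50_000
b = np.where(np.arange(n) % 2 == 0, 1.0, 2.0)    # bounds b_i
x = rng.uniform(size=(trials, n)) * b            # X_i ~ Uniform(0, b_i)
mu = b / 2.0                                     # E[X_i] = b_i/2
var = b ** 2 / 12.0                              # Var(X_i) = b_i^2/12
z = (x - mu).sum(axis=1) / sqrt(var.sum())       # standardized sum from Theorem 3.2

phi = lambda a: 0.5 * (1.0 + erf(a / sqrt(2.0)))  # standard normal CDF
for a in [-1.0, 0.0, 1.0]:
    print(f"P(Z <= {a}): empirical {np.mean(z <= a):.4f}  normal {phi(a):.4f}")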
Jensen’s inequality
• Proposition 5.3, p. 409: if f is a convex function, then
  E[f(X)] ≥ f(E[X]),
  provided both expectations exist.
• Consider Example 5f.
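A quick numerical check of Jensen’s inequality with the convex function f(x) = exp(x) and a standard normal X (arbitrary choices); here E[f(X)] = e^(1/2) ≈ 1.65 while f(E[X]) = 1.

# Sketch: verify E[f(X)] >= f(E[X]) for f(x) = exp(x) and X ~ N(0, 1).
import numpy as np

rng = np.random.default_rng(6)
x = rng.standard_normal(1_000_000)
print("E[f(X)] ~", np.mean(np.exp(x)))   # close to e^(1/2) ~ 1.6487
print("f(E[X]) ~", np.exp(np.mean(x)))   # close to exp(0) = 1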