Lecture 11: Chapter 9
David Wallace Croft
http://www.CroftSoft.com/people/david/
Statistics for Psychology
2005 Jun 17 Fri
Copyright 2005 David Wallace Croft
This work is licensed under the
Creative Commons Attribution License 2.0.
http://creativecommons.org/licenses/by/2.0/
Quiz
• Please mute your mobile phones
• Write your name on a blank piece of paper
• Quiz will begin at 09:00
• When done, turn your paper over
• At 09:02, I will say, “Pens down”
• Writing after “Pens down” is cheating
• Pass your quizzes to your left
Outline
• Administrative
• Chapter 9 Slides
• Emphasis
• Example
• Assignment
• Exam 4 Prep
• Questions
• Brief Recess
• Homework Review
Administrative
• All quiz grades in Blackboard now
• Exam 3 not graded yet
• Free statistics tutoring
– UTD Learning Resources Center
– Located in the library
Chapter 9
Introduction to the t Test
Aron and Aron, “Statistics for Psychology”, 3rd Ed., 2003.
Emphasis
• Why divide by N – 1?
• Why divide by N?
• Why df and t distribution?
Why Divide by N – 1?
• Why use S² = SS / ( N – 1 )?
• Expected Value: E ( X ) = ∑ ( x * pₓ ) = μ
• S² is an unbiased estimator of σ²: E ( S² ) = … = σ²
Proof on page 271 of Larsen, Richard J. and Morris L. Marx,
“An Introduction to Probability and Its Applications”, Prentice-Hall, 1985.
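The unbiasedness claim can also be checked numerically. Below is a minimal Python simulation sketch; the normal population (μ = 0, σ = 10, so σ² = 100), the sample size N = 4, and the trial count are all arbitrary choices.

```python
import random

# Estimate sigma^2 two ways from many small samples (all values here are
# arbitrary illustrations, not from the textbook).
MU, SIGMA, N, TRIALS = 0.0, 10.0, 4, 100_000

total_unbiased = 0.0  # running sum of SS / (N - 1)
total_biased = 0.0    # running sum of SS / N

for _ in range(TRIALS):
    sample = [random.gauss(MU, SIGMA) for _ in range(N)]
    m = sum(sample) / N                         # sample mean M
    ss = sum((x - m) ** 2 for x in sample)      # sum of squared deviations
    total_unbiased += ss / (N - 1)
    total_biased += ss / N

print("mean of SS / (N-1):", round(total_unbiased / TRIALS, 1))  # near sigma^2 = 100
print("mean of SS / N:    ", round(total_biased / TRIALS, 1))    # near 75, biased low
```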
Why Divide by N?
• Why does σM² = σ² / N?
• Yᵢ ≡ Xᵢ / N
• M = ( ∑Xᵢ ) / N = ∑( Xᵢ / N ) = ∑Yᵢ
• E ( Yᵢ ) = μ / N
• Var ( Yᵢ ) = E ( [ Yᵢ – E ( Yᵢ ) ]² )
  = E ( [ Xᵢ / N – μ / N ]² )
  = E ( [ Xᵢ – μ ]² ) / N²
  = σ² / N²
• Var ( ∑Yᵢ ) = ∑ Var ( Yᵢ ) (Yᵢ’s independent)
• σM² = Var ( M ) = Var ( ∑Yᵢ ) = ∑ Var ( Yᵢ )
  = ∑ ( σ² / N² ) = N * ( σ² / N² ) = σ² / N
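The same kind of simulation can confirm σM² = σ² / N; a minimal sketch with arbitrary values μ = 50, σ = 6, and N = 9 follows.

```python
import random

# Simulate many sample means M and compare their variance to sigma^2 / N
# (mu, sigma, N, and the trial count are arbitrary illustrations).
MU, SIGMA, N, TRIALS = 50.0, 6.0, 9, 100_000

means = []
for _ in range(TRIALS):
    sample = [random.gauss(MU, SIGMA) for _ in range(N)]
    means.append(sum(sample) / N)               # M for this sample

grand_mean = sum(means) / TRIALS
var_of_means = sum((m - grand_mean) ** 2 for m in means) / TRIALS

print("simulated Var(M):", round(var_of_means, 2))  # near 4.0
print("sigma^2 / N:     ", SIGMA ** 2 / N)          # 36 / 9 = 4.0
```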
Why df and t distribution?
• df = N – 1
• t distribution thicker in tails
• Robust unless extreme skew
• Requires more extreme cutoffs
• t distribution → normal distribution as N → ∞
• Compare p < 0.1 entry for df = ∞ to z table
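One way to see the cutoffs converge is to compare them directly; this short sketch assumes SciPy is available and uses the one-tailed p < 0.1 level mentioned above.

```python
from scipy.stats import norm, t  # assumes SciPy is installed

# One-tailed cutoffs at p < 0.1: t requires a more extreme cutoff at small df
# and approaches the z cutoff as df grows.
print("z cutoff:", round(norm.ppf(0.90), 3))            # 1.282
for df in (3, 10, 30, 120, 1_000_000):
    print(f"t cutoff, df = {df}:", round(t.ppf(0.90, df), 3))
```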
Example
• Chapter 9, Set I, Problem 5, p334
• Sign wrong in answers to Problems 5 and 6?
• df = N – 1 = 4 – 1 = 3
• tcutoff = -4.541 (page 642)
• Differences: -7, -6, +1, -8
• M = -20 / 4 = -5
• SS = ( -7 – (-5) )² + ( -6 – (-5) )² + ( 1 – (-5) )² + ( -8 – (-5) )²
  = 4 + 1 + 36 + 9 = 50
• S² = SS / df = 50 / 3 = 16.7
• SM = √( S² / N ) = √( 16.7 / 4 ) = 2.04
• t = ( M – μ ) / SM = ( -5 – 0 ) / 2.04 = -2.45
• t(3) = -2.45, n.s., one-tailed
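A short Python check of the arithmetic above, using the same symbols (N, df, M, SS, S², SM, t):

```python
from math import sqrt

# Recompute the worked example: t test for dependent means, one-tailed.
differences = [-7, -6, 1, -8]
N = len(differences)
df = N - 1                                      # 3
M = sum(differences) / N                        # -5.0
SS = sum((d - M) ** 2 for d in differences)     # 50.0
S2 = SS / df                                    # 16.67
SM = sqrt(S2 / N)                               # 2.04
t = (M - 0) / SM                                # -2.45

print(df, M, SS, round(S2, 2), round(SM, 2), round(t, 2))
# Cutoff from the table is -4.541, so t = -2.45 is not significant.
```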
Assignment
• Monday
• Exam will cover Chapters 8 and 9
• No calculators, closed book/closed notes
• Wednesday
• Read Chapter 10 before class
• Homework: Set I, Problem 3
• Quiz at start of class
Exam 4 Prep
• Do homework
• Know theory
• Know glossary terms
Questions
• Don’t pack up to leave just yet
• Questions about the lecture before we dismiss?
• Post additional questions to the discussion mailing list unless personal
– http://egroups.com/group/utd-statistics
Brief Recess
• 5 minute break
• Homework review when we return
• Attendance optional
Homework Review
• Chapter 9, Set I, Problem 1, p332
• df = N – 1
• Look up tcutoff from table (p642)
– Use lower df if not in table (conservative)
• SM = √( S² / N )
• t = ( M – μ ) / SM
• Decide whether to reject null hypothesis
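The review steps above can be collected into a single routine. This is a minimal sketch, not the textbook's procedure verbatim; the helper name and the idea of passing the signed table cutoff as an argument are illustrative choices.

```python
from math import sqrt

def t_test_single_sample(scores, mu, t_cutoff):
    """Follow the steps above; the caller supplies the table cutoff
    (with its sign) for the chosen tail and significance level."""
    n = len(scores)
    df = n - 1
    m = sum(scores) / n
    ss = sum((x - m) ** 2 for x in scores)
    s_m = sqrt((ss / df) / n)                   # SM = sqrt(S^2 / N)
    t = (m - mu) / s_m
    reject = t <= t_cutoff if t_cutoff < 0 else t >= t_cutoff
    return df, t, reject

# Chapter 9, Set I, Problem 5 differences from the example slide:
print(t_test_single_sample([-7, -6, 1, -8], 0, -4.541))  # (3, -2.449..., False)
```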