PHolmes-Smith – NAPLAN Data (Transcript)

ASSESSMENT FOR BETTER LEARNING
USING
NAPLAN DATA
Presented by
Philip Holmes-Smith
School Research Evaluation and Measurement Services
Overview
1. The National Scale

National Scale Scores or Estimated VELS Equivalent Scores?
You can choose to look at National Scale Scores or Estimated VELS Equivalent Scores. What's the difference?
Understanding the National Scale
• The National Scale is an arbitrary scale – at this stage it is not related to points along a developmental curriculum, but it is highly likely that it will be mapped onto the National Curriculum at some time in the future.
Understanding the National Scale
• The National Scale was fixed in 2008 as follows:
  – Range: 0–1000
  – Mean: 500
  – Standard Deviation: 100 (i.e. 68% of students score between 400 and 600)
[Figure: the National Scale from 0 to 1000, with the 2008 national mean at 500 and ±1 standard deviation marked at 400 and 600]
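The "68% of students between 400 and 600" figure follows directly from a normal distribution with mean 500 and standard deviation 100. A minimal sketch (standard library only) checking that figure:

```python
import math

def normal_cdf(x, mean=500.0, sd=100.0):
    """Cumulative distribution function of a normal distribution,
    computed via the error function (no external libraries needed)."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

# Fraction of students expected to score between 400 and 600 (mean +/- 1 SD)
within_one_sd = normal_cdf(600) - normal_cdf(400)
print(round(within_one_sd, 4))  # -> 0.6827, i.e. the "68%" quoted above
```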
National Averages (2008)

Year Level | Reading | Writing | Spelling | Grammar & Punctuation | Numeracy
3          | 400.4   | 414.2   | 399.3    | 402.9                 | 396.7
5          | 484.3   | 486.4   | 483.6    | 496.0                 | 475.7
7          | 536.6   | 533.7   | 538.6    | 529.0                 | 544.9
9          | 578.0   | 569.3   | 577.0    | 569.2                 | 582.2
My School Website (2009)
[Figure: the school's average compared with the National Average and the average for schools with a similar ICSEA. For this school, only 20.1% of Australian school communities are more advantaged.]
Understanding the National Scale
• The Scale has been divided into ten Bands, which are used for reporting to parents:
  – Band 1 covers all scores equal to or less than 270.
  – Bands 2–9 each span 52 score points.
  – Band 10 covers all scores above 686.
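Those cut-points imply a simple score-to-band mapping. A sketch, assuming the boundaries quoted above (Band 1 at or below 270, then 52-point bands up to 686):

```python
import math

def naplan_band(score):
    """Map a National Scale score to its reporting Band, assuming the
    boundaries quoted above: Band 1 <= 270, Bands 2-9 in 52-point steps
    (271-322, 323-374, ..., 635-686), Band 10 > 686."""
    if score <= 270:
        return 1
    if score > 686:
        return 10
    return 1 + math.ceil((score - 270) / 52)

print(naplan_band(500))  # -> 6 (the 2008 national mean falls in Band 6)
print(naplan_band(686))  # -> 9
print(naplan_band(687))  # -> 10
```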
Parent Reports
[Figure: sample parent reports showing the Band ladder for each year level – Year 3 reports Bands 1–6, Year 5 reports Bands 3–8, Year 7 reports Bands 4–9 and Year 9 reports Bands 5–10.]
Understanding the National Scale
• At this stage Bands have no explicit curriculum meaning, but results show that for Victorian students in 2009:
  – A typical Yr 3 level of performance is at the bottom of Band 5.
  – A typical Yr 5 level of performance is almost halfway into Band 6.
  – A typical Yr 7 level of performance is a third of the way into Band 7.
  – A typical Yr 9 level of performance is at the bottom of Band 8.
Victorian State Averages – 2009 (By Year Level and Dimension)
[Figure: 2009 Victorian state averages plotted against the Band scale (band boundaries at 270, 322, 374, 426, 478, 530, 582, 634, 686 and 738) for each dimension – Reading, Writing, Spelling, Grammar & Punctuation and Numeracy. Reading means: 430.6 (Yr 3), 506.4 (Yr 5), 547.2 (Yr 7), 587.9 (Yr 9). Numeracy means: 411.0 (Yr 3), 496.3 (Yr 5), 549.4 (Yr 7), 596.3 (Yr 9).]
Comparing National Scale Scores to Estimated VELS Equivalent Scores
• The Victorian means for Year 3 and Year 9 Reading and Numeracy on the National Scale are compared to the estimated VELS equivalent scores below:

Dimension | Yr 3 National Scale | Yr 3 VELS Equivalent | Yr 9 National Scale | Yr 9 VELS Equivalent
Reading   | 430.6               | 2.37                 | 587.9               | 5.28
Numeracy  | 411.0               | 1.83                 | 596.3               | 4.88
• Compared to our expected curriculum outcome for Year 3 students (2.175), the State Reading mean is about 1½ terms ahead of where we expect a typical Year 3 student to be. However, the State Numeracy mean is about 2½ terms below where we expect a typical Year 3 student to be.
• Compared to our expected curriculum outcome for Year 9 students (5.175), the State Reading mean is about one term ahead of where we expect a typical Year 9 student to be. However, the State Numeracy mean is about 2½ terms below where we expect a typical Year 9 student to be.
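The "terms ahead/behind" arithmetic above can be made explicit. A sketch, assuming (as the figures quoted imply) that expected VELS scores advance by 0.5 per school year, i.e. 0.125 per term:

```python
# Assumption: one VELS level spans two school years, so expected VELS
# scores advance by 0.5 per year, or 0.125 per four-term year's term.
VELS_PER_TERM = 0.5 / 4

def terms_from_expected(vels_mean, vels_expected):
    """Positive result = terms ahead of expectation; negative = terms behind."""
    return (vels_mean - vels_expected) / VELS_PER_TERM

# Year 3 (expected 2.175): Reading mean 2.37, Numeracy mean 1.83
print(round(terms_from_expected(2.37, 2.175), 2))  # -> 1.56 (about 1 1/2 terms ahead)
print(round(terms_from_expected(1.83, 2.175), 2))  # -> -2.76 (about 2 1/2 terms behind)
```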
Cautionary Note #1
• Equal scores amongst different dimensions (on the National Scale) do not equate to equal levels of performance in terms of expected VELS levels.
• For example, a National Yr 9 Reading score of 587.9 is equivalent to a VELS score of 5.28, but a higher National Yr 9 Numeracy score of 596.3 is equivalent to a lower VELS score of 4.88.
Cautionary Note #2
• Some "Estimated VELS Equivalent Scores" CANNOT be read as VELS scores.
• Specifically, it is doubtful that the "Estimated VELS Equivalent Scores" for Writing or Spelling are truly VELS scores*.
• On the other hand, "Estimated VELS Equivalent Scores" for Reading and Numeracy appear to be trustworthy*.
*As evidence, see the following AIM/NAPLAN results.
The National Minimum Standard (against Victoria's State Averages)
[Figure: Band ladders for each year level marking scores Above, At and Below the National Minimum Standard, with the Victorian state averages overlaid. Students below the National Minimum Standard show very low levels of performance and are truly "at risk"; even students at the lower levels of "above Minimum Standard" may be "at risk".]
Understanding Minimum Standards
• The Yr 3, 5, 7 & 9 "Minimum Standards" are all very low (more than 2 years below the expected level). In fact, they are so low that I think they offer no assistance in identifying "at risk" students, so consider the following alternative.
• Use the "Holmes-Smith" Minimum Standard instead – anyone 0.5 of a VELS level below the expected level is in need of focused intervention or additional support. For example:
  – Yr 5: Expected VELS score ≈ 3.175
  – Yr 5: Holmes-Smith "low achiever" ≈ 2.675 or lower
• Also, anyone 0.5 of a VELS level above the expected level is in need of further extension (to avoid boredom). For example:
  – Yr 7: Expected VELS score ≈ 4.175
  – Yr 7: Holmes-Smith "high achiever" ≈ 4.675 or higher
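The flagging rule above reduces to comparing each student against cut-offs half a VELS level either side of expectation. A sketch, using the expected VELS scores quoted in this presentation (Yr 3 ≈ 2.175, Yr 5 ≈ 3.175, Yr 7 ≈ 4.175, Yr 9 ≈ 5.175):

```python
# Expected VELS scores by year level, as quoted in the presentation.
EXPECTED_VELS = {3: 2.175, 5: 3.175, 7: 4.175, 9: 5.175}
MARGIN = 0.5  # half a VELS level, per the "Holmes-Smith" rule above

def flag_student(year_level, vels_score):
    """Flag students more than half a VELS level from the expected score."""
    expected = EXPECTED_VELS[year_level]
    if vels_score <= expected - MARGIN:
        return "needs intervention"   # "low achiever"
    if vels_score >= expected + MARGIN:
        return "needs extension"      # "high achiever"
    return "on track"

print(flag_student(5, 2.6))  # -> needs intervention (2.6 is at or below ~2.675)
print(flag_student(7, 4.7))  # -> needs extension (4.7 is at or above ~4.675)
```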
The School Summary Report
• Choose "School Summary Report".
• Choose to look at All Students, or Girls vs. Boys, or LBOTE Students, or ATSI Students.
• Choose to look at National Scale Scores or Estimated VELS scores.
• Click on "Preview Report" to view the results.
The School Summary Report
There are 30 students in this Year Level. Therefore:
– 50% (or 15 students) are above the median.
– 50% (or 15 students) are below the median.
– 50% (or 15 students) are inside the "box". Half of these (7–8 students) are above the median and half (7–8 students) are below the median.
– 10% (or 3 students) are at or below the 10th percentile "whisker".
– 10% (or 3 students) are at or above the 90th percentile "whisker".
– 15% (or 4–5 students) are spread between the 25th percentile down to the 10th.
– 15% (or 4–5 students) are spread between the 75th percentile up to the 90th.

Interpreting "box and whisker" graphs
[Figure: an annotated box-and-whisker plot – 3 students on or above the 90th percentile (note: all we know is that at least one student scored on the 90th percentile), 4–5 students between the 75th and 90th percentiles, 15 students inside the box (7–8 above and 7–8 below the median), 4–5 students between the 25th and 10th percentiles, and 3 students on or below the 10th percentile (again, all we know is that at least one student scored on the 10th percentile).]
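The counting logic above generalises to any cohort size: each percentile slice holds a proportional share of the students. A minimal sketch of that arithmetic (integer division rounds down, which is why the 15% slices print 4 where the slide says "4–5"):

```python
# Expected student counts in each region of a box-and-whisker plot,
# for a cohort of n students (n = 30 in the report above).
def region_counts(n):
    return {
        "at or below the 10th percentile": n * 10 // 100,
        "between the 10th and 25th percentiles": n * 15 // 100,
        "inside the box (25th to 75th)": n * 50 // 100,
        "between the 75th and 90th percentiles": n * 15 // 100,
        "at or above the 90th percentile": n * 10 // 100,
    }

for region, count in region_counts(30).items():
    print(region, count)
```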
Interpreting the School Summary Report
1. Is the school's median above, at or below the State median?
   – The school's median is about half a band below the State median.
2. Are one or more dimensions very different from the other dimensions?
   – Writing and Spelling are lower than Reading and Grammar & Punctuation.
3. How does the school's spread compare to the State spread?
   – The school has far fewer high performers and far more low performers than the State; the school's median sits at the State's 25th percentile.
The School Summary Report

Interpreting the "Standard Error of the Mean" – se(mean)
Your reported school mean simply reflects the performance of the students who were present on the day and how they felt on that day. What if some of the smartest students had been away? What if there had been a bad accident in the schoolyard just prior to the test and the students' minds weren't 100% on task? Differences in results due to such events are referred to as measurement error.

Statistically, we can allow for such errors in measurement by building a "confidence interval" around the reported school mean using the "standard error of the mean" – se(mean).

We can be 95% confident that the true school mean for Reading is no lower than 488.1 − 1.96 × 7.6 (= 473.2) and no higher than 488.1 + 1.96 × 7.6 (= 503.0). That is, the true school mean is somewhere between 473.2 and 503.0.

Now, because the State mean for Reading (506.5) is above the highest estimate of the school Reading mean, we conclude that the school mean is significantly below the State mean.

[Figure: the school mean and its standard error of the mean shown against the State mean.]
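The interval calculation above is a standard 95% confidence interval. A sketch using the figures quoted (school Reading mean 488.1, se(mean) 7.6, State mean 506.5):

```python
Z_95 = 1.96  # z-value for a 95% confidence interval

def confidence_interval(mean, se, z=Z_95):
    """Return the (low, high) bounds of the confidence interval around a mean."""
    return (mean - z * se, mean + z * se)

low, high = confidence_interval(488.1, 7.6)
print(round(low, 1), round(high, 1))   # -> 473.2 503.0

# The State mean lies above the whole interval, so the school mean is
# significantly below the State mean at the 95% level.
state_mean = 506.5
print(state_mean > high)               # -> True
```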
The Five Year Trend Report
• Choose "Five Year Trend Report".
• Choose one dimension.
Using the Five Year Trend Report
• Use the Five Year Trend Report to determine whether:
  – this year's result was a one-off or is consistent with previous results,
  – the trend over time is improving, steady or declining relative to the State.
• Remember, however, that each year's data comes from a different cohort of students, and in some years the students are simply much better or much worse than the typical cohort.
• Remember also that for small cohorts (fewer than 10 students), a few extra high-performing students can significantly increase your school mean; likewise, a few extra low-performing students can significantly decrease it.
Analysing extracted data in SPA
[Figures: A–E Grades for Year 3, A–E Grades for Year 7, and Year 3 – Year 5 Growth charts produced in SPA.]
The Item Analysis Report
• Roughly equal numbers of students selecting each of the wrong answers = guessing.
• Nearly all students who got an item wrong giving the same wrong answer = a common misconception.

Year 3 Reading (Q21) and Year 5 Reading (Q9):
– Correct = B (Yr 3 – 36%; Yr 5 – 56%)
– Most common incorrect = C (Yr 3 – 49%; Yr 5 – 34%)
– WHY?

Year 3 Numeracy:
– Correct = B (60%)
– Most common incorrect = A (28%)
– WHY?

Year 7 Numeracy:
– Correct = B (55%)
– Most common incorrect = A (26%)
– WHY?
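The guessing-vs-misconception rule of thumb above can be sketched as a simple check on how concentrated the wrong answers are. The 60% threshold and the answer counts are illustrative assumptions, not part of the report:

```python
def classify_item(answer_counts, correct):
    """Classify an item's wrong-answer pattern, per the rule of thumb above.

    answer_counts: dict mapping option -> number of students choosing it.
    The 0.6 concentration threshold is an assumed, tunable cut-off."""
    wrong = {opt: n for opt, n in answer_counts.items() if opt != correct}
    total_wrong = sum(wrong.values())
    if total_wrong == 0:
        return "no errors"
    top_share = max(wrong.values()) / total_wrong
    # Errors piled on one option -> shared misconception; spread out -> guessing.
    return "common misconception" if top_share >= 0.6 else "guessing"

# Hypothetical counts shaped like the Year 3 Reading Q21 pattern above:
print(classify_item({"A": 5, "B": 36, "C": 49, "D": 10}, correct="B"))
# -> common misconception
print(classify_item({"A": 14, "B": 55, "C": 16, "D": 15}, correct="B"))
# -> guessing
```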
Zone of Proximal Development

The Student Response Report (Reading or Numeracy – Difficulty Order)
• Choose "Student Response Report".
• Choose "Reading or Numeracy – Difficulty Order".

The Student Response Report
[Figure: the report grid, with data sorted by item difficulty (increasing level of difficulty along one axis) and by student ability (increasing level of ability along the other).]
Zone of Proximal Development (Vygotsky)
[Figure: the grid of increasing student ability against increasing item difficulty, divided into "the known" (what students can already do independently), the zone of proximal development, and "the unknown" (what students are incapable of learning before prior concepts are taught).]
Summarising strengths and weaknesses
• Year 3 Reading – The Known: what students can already do independently.
• Year 3 Reading – The Unknown: what students are incapable of learning before prior concepts are taught.
• Year 3 Reading – The Zone of Proximal Development: what students are capable of learning with the assistance of explicit instruction from the teacher (scaffolding).
The Writing Criteria Report
• Choose "Writing Criteria Report".
• Click on "Preview Report" to view the results.

[Figures: the Writing Task, the Writing Criteria Report, and the Writing Marking Rubric.]
Interpreting the Writing Criteria Report
In this school, about 20% of students received a score of "1", but the majority (nearly 60%) received a score of "2". To improve, the teacher needs to move the 1s on to 2s, the 2s on to 3s, etc.

Assessment as Learning
Students writing like a "1" could be shown examples of how they are currently writing ("Dungaun", "The casel", "BMX", etc.) and examples of what is now expected of them to improve to a "2" ("Living dead", "Woodern box", etc.).
Another assessment as learning example
Using the "Paragraphing" rubric and the accompanying sample scripts, a student could be shown that their writing demonstrates no paragraphing. More importantly, the rubric shows what is expected next and gives examples that students could read to get an idea of what writing in paragraphs looks like.
The Student Response Report (Writing Test – by criteria)
• Choose "Student Response Report".
• Choose "Writing Test – by criteria".
The Student Response Report (Writing)