Setting the Goals and Knowing the Improvement of All Students


Progress Monitoring

Of course we also already have:
- Classroom assessments
- End-of-unit tests
- Homework completion info
- Attendance records
- Discipline referrals


We have tons of data – but that doesn't mean we have information that informs.

Much of our data is not scientifically based and cannot be compared.
Testing vs. Measurement

Both assess, but the difference is based on purpose:
- Testing judges (summative)
- Measurement informs (formative)
Data That Judges vs. Data That Informs

A basketball team and a school have parallel roles:
- Fans / Parents and community
- Players / Students
- Owner / Administration
- Coach / Teacher

Data Set "A" (data that judges): the won/lost record for the team, grades for the school.
Data Set "B" (data that informs): team and individual statistics for the team, DIBELS for the school.
Features of Effective Instruction

Use data that INFORMS for:
- Grouping
- Planning instruction
- Delivering targeted instruction and intervention to address students' instructional needs
- Monitoring student progress toward grade-level standards/benchmarks
What makes it a Core/Basic Skill?

- Predictive of later achievement
- Something we can do something about…we can teach it
- Something that improves outcomes for students if we teach it
Steps for Successful Readers (Roland Good)

[Figure: a staircase of early literacy skills – Phonemic Awareness (Spring, Kdg) → Alphabetic Principle (Winter, 1st) → Fluency with Connected Text (Spring, 1st) → Fluency with Connected Text (Spring, 2nd) → Fluency with Connected Text (Spring, 3rd). Each transition is labeled with an On-Track probability (for students who met the prior benchmark) and a Catch-Up probability (for students who did not): On-Track values of .81 (n=196), .83 (n=246), .86 (n=138), and .64 (n=348); Catch-Up values of .06 (n=213), .03 (n=114), .22 (n=180), and .17 (n=183). We need to have the odds with us!]

Probability of remaining an average reader in fourth grade when an average reader in first grade is .87.
Probability of remaining a poor reader at the end of fourth grade when a poor reader at the end of first grade is .88 (Juel, 1988).
For Data To Be Useful

Assessment must be:
- Reliable
- Valid
- Efficient
DIBELS Oral Reading

- Student reads aloud for 1 minute from each of 3 separate reading passages
- While the student reads, the examiner marks errors
- Calculate the number of correctly read words (CRW) per minute and the number of errors
- The median score is used as the student's reading rate (see the sketch below)

(There are also pre-reading measures.)
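To make the scoring arithmetic concrete, here is a minimal sketch in Python (not part of the presentation); the passage counts and error counts are hypothetical.

    from statistics import median

    def orf_score(passages):
        """Compute an oral-reading-fluency score as described above.

        `passages` is a list of (words_attempted, errors) tuples, one per
        one-minute passage. Words correct per minute (WCPM) for each passage
        is words attempted minus errors; the median across the passages is
        reported as the student's reading rate.
        """
        wcpm = [attempted - errors for attempted, errors in passages]
        return median(wcpm), wcpm

    # Hypothetical example: three 1-minute passages for one student.
    rate, per_passage = orf_score([(68, 4), (72, 3), (61, 5)])
    print(per_passage)  # [64, 69, 56]
    print(rate)         # 64 (median WCPM used as the reading rate)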
DIBELS is used:
- To identify at-risk students who may need additional services
- To help teachers plan more effective instruction within their classrooms
- To help teachers design more effective instructional programs for students who don't respond to the general education program
- To document student progress for accountability purposes
- To communicate with parents or other professionals about students' progress
DIBELS

- Current level of performance is measured
- Goals are identified
- Progress is measured on a regular basis (weekly or monthly); expected versus actual rates of learning are compared (see the sketch below)
- Based on these measurements, teaching is adjusted as needed
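Here is a minimal sketch of the "expected versus actual rates of learning" comparison. The baseline, goal, and weekly scores are hypothetical, and a simple goal-line slope versus observed growth is just one common way to operationalize the comparison, not something prescribed by DIBELS.

    def expected_rate(baseline, goal, weeks):
        """WCPM growth per week needed to reach the goal (the aimline slope)."""
        return (goal - baseline) / weeks

    def actual_rate(scores):
        """Average weekly change across consecutive progress-monitoring scores."""
        return (scores[-1] - scores[0]) / (len(scores) - 1)

    # Hypothetical 2nd grader: baseline 30 WCPM, end-of-year goal 90 WCPM in 30 weeks.
    needed = expected_rate(baseline=30, goal=90, weeks=30)   # 2.0 WCPM per week
    observed = actual_rate([30, 31, 33, 34, 36, 37])         # 1.4 WCPM per week

    print(f"expected {needed:.1f} vs. actual {observed:.1f} WCPM/week")
    if observed < needed:
        print("Actual growth is below the goal line, so instruction is adjusted.")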
Taking it a step further

Using assessment to develop interventions:
- Survey assessments
- Teaching students to use it for Peer-Assisted Learning
- and more…

Top-Down Processing

1st Phase: School-wide and grade-level team level
2nd Phase: Classroom or special group level
3rd Phase: Individual student level
Data-Driven Instructional Decision-Making

Involves using assessment data to determine your school's current status:
- What's working?
- What's not working?
- How did different sub-groups (economically disadvantaged, racial and ethnic groups, students with disabilities or with limited English proficiency) score?
- What actions are needed to improve classroom instruction and student outcomes?
Do you know where you’re going?
Grade Level Analysis

This should be accomplished through grade-level meetings:
- Teachers and staff need time to look at the data and make decisions
- It helps to have a facilitator and an agenda
- Focus on the data

Questions to Ask:
- What percentage of students will be at benchmark at the next school-wide assessment?
- What will you do to be sure all students are instructed at their level?
2nd Grade Mid Year 2006-2007
71% = Low Risk (31 students)
13% = Some Risk (7 students)
15% = At Risk (8 students)
Grouping Form

Classroom #1
- Intensive: NONE!
- Strategic: Lizzy 54, Travis 55, Mandy 59, Greg 64, Henry 64, Jarod 65
- Benchmark: 21 students, scores ranging from 74 to 152 words read correctly

Classroom #2
- Intensive: Randy 8, Josh 10, Paul 11, Marsha 30, Carrie 30, Joey 31, Ross 49, Betsy 50
- Strategic: Nakia 67
- Benchmark: 16 students, scores ranging from 74 to 176 words read correctly
- Teachers determine needs
- Interventions are chosen
- Additional problem-solving happens as needed

** Remember: we're looking across the grade level. How can we combine kids and combine our effectiveness?
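For illustration, a grouping form like the one above can be generated mechanically from each student's words-correct-per-minute score. The cut points below (52 and 68 WCPM for mid-year 2nd grade) are my assumption, chosen only to be consistent with the scores shown; the actual DIBELS benchmark goals come from the published tables, and the function and variable names are mine.

    # Hypothetical cut points consistent with the scores above (mid-year 2nd grade).
    SOME_RISK_CUT = 52   # below this: intensive / at risk
    BENCHMARK_CUT = 68   # at or above this: benchmark / low risk

    def tier(wcpm):
        """Assign an instructional support tier from a words-correct-per-minute score."""
        if wcpm < SOME_RISK_CUT:
            return "Intensive"
        if wcpm < BENCHMARK_CUT:
            return "Strategic"
        return "Benchmark"

    # Scores from Classroom #1 above (named students only).
    classroom_1 = {"Lizzy": 54, "Travis": 55, "Mandy": 59, "Greg": 64,
                   "Henry": 64, "Jarod": 65}

    groups = {}
    for name, score in classroom_1.items():
        groups.setdefault(tier(score), []).append(name)
    print(groups)  # {'Strategic': ['Lizzy', 'Travis', 'Mandy', 'Greg', 'Henry', 'Jarod']}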
2nd Grade Problem Solving

- Benchmark: core program
- Strategic and higher-level intensive students: teacher-directed PALS, Read Naturally
- Intensive students (grouped with 1st grade students): Reading Mastery, Read Well, SIPPS
2nd Grade Problem Solving

Classroom #2:
- Additional paraprofessional time
- Additional behavior intervention time from the social worker
Need to Watch the Progress

- Teachers discuss at monthly grade-level meetings what is working and what is not
- Return to the data after each benchmarking and make decisions
Beginning of the Year vs. Middle of the Year Data

- Benchmark: 38% (n=22) at the beginning of the year, 56% (n=30) at mid-year
- Some Risk: 22% (n=13) at the beginning of the year, 26% (n=15) at mid-year
- At Risk: 40% (n=23) at the beginning of the year, 22% (n=13) at mid-year

Effectiveness Graph
Top-Down Processing

1st Phase: School-wide and grade-level team level
2nd Phase: Classroom or special group level
3rd Phase: Individual student level
Classroom or Special Group Analysis

- Are there certain student groups that are not making progress?
- Is there a certain tier or a certain population that is not making gains?
- How are our ESL students doing?
Individual Student Level

Intervention:
- Intensive Individual Interventions: individual students, assessment based, high intensity, of longer duration
- Targeted Group Interventions: some students (at risk), high efficiency, rapid response
- Universal Programming: all students, preventative, proactive

Progress Monitoring: Is this student making adequate progress?
- Intensive: bi-weekly
- Strategic: monthly
- Benchmarking: 3 times a year
(A small scheduling sketch follows below.)
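A minimal sketch, assuming the intervals above, of how a monitoring calendar could be kept; the interval lengths (two weeks, four weeks, twelve weeks) and names are my approximations of "bi-weekly," "monthly," and "3 times a year."

    from datetime import date, timedelta

    # Hypothetical monitoring intervals implied by the schedule above.
    INTERVALS = {
        "Intensive": timedelta(weeks=2),   # bi-weekly
        "Strategic": timedelta(weeks=4),   # roughly monthly
        "Benchmark": timedelta(weeks=12),  # about 3 times in a school year
    }

    def next_check(tier, last_check):
        """Date of the next progress-monitoring probe for a student in `tier`."""
        return last_check + INTERVALS[tier]

    print(next_check("Intensive", date(2007, 1, 8)))  # 2007-01-22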
Big ideas of reading and the measures that track them:
- Phonological Awareness: Initial Sound Fluency, Phoneme Segmentation Fluency
- Alphabetic Principle: Nonsense Word Fluency
- Accuracy and Fluency with Connected Text: Oral Reading Fluency
- Vocabulary and Language Development
- Reading Comprehension: ORF, teacher-made assessments/observations
What kind of progress can we really expect?

How?
- Use research-based programs
- Set ambitious goals
- Track progress
- Make changes when needed

Benefits of Progress Monitoring

- Clear visual representation of progress
- Common understanding between teachers, parents, psychologists, and administrators
- Students can track/follow their own progress

** Increases Communication **

- Evaluates success of programs
Best Practice

- Monitor students at grade level as often as possible
- Monitor out of grade level when you need to have better information for decision making
General Guidelines

- Students should be at about the 20th percentile to be monitored at that level (see the sketch below)
- Once students begin to reach the goal for that grade level, move up
- When moving up to the next level, get 2-3 data points at both levels so you can continue to watch the student's trend

http://brt.uoregon.edu/techreports/ORF_90Yrs_Intro_TechRpt33.pdf
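A minimal sketch of the decision rule above; the grade, score, and 20th-percentile value are hypothetical, and the percentile cut would in practice come from published ORF norms such as the technical report linked above.

    def monitoring_level(current_grade, grade_level_score, percentile_20_score):
        """Pick grade-level or below-grade material for progress monitoring.

        Guideline: monitor at grade level when possible; drop below grade level
        only when the student is below roughly the 20th percentile for that grade.
        Once the student reaches the goal, move back up, collecting 2-3 data
        points at both levels to keep the trend readable.
        """
        if grade_level_score >= percentile_20_score:
            return current_grade              # monitor in grade-level passages
        return max(current_grade - 1, 1)      # monitor one grade below

    # Hypothetical: a 4th grader reading 55 WCPM where the 20th percentile is 68 WCPM.
    print(monitoring_level(current_grade=4, grade_level_score=55, percentile_20_score=68))  # 3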
Out of Grade Level Student Example:
What teachers have to say:

"I love the chance to work with kids in reading one on one. It's exciting to see the progress in individuals. The kids are proud and are really shining!"
- Sue Craig, 4th grade teacher

"I like the uniformity and consistency of the test. It's great to watch the kids' abilities grow! For those that aren't growing – helps with intervention strategies."
- Coleen Vader, 5th grade teacher

"I don't mind the process of progress monitoring. I do mind the huge differences in the oral reading fluency passages. It makes the 'progress' go up and down, up and down."
- Jane Mazei, Title One teacher
What teachers have to say:

"DIBELS is easy and quick to administer. The graphs are very helpful and parent friendly."
- Denise Cardenas, 1st grade teacher

"I really like the one minute DIBELS assessments. They give me a quick indication of where my students are at and what areas they need to work on yet."
- Shelly Modreske, 1st grade teacher

"The one-on-one approach to testing is a wonderful tool in monitoring students. I really enjoy that aspect of this program and also the excitement on my students' faces when they see their great progress!"
- Anonymous
Barriers to Data Use

- Lack of training in data use
- No uniform data collection
- Lack of leadership at the school and district level
- Outdated technology
- Unclear priorities and goals
- Lack of teamwork
- Distrust of data use
"What gets measured gets done."
- Peters, 1987
One thing in common

In all these successful programs, one strategic requirement emerges: the teacher is "the essential force for improving student achievement."
- Solmon & Schiff, "Talented Teachers," 2004