Transcript: Discussion of “2%” Modified Achievement Assessments
2% Modified Achievement Assessments Overview (C-5)
Ohio Department of Education, Office of Assessment
September 2008
Federal Rule on Modified Achievement Standards
• Known as the 2% Modified Assessment
• Allowances:
  o A modified assessment of academic content standards
  o Proficient and above scores from such an assessment may be used in AYP accountability calculations, capped at 2% of the total tested population (see the arithmetic sketch after this list)
• Similar to the 1% alternate achievement standards for students with cognitive disabilities, but with some significant differences
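A minimal arithmetic sketch of how the 2% cap might apply. The counts below are made up, and the treatment of excess scores follows the common reading that proficient scores above the cap count as non-proficient for AYP; this is an illustration, not guidance.

```python
# Hypothetical illustration of the 2% cap; all numbers are made up.
total_tested = 10_000                    # all students tested in the district
cap = int(0.02 * total_tested)           # at most 200 AA-MAS proficient scores may count
aa_mas_proficient = 260                  # students scoring proficient or above on the AA-MAS

counted_as_proficient = min(aa_mas_proficient, cap)                    # 200 count as proficient
counted_as_not_proficient = aa_mas_proficient - counted_as_proficient  # remaining 60

print(counted_as_proficient, counted_as_not_proficient)  # 200 60
```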
Final Federal Rule (April 2007)
• The intended population for this assessment is students on IEPs who are persistently low performing.
• Modified academic achievement standards must be based on a state’s grade-level academic content standards for the grade in which an eligible student with disabilities is enrolled.
• A state’s academic content standards are not what is modified. The expectations for whether a student has mastered those standards, however, may be less difficult than grade-level academic achievement standards.
(pp. 17748, 17749)
Who the Students Are
The student’s progress to date in response to appropriate instruction…, is such that, even if significant growth occurs, the IEP team is reasonably certain that the student will not achieve grade-level proficiency within the year covered by the student’s IEP.
(Section 200.1(e)(2)(ii))
(Two slides of material from Cook and Elliot, CCSSO Large-Scale Session, 2007)
Three-State Consortium
Minnesota, Ohio and Oregon are working with AIR on the awarded three-year GSEG grant.
The Goals for the Consortium
• Identify the appropriate IEP student group
• Identify the appropriate test question design, the test format and an administration procedure
• Develop 2% Modified Assessments for grades 5–8 in reading and mathematics
• Set modified achievement standards
• Provide professional development for the assessment
• Define eligibility guidelines for participation in the assessment
• Monitor implementation
State Outcomes from the Consortium
• Each state takes away modifications that will work in that state.
• Each state will move forward with the AA-MAS, using its own item-review process.
• Each state will determine its own blueprint for the AA-MAS.
• Each state will conduct its own standard setting.
• Each state will conduct its own alignment study.
Consortium Outcome for Ohio
Ohio Actions:
• Modify existing and write new items for the general education assessment for use on the AA-MAS
• Use the general education assessment blueprints for the AA-MAS
• Use the currently established review process for the items used on the AA-MAS
• Conduct alignment studies for each AA-MAS
• Conduct a standard setting resulting in modified achievement standards based on grade-level academic standards
First Year Tasks
Task 1. Conduct initial and subsequent meetings
Task 2. Define the student population, conduct data analysis, conduct focus groups, develop student profiles
Task 3. Draft Performance Level Descriptors
Task 4. Develop Eligibility Guidelines and Decision Guide
Task 5. Begin professional development plans for
  o Eligibility guidelines
  o Standards-based IEPs
Task 6. Select modification strategies, conduct literature reviews, define modification strategies, modify items and forms
Task 7. Conduct pilot tests for item modifications
Second Year Tasks
Task 1. Conduct subsequent meetings
Task 3. Create initial definitions of modified achievement standards, refine PLDs, draft definitions of modified achievement standards
Task 5. Implement professional development for
  o Eligibility guidelines
  o Standards-based IEPs
Task 7. Conduct pilot test of item modifications
Task 9. Produce standard-setting plans
Third Year Tasks
Task 1. Conduct subsequent meetings
Task 5. Implement professional development plans for
  o Instructional strategies
  o Monitoring
Task 7. Field test modifications to items, forms and administration, and validate the field tests
Task 8. Produce item maps for the operational test
Research
Marion, S. A Technical Design and Documentation Workbook for Assessments Based on Modified Achievement Standards.
KS, SD, OK and MD Department of Education Web sites.
Higgins, J., Russell, M., & Hoffmann, T. (2004). Examining the Effect of Computer-Based Passage Presentation on Reading Test Performance.
Miranda, H., Russell, M., & Hoffmann, T. (2004). Examining the Feasibility and Effect of a Computer-Based Read-Aloud Accommodation on Mathematics Test Performance.
Famularo, L., & Russell, M. (2007). Examining the Utility of a Prototype Assessment for Assessing Students in the Gaps.
Dolan, R., Murray, E., & Burling, K. (2007). Providing Students with Choice: An Exploratory Study on the Application of Universal Design Principles to Large-Scale Assessment of Students with Learning Disabilities and English Language Learners.
The Assessment and Accountability Comprehensive Center. Assessments Based on Modified Academic Achievement Standards: Critical Considerations and Implications for Implementation.
National Center on Educational Outcomes. (2007). The Assessment and Accountability Comprehensive Center: Special Populations Strand.
National Center on Educational Outcomes. (2007). A Seven-Step Process to Creating Standards-Based IEPs.
Initial Data Mining
Analyze the data to help with the following:
• Help define the student population
• Examine questions that work with the target population (still undefined)
Initial Data Mining Procedures
Data from the three states’ general education administrations were divided into four sets (a partitioning sketch follows this list):
• Persistently low performing
• Persistently low performing with IEPs
• Other students with IEPs
• General education students
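A minimal sketch of how such a partition might be coded, assuming a pandas DataFrame with one row per student and hypothetical boolean columns has_iep and persistently_low; the column names, the exact definition of persistent low performance, and the treatment of the four sets as disjoint are all assumptions, not the consortium’s definitions.

```python
import pandas as pd

# df is assumed to have one row per student, with hypothetical boolean
# columns 'has_iep' and 'persistently_low' (e.g., scored below proficient
# on the statewide assessment in consecutive prior years).
def split_into_four_groups(df: pd.DataFrame) -> dict[str, pd.DataFrame]:
    return {
        "persistently_low": df[df.persistently_low & ~df.has_iep],
        "persistently_low_iep": df[df.persistently_low & df.has_iep],
        "other_iep": df[~df.persistently_low & df.has_iep],
        "general_ed": df[~df.persistently_low & ~df.has_iep],
    }
```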
Initial Data Mining—Research Questions
• What items discriminate well for both IEP populations? (A discrimination sketch follows this list.)
• What are the characteristics of the items that discriminate well for the low-performing group?
• What is the breakdown by race/ethnicity/gender/SES for the low-performing group?
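One common way to look at the first two questions is a corrected item-total (point-biserial) correlation computed within each student group; a minimal sketch follows. It illustrates one classical discrimination index, not necessarily the analysis the consortium performed.

```python
import numpy as np

def point_biserial_by_item(responses: np.ndarray) -> np.ndarray:
    """Corrected item-total correlation for each item.

    responses: 0/1 matrix, shape (n_students, n_items).
    Each item score is correlated with the total on the remaining items.
    """
    n_items = responses.shape[1]
    disc = np.empty(n_items)
    for j in range(n_items):
        rest_total = responses.sum(axis=1) - responses[:, j]
        disc[j] = np.corrcoef(responses[:, j], rest_total)[0, 1]
    return disc

# Computed separately per group, e.g. for the persistently low performing with IEPs:
# disc_low_iep = point_biserial_by_item(X[group == "persistently_low_iep"])
```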
Initial Data Mining—Research Questions
• What are the disability classifications in the persistently low-performing groups?
• What happens when we include those students who top out of the AA-SWD?
• How do students perform when grouped by the amount of time spent in the regular classroom rather than by disability classification?
Stakeholder Input
• Focus groups
  o Collect input from the special education community who work with the target student population
  o Aggregate information across states
• Public survey
  o Eligibility guidelines
  o Decision flowchart
Goals for the Focus Group
• Begin defining the target population
  o Brief review of what the data show about low-performing students
  o Professional judgment about who these students are
• Review and suggest modifications to test questions
  o Review questions that work well with the persistently low-performing students
  o Suggest modifications to test questions
• Review and suggest formats for test booklets
• Suggest modifications to administration procedures
Requirements for Targeted Population
Students must meet the following minimum requirements (a minimal eligibility check appears after this list):
• Have IEPs
• Be identified as persistently low performing on the regular statewide assessments
• Receive on-grade-level instruction
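A minimal sketch of the three minimum requirements expressed as a check; the function and argument names are hypothetical, and meeting these requirements does not by itself make a student eligible; the flowchart and IEP-team judgment govern the actual decision.

```python
def meets_minimum_requirements(has_iep: bool,
                               persistently_low_on_statewide_tests: bool,
                               receives_on_grade_level_instruction: bool) -> bool:
    # All three minimum requirements from the slide must hold; this is a
    # necessary, not sufficient, condition for eligibility.
    return (has_iep
            and persistently_low_on_statewide_tests
            and receives_on_grade_level_instruction)
```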
Eligibility Decision Flowchart (handout)
Stakeholder Input—Online Survey
Results of the survey:
• Who responded
• Requirements for eligibility
• Training needs
Ohio Fall 2008 Pilot Goal
• Purpose: lower barriers that students with disabilities face in demonstrating achievement of grade-level skills and knowledge
• Two strategies combine to meet this goal:
  o First, the assessment incorporates modifications targeting specific characteristics commonly found in the target population
  o Second, performance on the assessment will be judged against modified achievement standards
Grade 7 Fall 2008 Pilot (Cont.)
• Two groups tested: target and general education students
• Each student takes Reading and Mathematics tests
• One consumable test booklet
• Seven forms, each with 42 multiple-choice questions
• Week of October 20, 2008, for Reading
• Week of October 27, 2008, for Mathematics
Grade 7 Fall 2008 Reading Pilot
Item modifications:
1. Simplify language where appropriate
2. Insert questions within a box at appropriate places in the passage
3. Present summary questions at the beginning and end of passages
4. Use bold print for key words or phrases
Grade 7 Fall 2008 Mathematics Pilot
Item modifications:
1. Simplify language where appropriate
2. Include relevant pictures, tables and graphics to replace text (this is different from enlarging or shrinking graphics)
3. Enlarge and/or shrink graphics
4. Bold key words
5. Use scaffolding: break multi-step items into individual steps, each with its own question
Caveats
• Not every item will be modified in every way.
• In cases where a modification is not possible, the unmodified version of the item for that block will be used.
• Examples might include:
  o An item with a one-step solution process may not be modified using scaffolding.
  o An item with no original graphic may not be modified by using enlarged or simplified graphics.
Analysis Questions for Fall 2008 Pilot
Do the modifications improve access? (An omit-rate comparison sketch follows this list.)
• The target group will omit fewer modified items
• The target group will omit more unmodified items
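A minimal sketch of the omit-rate comparison, assuming a response matrix in which omitted answers are coded as NaN and a boolean mask marking which items were presented in modified form; the coding scheme is an assumption.

```python
import numpy as np

def omit_rates(responses: np.ndarray, modified_mask: np.ndarray) -> tuple[float, float]:
    """Mean omit rate on modified vs. unmodified items.

    responses: item scores with np.nan for omitted items, shape (n_students, n_items).
    modified_mask: boolean vector of length n_items, True where the item was modified.
    """
    omitted = np.isnan(responses)
    modified_rate = omitted[:, modified_mask].mean()
    unmodified_rate = omitted[:, ~modified_mask].mean()
    return modified_rate, unmodified_rate

# Under the access hypothesis, the target group should show
# modified_rate < unmodified_rate.
```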
Analysis Questions for Grade 7 Fall 2008 Reading and Mathematics Pilot
Does performance exceed chance? (A chance-level test sketch follows this list.)
• The target group will guess less on modified items
• The target group will guess more on unmodified items
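A minimal sketch of a chance-level check using a one-sided binomial test; chance = 0.25 assumes four answer options per multiple-choice item, which is an assumption rather than a stated feature of the pilot forms.

```python
from scipy.stats import binomtest

def above_chance_pvalue(num_correct: int, num_attempted: int, chance: float = 0.25) -> float:
    """One-sided p-value for scoring above chance on attempted items.

    chance=0.25 assumes four answer options per item; a small p-value
    suggests performance exceeds guessing.
    """
    return binomtest(num_correct, num_attempted, chance, alternative="greater").pvalue

# Example: 18 correct out of 42 attempted items.
print(above_chance_pvalue(18, 42))
```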
Analysis Questions for Fall 2008 Pilot
• Does performance improve, and is the discrepancy in performance reduced?
• The target group will perform better on modified items than on unmodified items.
• Regular education students will perform about the same on both modified and unmodified items.
• Marginal maximum likelihood will be used for group comparisons rather than aggregating individual maximum likelihood estimates for each student.
• A multi-group IRT model that allows calibration across multiple groups will be used; it is a partial-credit extension of the Bock and Zimowski (1997) approach. (A simplified MML sketch follows this list.)
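A simplified sketch of the marginal maximum likelihood idea for a two-group comparison, using a dichotomous Rasch model with a group mean shift and fixed quadrature over a normal ability distribution. This only illustrates MML estimation; it is not the partial-credit multi-group model of Bock and Zimowski (1997) that the consortium plans to use.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp
from scipy.stats import norm

def fit_two_group_rasch_mml(X: np.ndarray, group: np.ndarray, n_quad: int = 21):
    """MML estimation of Rasch item difficulties plus a target-group mean shift.

    X: 0/1 response matrix, shape (n_students, n_items).
    group: 0 for general education students, 1 for the target group.
    The general education ability distribution is fixed at N(0, 1);
    only the target-group mean is estimated (variances fixed for simplicity).
    """
    nodes = np.linspace(-4.0, 4.0, n_quad)
    log_w = norm.logpdf(nodes)
    log_w -= logsumexp(log_w)                  # normalized quadrature weights (log scale)
    n_items = X.shape[1]

    def neg_marginal_loglik(params):
        b, mu1 = params[:n_items], params[n_items]
        total = 0.0
        for g, mu in ((0, 0.0), (1, mu1)):
            Xg = X[group == g]
            if Xg.shape[0] == 0:
                continue
            logit = (nodes + mu)[:, None] - b[None, :]     # (n_quad, n_items)
            log_p = -np.logaddexp(0.0, -logit)             # log P(correct | theta)
            log_q = -np.logaddexp(0.0, logit)              # log P(incorrect | theta)
            ll = Xg @ log_p.T + (1 - Xg) @ log_q.T         # (n_students_g, n_quad)
            total += logsumexp(ll + log_w, axis=1).sum()   # marginalize over ability
        return -total

    start = np.zeros(n_items + 1)                          # item difficulties and mu1
    return minimize(neg_marginal_loglik, start, method="L-BFGS-B")
```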
Grade 7 Spring 2009 Field Test
• Will be similar to the fall 2008 pilot
• Interested in participating?
• Contact Barry Lowry: [email protected]
• District name, IRN, school, and contact information
Spring 2009 Pilot—Reading Modifications
Item modifications:
1. Include relevant picture
2. Scaffolding: priming—asking understanding questions
3. Scaffolding: cueing with focusing questions
4. Scaffolding: graphic organizers
5. Scaffolding: provide thoughtful questions through passages focusing on summative items
Spring 2009 Pilot—Mathematics Modifications
Item modifications:
1. Include relevant pictures, tables and graphics that replace text
2. Scaffolding: priming—asking understanding questions
3. Group questions by content strand
Thank You.
Questions?