VGSD SLO PowerPoint - Rocky Grove High School
VALLEY GROVE SD
2014 ~ SLO Workshop
Student Learning Objectives
(SLO)
PPT SOURCE:
• Dr. Cathleen Cubelic
[email protected]
Our Objectives
Understand what an SLO is
Understand the process: Design, Build, & Review
Consider assessment quality and purpose
Examine Webb’s DOK in reference to assessment
Collaborate for implementation
Build the SLO on the template
Use online tools
SLO & Assessment Literacy
Pre-Test
Student Learning Objectives
MIND DUMP
What do you know?
What have you heard?
What have you researched?
Why are we doing this?
Anything else?
Student Learning Objectives
YOUR SLO…
…written specifically for you and a specific class/course/content area that you teach.
Every teacher designs one.
Collaborative development is encouraged.
Design, Build, Review – Repeat next year/cycle
In the interest of all students
To improve the program
Knowing that good teaching matters most
Teacher Effectiveness System in Act 82 of 2012

Teacher Observation & Practice, 50% (Effective 2013-2014 SY)
Danielson Framework Domains:
1. Planning and Preparation
2. Classroom Environment
3. Instruction
4. Professional Responsibilities

Building Level Data/School Performance Profile, 15% (Effective 2013-2014 SY)
Indicators of Academic Achievement
Indicators of Closing the Achievement Gap, All Students
Indicators of Closing the Achievement Gap, Subgroups
Academic Growth PVAAS
Other Academic Indicators
Credit for Advanced Achievement

Teacher Specific Data, 15%
PVAAS / Growth 3-Year Rolling Average:
1. 2013-2014 SY
2. 2014-2015 SY
3. 2015-2016 SY
Other data as provided in Act 82

Elective Data/SLOs, 20% (Optional 2013-2014 SY; Effective 2014-2015 SY)
District Designed Measures and Examinations
Nationally Recognized Standardized Tests
Industry Certification Examinations
Student Projects Pursuant to Local Requirements
Student Portfolios Pursuant to Local Requirements
Teacher Effectiveness System in Act 82 of 2012

Teacher Observation & Practice, 50% (Effective 2013-2014 SY)
Danielson Framework Domains:
1. Planning and Preparation
2. Classroom Environment
3. Instruction
4. Professional Responsibilities

Building Level Data/School Performance Profile, 15% (Effective 2013-2014 SY)
Indicators of Academic Achievement
Indicators of Closing the Achievement Gap, All Students
Indicators of Closing the Achievement Gap, Subgroups
Academic Growth PVAAS
Other Academic Indicators
Credit for Advanced Achievement

Elective Data/SLOs, 35% (Optional 2013-2014 SY; Effective 2014-2015 SY)
District Designed Measures and Examinations
Nationally Recognized Standardized Tests
Industry Certification Examinations
Student Projects Pursuant to Local Requirements
Student Portfolios Pursuant to Local Requirements
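The two breakdowns above differ only in how the non-observation portion is split: 20% Elective Data when Teacher Specific (PVAAS) data is available, 35% when it is not. As an illustration only, and not the official PDE 82-1 calculation, here is a minimal Python sketch of combining weighted components; the component scores are invented placeholders, and only the weights come from the slides above.

```python
# Illustrative sketch: weighted combination of evaluation components using the
# Act 82 percentages shown above. Component scores are made-up placeholders on
# an assumed common 0-3 scale; only the weights are taken from the slides.

def composite_score(scores, weights):
    """Weighted average of component scores; weights must sum to 1.0."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(scores[name] * weight for name, weight in weights.items())

weights_with_teacher_data = {"observation": 0.50, "building_level": 0.15,
                             "teacher_specific": 0.15, "elective": 0.20}
weights_without_teacher_data = {"observation": 0.50, "building_level": 0.15,
                                "elective": 0.35}

scores = {"observation": 2.6, "building_level": 2.0,
          "teacher_specific": 2.4, "elective": 3.0}

print(composite_score(scores, weights_with_teacher_data))     # 2.56
print(composite_score(scores, weights_without_teacher_data))  # 2.65
```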
The Rating Tool
PDE 82-1 (2014-15)
…PVAAS Rostering
The SLO in PA is written in relation to a specific teacher and a specific class/course/content area for which that teacher provides instruction.
“The PSSA test doesn’t completely measure my effectiveness.”

SLO CONCEPTS
STUDENT ACHIEVEMENT can be measured in ways that reflect authentic learning of content standards.
EDUCATOR EFFECTIVENESS can be measured through the use of student achievement measures.
SLO Definition
A process to document a measure of educator effectiveness based on student achievement of content standards.
SLO Process
The SLO process contains three (3) action components:
1. Design(ing): thinking, conceptualizing, organizing, discussing, researching
2. Build(ing): selecting, developing, sharing, completing
3. Review(ing): refining, checking, updating, editing, testing, finalizing
Student Learning Objectives
Components

Goal Statement – the “big idea” on which the SLO is based
  Endurance – Learning has worth beyond the assessment
  Leverage – Content has value across disciplines
  Readiness – Provides knowledge/skills necessary for success at future levels of instruction
Performance Measures – Assessments used to measure student achievement
Performance Indicators – Articulated targets for student achievement
Effectiveness Rating – Translation of the number of students meeting the Performance Indicators (How many met the target and what does that mean?)
Student Learning Objectives
Assessment Literacy

Inputs vs. Outputs

When we think about how we are changing education today, we are moving from a system that focuses on inputs to one that focuses on outputs. In an input world, what we care about for integrity of curriculum is making sure that all of our teachers are giving children exactly the same thing. This is a Betty Crocker curriculum. Betty Crocker has some fantastic recipes, and we want to make sure that the boxes of cake mix always produce the same outcome. That’s what education has been: you get a publisher, they say here are the resources, follow the instructions to the letter, and that is input integrity.

Assessment changes all that. Assessment is about output integrity. Did the kid learn what he needed to learn? How does that make it different? When we think about outputs, we have to change all those input factors. Betty Crocker doesn’t help us; the recipe isn’t the guide. The assessment tells us where we need to add a little salt, where we need a little sugar, and where we need to change what we’re making altogether. Formative assessment and summative assessment give us information about how successful we are, information we need to use in a different way to look at curriculum and instruction integrity and to build upon what we have done previously, adapting and changing in the name of improvement.
Student Learning Objectives
Assessment Literacy
What is RIGOR?

Rigor in the classroom:
“Rigor is creating an environment in which each student is expected to learn at high levels, each student is supported so that he or she can learn at high levels, and each student demonstrates learning at high levels.”
- Barbara Blackburn, 2008
Rigor can be accomplished by increasing the complexity of thinking in:
Course content – learning progressions and appropriately leveled text for challenge
Instruction – activities that promote critical thinking, communication building, applying integrated ideas, application of concepts, and promoting responsibility
Assessment – aligned to instructional targets, engages with academic content, requires extended and elaborated responses
Bloom’s Taxonomy: Old (1950s) vs. New (1990s)
HANDOUT: the laminated charts show you a comparison of BLOOM’S TAXONOMY with WEBB’S DEPTH OF KNOWLEDGE.
COMPARISON

BLOOM’S KEY POINTS:
• 6 levels
• Different sources list different verbs
• The same verbs appear as examples in more than one cognitive level
• This overlap indicates that focusing ONLY on verbs to determine the level of cognitive demand is not fully adequate.

WEBB’S KEY POINTS:
• The DOK is NOT determined by the verb (Bloom’s) but by the context in which the verb is used and the depth of thinking that is required.
• Names 4 different ways students interact with content.
• Each level is dependent upon how deeply students understand the content.
DOK is about what follows the verb...
What comes after the verb is more important than the verb itself.
“Analyze this sentence to decide if the commas have been used correctly” does not meet the criteria for high cognitive processing. The student who has been taught the rule for using commas is merely using the rule.
Same Verb – 3 Different DOK Levels
DOK 1 – Describe three characteristics of metamorphic rocks. (Requires simple recall)
DOK 2 – Describe the difference between metamorphic and igneous rocks. (Requires cognitive processing to determine the differences in the two rock types)
DOK 3 – Describe a model that you might use to represent the relationships that exist within the rock cycle. (Requires deep understanding of the rock cycle and a determination of how best to represent it)
DOK is about intended outcome, …not difficulty
DOK is a reference to the complexity of mental processing that must occur to answer a question, perform a task, or generate a product.
• Adding is a mental process.
• Knowing the rule for adding is the intended outcome that influences the DOK.
• Once someone learns the “rule” of how to add, 4 + 4 is DOK 1 and is also easy.
• Adding 4,678,895 + 9,578,885 is still DOK 1 but may be more “difficult.”
WEBB’S DOK RESOURCES
Online Search – tons of resources…
Laminated Charts – Webb’s vs. Bloom’s
Handout DOK #1 – Levels Described
Handout DOK #2 – Subject Area Info
Handout DOK #3 – Question Stems
Activity: Question Analysis
Math – Trip to the Capital
ELA – Women Poem
SLO Process Components
DESIGN
• Thinking about what content standards to measure
• Organizing standards and measures
• Discussing collective goals with colleagues
• Researching what is needed for a high-quality SLO
SLO Process Components
BUILD
• Selecting the performance measure(s)
• Developing targets and expectations
• Completing the template
• Sharing the draft materials with other colleagues
• Developing/Documenting performance task(s)
SLO Process Components
REVIEW
• Checking the drafted SLO (including the performance measures) for quality
• Refining measures and targets
• Editing text and preparing discussion points/highlights for the principal
• Finalizing materials
• Updating completed SLOs with performance data
Design
What is a Goal Statement?
Definition:
• Narrative articulating the “big idea” upon which the SLO is built and under which content standards are directly aligned.
Characteristics:
• ENDURANCE: Encompasses the “enduring understanding” of the standard…beyond the test
• LEVERAGE: Central to the content area…but has value in other disciplines
• READINESS: Foundational concepts for later subjects/courses…necessary to the next step
Goal Statement Example
• “Students will apply the concepts and the competencies of nutrition, eating habits, and safe food preparation techniques to overall health and wellness throughout the life cycle at individual, family, and societal levels.”
SLO Goal (Template #1)
Goal Statement addresses:
• WHAT the “big idea” is in the standards
Standards:
• HOW the skills and knowledge support future learning
Rationale Statement:
• WHY the “big idea” is a central, enduring concept
http://pdesas.org/standard/PACore
More Considerations for Goal Statements
Do you have previous data to help guide your goal?
What does your growth and achievement look like?
Is there a building/district-wide goal?
Activity:
Goal Statement (Template #1)
Within your team, choose a discipline on which you’d like to focus. Preferably, choose a discipline that is very familiar to you.
Complete “Template #1 Goal Statement”.
We will post them for the entire group.
Build
Template Section 1

Goal
The Goal statement should articulate an appropriate “big idea”.
http://pdesas.org/standard/PACore
The Standards should be the appropriate Focus Standards supporting the goal.
The Rationale statement should give the reasons why the Goal statement and the aligned Standards address important concepts for this class/course.
Template Section 2

Performance Indicator
Definition: a description of the expected level of student growth or achievement based on the performance measure.
***Articulates Targets for each Performance Measure***
Answers two questions:
1) Does the indicator define student success?
2) What is the specific measure linked to the indicator?
Examples of Performance Indicator Targets
Students will achieve Advanced or Proficient on all four criteria of the Data Analysis Project rubric.
Students will score an average of 3 or better on five different constructed-response questions regarding linear modeling, according to the general description of scoring guidelines (http://static.pdesas.org/Content/Documents/Keystone%20Scoring%20Guidelines%20-%20Algebra%20I.pdf).
Students will improve a minimum of 10 percentage points from pre- to post-test for material in each semester.
Students will show “significant improvement” in the Domain of Measurement on the Classroom Diagnostic Tools Mathematics Grade 7 assessment from the first to the last administration.
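To make a target like the 10-percentage-point pre/post improvement concrete, here is a small illustrative sketch (the student names and scores are hypothetical, not from the slides) that checks which students met that target:

```python
# Hypothetical pre-test and post-test percentages; the 10-point threshold
# comes from the example target above.
scores = {
    "Student A": (55, 70),
    "Student B": (62, 68),
    "Student C": (48, 60),
}

met_target = {name: (post - pre) >= 10 for name, (pre, post) in scores.items()}
percent_met = 100 * sum(met_target.values()) / len(met_target)

print(met_target)             # {'Student A': True, 'Student B': False, 'Student C': True}
print(f"{percent_met:.0f}%")  # 67%
```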
Performance Indicator – Focused Student Group
A description of the expected level of achievement for each student in a subset of the SLO population (1F) based on the scoring tools used for each performance measure (4A).
Subset populations can be identified through prior student achievement data or through content-specific pretest data.
Examples of Performance Indicator Targets: Focused Student Group
Students who scored below the 30th percentile on their benchmark AIMSweb R-CBM probe will score above the 30th percentile by the end of the school year using the national norms.
Students who scored below a 2 on the pre-test will improve a minimum of one level on the post-test.
SLO Design Coherency (diagram)
Goal Statement ~ Focus Standards → Performance Indicator(s) (All Students / Targeted Students) → Performance Measure(s) → RATING
Activity: Growth and Mastery
What assessments may be used as growth, mastery, or both? (diagram: Mastery / Growth)

What are the characteristics of a quality assessment? Write (3).
Report out the summary from your table.
Good assessments have……
• A specific and defined purpose
• A reasonable time limit for completion
• A mixture of question types
• An appropriate readability level
• Items/tasks with appropriate DOK levels
• Multiple methods of student demonstration
• Items/tasks that are Standards Aligned
• Validity and reliability
• A quality rubric
• A standardized scoring method
• Academic Rigor
• Well-written directions and administration guidelines
• Cut scores for performance categories
Academic Rigor
1. Standards-Aligned
2. Developmentally Appropriate
3. Focused on Higher-Order Thinking
Weighting, Linking, or Otherwise
1. Standard
You may consider each Performance Indicator equal in importance.
2. Linked
You may link multiple Performance Indicators, if you like. Do this for “pass before moving on” assessments.
3. Weighted
You may weight multiple Performance Indicators, if you like. Do this when you believe one or more PIs are more complex or more important than others.
Standard Scenario
Each Performance Indicator counts equally:
PI 1 – Building a Bridge Project: 68/80 students met target
PI 2 – Roller Coaster Design: 56/80 students met target
PI 3 – Egg Parachute: 40/80 students met target

Total = (68 + 56 + 40) / (80 + 80 + 80) = 164 / 240 ≈ 68.3%
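A minimal sketch of the standard (equal-importance) calculation above; the PI names and counts are taken from the scenario, and the arithmetic simply pools all students who met a target across all opportunities.

```python
# Standard scenario: every Performance Indicator counts equally, so the total
# is the pooled number of students meeting targets over the pooled opportunities.
met_target = {"Building a Bridge Project": 68,
              "Roller Coaster Design": 56,
              "Egg Parachute": 40}
students = {"Building a Bridge Project": 80,
            "Roller Coaster Design": 80,
            "Egg Parachute": 80}

total = sum(met_target.values()) / sum(students.values())
print(f"{total:.1%}")  # 68.3%
```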
Weighting Scenario
Physics Class with (3) PI targets:
PI 1 – Building a Bridge Project: weight 50%, 68/80 met target, points acquired 42.5
PI 2 – Roller Coaster Design: weight 25%, 56/80 met target, points acquired 17.5
PI 3 – Egg Parachute: weight 25%, 40/80 met target, points acquired 12.5

Total Score = 42.5 + 17.5 + 12.5 = 72.5%
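A companion sketch for the weighted case, using the weights and counts from the physics-class scenario above: each PI’s proportion of students meeting the target is scaled by its weight, and the resulting points are summed.

```python
# Weighting scenario: each PI's "proportion met target" is scaled by its weight.
pis = [
    ("PI 1 - Building a Bridge Project", 0.50, 68, 80),
    ("PI 2 - Roller Coaster Design",     0.25, 56, 80),
    ("PI 3 - Egg Parachute",             0.25, 40, 80),
]

total = 0.0
for name, weight, met, enrolled in pis:
    points = (met / enrolled) * weight * 100   # points acquired for this PI
    print(f"{name}: {points:.1f} points")
    total += points

print(f"Total Score = {total:.1f}%")  # 72.5%
```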
Template Section 3

Goal-Indicator-Measure (diagram)
Goal/Standards: SLO Goal (Big Idea)
Indicator #1 → Performance Measures: Assessment #1a, Assessment #1b
Indicator #2 → Performance Measure: Assessment #2

Goal-Indicator-Measure (diagram)
Goal/Standards: SLO Goal (Big Idea)
Indicator #1 → Performance Measure: Assessment #1
Indicator #2 → Performance Measure: Assessment #2
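The two diagrams above show that one SLO goal can fan out to multiple indicators, and that an indicator can be backed by one or more assessments. As a rough data-model sketch only (the class and field names are invented for illustration, not part of the PDE template), that hierarchy might look like:

```python
# Illustrative data model for the Goal -> Indicator -> Measure hierarchy in the
# diagrams above; names are invented for the sketch.
from dataclasses import dataclass, field

@dataclass
class PerformanceMeasure:
    name: str                      # e.g. "Assessment #1a"

@dataclass
class PerformanceIndicator:
    target: str                    # articulated target for student achievement
    measures: list = field(default_factory=list)   # one or more PerformanceMeasure

@dataclass
class SLOGoal:
    big_idea: str                  # the goal statement ("big idea")
    standards: list = field(default_factory=list)  # aligned Focus Standards
    indicators: list = field(default_factory=list) # one or more PerformanceIndicator

goal = SLOGoal(
    big_idea="Nutrition and wellness across the life cycle",
    standards=["<focus standard code>"],
    indicators=[PerformanceIndicator(
        target="Advanced/Proficient on all rubric criteria",
        measures=[PerformanceMeasure("Assessment #1a"),
                  PerformanceMeasure("Assessment #1b")])])
```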
Performance Measure - Descriptions
State the name of the assessment(s).
List the type of measure.
Explain the purpose: state what the Performance Measure should measure.
Identify the timeline and occurrence(s).
Scoring Tools should indicate the solution key, rubric, checklist, etc. that is being used to score the PM.
Administration & Scoring Personnel should identify who is giving the test and who is scoring it.
Performance Reporting should state how others will know which students met the Performance Indicator(s).
Template Section 4

Teacher Expectations
Definition: identifies, for each level (Failing, Needs Improvement, Proficient, Distinguished), the extent to which students are meeting the Performance Indicator Targets.
These reflect the continuum established by the teacher prior to the evaluation period.
Each level is populated with a percentage range so that there is a distribution of performance across levels.
Based on the actual performance across all identified Performance Indicators, the evaluator will determine one of the four levels for the SLO.
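For illustration only, a sketch of how a teacher-defined continuum might translate the percentage of students meeting the Performance Indicator targets into one of the four levels. The cut points below are hypothetical placeholders; an actual SLO uses the percentage ranges the teacher documents in the template before the evaluation period.

```python
# Hypothetical percentage ranges for the four levels; these are placeholders,
# not values from the PDE template.
LEVELS = [
    (90.0, "Distinguished"),
    (75.0, "Proficient"),
    (60.0, "Needs Improvement"),
    (0.0,  "Failing"),
]

def slo_level(percent_meeting_targets):
    """Return the first level whose lower bound the observed percentage reaches."""
    for cutoff, level in LEVELS:
        if percent_meeting_targets >= cutoff:
            return level
    return "Failing"

print(slo_level(72.5))  # "Needs Improvement" under these placeholder cut points
```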
Template Section 5
Review
Tools for Review
SLO Coherency Rubric
School Leader’s SLO Checklist
Assessment QA Checklist
The Online Tool
http://www.pdesas.org/
Use the Homeroom link at bottom right
Click the RIA Homeroom site link in the top paragraph
Register and log in.
SLO & Assessment Literacy
Post-Test