Assessment Strategies in the Music
Classroom
Dr. Phillip M. Hash, Calvin College
[email protected]
November 10, 2014
Overview of AM Workshop
1. Current Trends
2. Evaluation Strategies
3. Assessment Resources
4. Assessment Basics
5. Assessment Strategies
   1. Performance
   2. Psychometric Tests
   3. Festival Ratings
6. Your Experience
Current Trends in MI Teacher
Evaluation
EVALUATION IMPROVEMENTS IN
HB 5223 & HB 5224
• Student growth must be determined by 2 or more
measures
• All similarly situated teachers must be evaluated on
same measurements & assessments
• 2014-15 – growth data = 25% (vs. 40%) of evaluation
• 2015-16/2016-17 – growth data = 25% (12.5% state
tests & 12.5% local measures for teachers of tested
subjects)
• 2017-18 – growth data = 40% (20% state tests &
20% local measures for teachers of tested subjects)
Tools Used in Local Evaluation
of Teachers (2012-13)
Current Trends:
Effectiveness Ratings for 2011-13
Assessment Practices
MSBOA Informal Survey (2012)
• N = 76 MI School Band & Orchestra Teachers
• Number of Growth Assessments Administered to Students
  – Unspecified (n = 3)
  – 1 (n = 41)
  – 2 (n = 21)
  – 3 (n = 10)
• Types of Assessments by Teacher
  – Individual Performance (n = 40)
  – Group Performance (n = 31)
  – Psychometric Test (n = 31)
  – Building Measures (n = 5)
  – Composition (n = 1)
  – Student Reflection (n = 5)
MSBOA Informal Survey (2012)
• Additional Observations
– Scope of all assessments
varied widely
– Some districts require
psychometric tests
– Many teachers are utilizing
technology such as
SmartMusic, GarageBand,
Audacity, Data Director,
smartphones, iPads, etc.
– Survey available on MSBOA
website
Ensuring Integrity
• Demonstrate validity &
reliability
• Demonstrate connection
b/w state standards and
assessments
• Explain/demonstrate
process for creating,
administering, & grading
• Archive recordings & other
student work
Evaluation Strategies
Basic Principles
• Assessment = Art of the Possible
• Growth vs. Achievement
• Multiple Measures (3?)
• Should NOT Dominate
  – What do you already do?
• Meaningful & Useful vs. “Hoops”
• Individual Student Progress
• Skills & Concepts vs. “the piece”
• Not Necessarily for a Grade
• Consistent across District
Evaluation Strategies
• Always have lesson plans connecting to standards
– See MI GLCE
– Incorporate as many standards as make sense for your class – but not
just perform and read notation
• Study the evaluation form
• Plan lessons using evaluation rubric as a guide
• Be prepared to provide evidence of instructional & professional practices
– Student work, rubrics, lesson plans, parent call log, etc.
• Use a variety of instructional practices.
• Focus on student engagement.
• Don’t try to put on a show for the evaluator
• [Is it time to reconsider the number of performances per year??]
NAfME Evaluation Workbooks
• Philosophical Premise
– “Good music teacher evaluation is
not only about valid & reliable
summative evaluation, but it is also
about quality formative professional
development.”
• “[Intended] to provide a helpful
tool to music educators,
principals and/or supervisors
engaged in the entire process of
professional development. It
should be used as a guide to
personal reflection and
improvement.”
• Part 1: Instruction
Manual
• Part 2: Ensemble
Teacher Evaluation
Summary Form: Criteria
for Evaluation (based on
Danielson)
• Part 3: Evaluation
Worksheets
• Appendix - Resources
Danielson Example 1e
NAfME Workbook Example
Secondary 1e
Danielson & NAfME (GM) 1f –
Designing Student Assessments
Developing Local
Assessment Strategies
Creating an Assessment Plan
• District Music Faculty (by area)
– Est. curriculum based on MI Standards
• What should students in each grade level know and be
able to do?
• How and when will objectives be assessed?
– Perhaps not every grade every year
• How will assessments show growth? (e.g., difference in
% b/w pre- post test, defined NP, PP, P, HP?)
• Take plan to administration for approval
– The law says this must be done “with the involvement of teachers”
• Pilot, Review, Revise, Implement
MI Grade Level Content Expectations
(June 2011)
• What students should know
and be able to do in grades
K-8, & HS
• Aligned w/ VPAA & 21st
century skills
• Standards & benchmarks
by grade level
• Teachers evaluated on use
of standards
• [See handout]
Assessment Terms
• Reliability = Consistency
– Test/retest (regardless of yr., location, etc.)
– Interrater (every judge the same)
• Validity = the extent to which an assessment measures what
it purports to measure
• Authentic Assessment = Students demonstrate knowledge
and skills in real-world context (e.g., performance)
• Quantitative – data is numerical (anything that can be
counted, percentages)
• Qualitative – data is in words (descriptions, written critiques)
• Formative vs. Summative – practice vs. final
• Formal vs. Informal - Planned & produced vs. on the spot
Assessment Terms - RTTT
• Rigorous
– assessments that measure grade-level standards
• Two points in time
– pre- & post-test
– proficiency from one year to the next
– ongoing assessments of musical skills (steady beat, pitch
matching, singing, recorder, instrumental performance,
sight-reading, etc.)
• Comparable across classrooms
– same for all teachers at a particular level or area
– assessments comparable in rigor to other subjects
Resources
Wendy Barden (Kjos)
Paul Kimpton (GIA)
www.vocaroo.com
• Audio emails
• Archived up to 5
months
• Sends link to an email
address
• Download as .WAV or
.Ogg
• Useful for performance
tests
• Very easy!
• http://vocaroo.com/?media=vAdx5RJr1DVC7upIc
SmartMusic©
• Interactive practice and assessment tool
• Extensive library
• Create, send, and grade assignments
• Students record performance and submit the grade (%), assessment screenshot, and recording
• Correct notes and rhythms shown in green; incorrect in red
• Accuracy of notes and rhythms only
• Most objective
• Educator = $140; Student = $40
Rubistar
http://rubistar.4teachers.org/
• Create rubrics using
existing descriptors
• Search other teachers’
rubrics for samples
– Edit to fit your needs
Student Growth Measures
Checklists
1. Define activity or task (e.g.,
students will sing “Brother
John” on pitch)
2. Define criterion (student
sings on pitch)
3. Conduct the Assessment:
– Scale =
• Yes (+ or 2)
• Sometimes (1 or *)
• No (0 or -)
– Embedded into instruction
Student’s Name | Singing on Pitch “Brother John” | Maintaining Steady Beat w/ Orff Accompaniment
               | Trial 1 | Trial 2               | Trial 1 | Trial 2
John           |         |                       |         |
Bill           |         |                       |         |
Susan          |         |                       |         |
Sherri         |         |                       |         |
Damon          |         |                       |         |
Etc.           |         |                       |         |
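The checklist procedure above (define the task, define the criterion, rate each trial Yes/Sometimes/No) can be sketched in a few lines of Python. This is a hypothetical illustration; the student names and the 2/1/0 scale follow the slide's examples.

```python
# Minimal sketch of checklist scoring: one task, one criterion,
# observations rated Yes / Sometimes / No and mapped to 2 / 1 / 0.
SCALE = {"yes": 2, "sometimes": 1, "no": 0}

# One list of observations per student, one entry per trial
# (e.g., singing "Brother John" on pitch).
checklist = {
    "John": ["yes", "sometimes"],
    "Bill": ["no", "yes"],
}

def trial_scores(student):
    """Convert a student's recorded observations to numeric scores."""
    return [SCALE[obs] for obs in checklist[student]]
```

Because the scoring is embedded in instruction, a teacher could fill in `checklist` live during the activity and convert to numbers afterward.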
The Systemic Assessment:
Maintaining Vocal Independence in a 2- or 3-part
Vocal Context
The Activity/Task: The students will sing “Inanaya” in 2 or 3 parts,
each maintaining her/his own voice part independently.
Criterion: The student maintains her/his own part independently in a
multiple part context.
Assessment:
Inform the students that you’ll be observing their performance of Inanaya
and keeping track of who is maintaining their part in the harmony and
who is not. Describe the scoring procedure to the students, and ask for
questions.
Multilevel, single criterion scoring procedure:
“+” maintains vocal independence consistently
“~” vocal independence is inconsistent
“|” does not maintain independence
Dr. Tim Brophy – Univ. of FL
Sample Data Collection Instrument –
Vocal Independence Data
Date/Assessment                         Jimmy   Sherree   Ida   LeDarrius
1/15/13 – Vocal Independence, 3 parts     |        |       +        ~
1/22/13 – Vocal Independence, 3 parts     +        ~       +        +
Rubrics
• Types include:
– Holistic (overall performance)
– Analytic (specific dimensions of performance)
– Additive (yes/no)
• Descriptors must be valid (meaningful)
• Scores
– Must be reliable (consistent)
– Should relate to actual levels of student learning
• Can be used by students for self-assessment and
to assess the performance of other students
• Give to students b/f assessment
What does a rubric look like?
TONE

Beginning:  Breathy; unclear; lacks focus; unsupported
Basic:      Inconsistent; beginning to be centered and clear; breath support needs improvement
Proficient: Consistent breath support; centered and clear; beginning to be resonant
Advanced:   Resonant; centered; vibrant; projecting
Features:
• Scale includes rating points (at least 4).
• See next slide & handout for sample headings
• Highest point represents exemplary performance
• Criterion-based categories (3-5 work best)
• Descriptors are provided for each level of student performance
• Pre- and/or Post-test. Teacher, peer, & self assessment
Adapted from: K. Dirth, Instituting Portfolio Assessment in
Performing Ensembles, NYSSMA Winter Conference, Dec. 2, 1997.
Constructive Rubric Headings
Recorder Karate Rubric
Pennsbury School District
Fallsington, Pennsylvania 19058
Students earn a belt if they can perform the given song at a level 4 or above.

Holistic Rubric:
5 – Student plays with good tone and very few mistakes in pitches and rhythm
4 – Student plays with good tone and a few mistakes in pitches or rhythm
3 – Student plays with acceptable tone and several mistakes in pitches or rhythm
2 – Student plays with acceptable tone and many mistakes in pitches, rhythm, or fingering a particular note
1 – Student plays with poor tone, many mistakes in pitches and rhythm, many stops and starts, and/or seems very unsure of fingerings
Piano Rubric - Analytic
Quiz #1
Scales – Two octaves, hands together, ascending and descending
Keys: ____________

Scale: 1 point = Not Yet Successful; 2 points = Developing; 3 points = Satisfactorily Successful; 4 points = Highly Successful

Fluency
  1 – Student performs with many hesitations and quite a few mistakes
  2 – Student performs with few hesitations and mistakes
  3 – Student performs with almost no hesitations and very few mistakes
  4 – Student performs with no hesitations or mistakes

Correct Fingering
  1 – Student performs with mostly incorrect fingerings
  2 – Student performs with some incorrect fingerings
  3 – Student performs with mostly correct fingerings
  4 – Student performs with correct fingerings

Tempo
  1 – Student performs at a very slow tempo (Largo)
  2 – Student performs at a somewhat slow tempo (Adagio)
  3 – Student performs at a moderate tempo (Andante)
  4 – Student performs at a fast tempo (Allegro)

Score: ____
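An analytic rubric like the piano example produces one 1-4 rating per criterion, summed to a total score. A minimal scoring sketch (the criterion names here are illustrative, not from a published rubric):

```python
# Sum 1-4 ratings across an analytic rubric's criteria (max = 4 per criterion).
CRITERIA = ("fluency", "fingering", "tempo")

def total_score(ratings):
    """Total an analytic-rubric score from per-criterion ratings (1-4 each)."""
    assert set(ratings) == set(CRITERIA), "rate every criterion exactly once"
    return sum(ratings[c] for c in CRITERIA)
```

For example, a student rated 3 on fluency, 4 on fingering, and 2 on tempo totals 9 of a possible 12.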
Sample Rating Scale vs. Analytic Rubric
Additive
Rubric
Showing Growth w/ Rubrics
(or any other pre- post-test)
• Pre- & post-test
• Average class posttest % − average class pretest % = % growth

  Post   Pre    % growth
  67     57     10
  79     65     14
  59     32     27
  90     80     10
  82     72     10
  58     45     13
  ----------------------
  72.5   58.5   14
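The growth arithmetic above, worked with the sample class scores from the table (a minimal sketch):

```python
# Pre-/post-test growth: average class posttest % minus average pretest %.
pre  = [57, 65, 32, 80, 72, 45]
post = [67, 79, 59, 90, 82, 58]

avg_pre  = sum(pre) / len(pre)     # 58.5
avg_post = sum(post) / len(post)   # 72.5
class_growth = avg_post - avg_pre  # 14.0 percentage points

# Per-student growth, as in the table's rightmost column:
per_student = [b - a for a, b in zip(pre, post)]  # [10, 14, 27, 10, 10, 13]
```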
Est. Personal Reliability
• Record 10 students
• Grade w/ rubric
• Grade again in 2 weeks
• Measure the difference in score for each recording
• Calculate average difference
• Lower = better

  Trial 1   Trial 2   Difference
  9         9         0
  6         7         1
  8         6         2
  11        10        1
  9         7         2
  12        10        2
  4         4         0
  Av. Diff. = 1.14
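The same check as code, using the seven sample score pairs from the table (a sketch of the intrarater consistency check described above, not a formal reliability statistic):

```python
# Intrarater check: grade recordings twice, two weeks apart,
# then average the absolute score differences (lower = better).
trial_1 = [9, 6, 8, 11, 9, 12, 4]
trial_2 = [9, 7, 6, 10, 7, 10, 4]

diffs = [abs(a - b) for a, b in zip(trial_1, trial_2)]  # [0, 1, 2, 1, 2, 2, 0]
avg_diff = sum(diffs) / len(diffs)                      # ~1.14
```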
Rate these 6 recorder performances on a scale of 1-12
Rate the same examples using rubric in handout
Trial 1
1 _____
2 _____
3 _____
4 _____
5 _____
6 _____
Recorder Trial 2
• Use rubric
• Training
– Procedures
– Definitions
• Add up score
• Match score from Trial 1
to Scores from Trial 2
• Is there a difference?
• In which scores are you
most confident?
Progressive Curricula – Levels of
Achievement
• Jason Lowe – Beal City Public Schools (MS &
HS examples)
– Fundamental (MS)/Comprehensive (HS)
Musicianship Battery
– http://bealcitybands.weebly.com/ or
http://pmhmusic.weebly.com
• MSBOA Proficiency Levels (only 3)
• ASBDA Curriculum Guide (pub. by Alfred)
• Royal Conservatory Music Development
Program
RCMDP Syllabi Components (10-11 levels)
http://www.musicdevelopmentprogram.org/
• Repertoire (a & b lists)
• Technical Req. (scales, arpeggios)
• Ear Training
– Intervals
– Clapback
– Playback
• Sight reading
• Theory & History Tests are available
• Adapt as needed
Royal Conservatory
Music Development Program
(see handout)
• Recorder, strings, woodwinds, brass, percussion, voice
• Includes solos, etudes, scales/arpeggios, ear training, sight reading, theory
• Graded preparatory, 1-10
• Curricula online
  – RC Grade 8 considered college entrance
• Adapt for your program
[Refer to HS Orchestra Example & “Strategic Testing” article in Handout]
PSYCHOMETRIC TESTS
Uses
• Theory
• History
• Listen
• Analyze
• Describe
• Evaluate
Psychometric Tests
Eimer (2007) [See sample HS orch. exam in handout]
• Goal = Test Clarity &
Reduced Anxiety
• Give study guide
• Same basic format and
scoring for every test
• Reasonable length
• No clues w/in the test
• Test important
information/concepts
• Avoid T/F
– Unreliable
• Matching
– Only facts
– No more than 10 per set
– Same type/topic for each
set
– Let student know how many
times to use an answer
Multiple Choice
• Incomplete sentence (stem) w/ clear answer & 2-3 distractors
• Match grammar b/w stem & choices
• Choices alpha/numerical
• Stem longer than choices
• Avoid all/none of the above, a & c, etc.

Examples:

11. ______ “Spring” from Vivaldi’s Four Seasons is
    a. an early example of program music.
    b. based on sonnets by a famous poet.
    c. scored for strings, winds, and percussion.
12. ______ A classical symphony generally has movements arranged
    a. fast-minuet-slow-fast.
    b. fast-slow-minuet-fast.
    c. fast-slow-slow-fast.
13. ______ Orchestral music of the classical era typically features
    a. clear, symmetrical phrases.
    b. polyphonic texture.
    c. the brass section.
Psychometric Tests
• Essay & Short Answer
– NOT for factual info
– Make connections, use higher order thinking skills, evaluate
understanding
– Make expectation clear in question
– Grade w/ holistic rubric
• [See HS Orchestra Example]
– Notate & Respond
Elementary General Music –
Grade 3 Pre- & Post Test Sample
• [See handout]
• Paper/pencil, but relies on musical response
• Prompts can be different for pre-test
• Pre-test can be an abbreviated version
• Requires 2-3 class periods to complete
• Music supervisor could issue musical examples & prompts before the test (avoid teaching to the test)
Creating Similar Elementary
General Music Assessment
• For grades 3-5, determine what GLCEs can be
measured through paper/pencil response
• Create question(s) for each benchmark –
deliberately connect question to GLCEs
(validity, rigor, comparable a/c classrooms)
• Decide # of questions needed to determine
competency
• Create questions that fit different prompts
Excellence in Theory or Standard of Excellence
Music Theory & History Workbooks
• Kjos - publisher
• 3 volumes (see handout
sample)
• Includes theory, ear training,
history
• Takes MS & HS to complete 3 volumes
• Students work on lessons
during down time in rehearsal
• Establish grade level
expectations and written exam
Festival Ratings
NAfME Position Statement
• Successful music teacher evaluation must,
where the most easily observable outcomes of
student learning in music are customarily
measured in a collective manner (e.g.,
adjudicated ratings of large ensemble
performances), limit the use of these data to
valid and reliable measures and should form
only part of a teacher’s evaluation. (NAfME,
2011)
Festival Ratings:
Advantages/Disadvantages
Advantages
• Third party assessment – Credibility
• Focuses on a major aspect of ensemble curr.
• Final ratings are likely reliable over time

Disadvantages
• Narrow: 3 pieces & sight reading at one point in time
• Ceiling effect
• Subject to outside influences
• Role of MSBOA?
Ratings Growth Example
Hypothetical Contest Ratings for One Ensemble over a Three-year Period
         Judge 1   Judge 2   Judge 3   Sight-Reading   Average   Annual Increase*   Final
Year 1   II        III       II        II              2.25      -                  2
Year 2   II        II        I         II              1.75      22%                2
Year 3   I         II        I         I               1.25      29%                1

Note: *Total increase from year 1 to year 3 = 44%.
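The ratings-growth arithmetic above can be sketched as follows. Roman-numeral ratings are converted to numbers; since lower ratings are better, the "increase" is the percent drop in the average rating from one year to the next:

```python
# Ratings growth: average four ratings (3 judges + sight-reading),
# then compute percent improvement (decrease) in the average.
ROMAN = {"I": 1, "II": 2, "III": 3, "IV": 4, "V": 5}

def average_rating(ratings):
    """Average a year's ratings after converting Roman numerals to numbers."""
    nums = [ROMAN[r] for r in ratings]
    return sum(nums) / len(nums)

def pct_increase(prev_avg, new_avg):
    """Percent improvement between two yearly averages (lower is better)."""
    return round(100 * (prev_avg - new_avg) / prev_avg)

y1 = average_rating(["II", "III", "II", "II"])  # 2.25
y2 = average_rating(["II", "II", "I", "II"])    # 1.75
y3 = average_rating(["I", "II", "I", "I"])      # 1.25
# pct_increase(y1, y2) -> 22; pct_increase(y2, y3) -> 29
# pct_increase(y1, y3) -> 44 (total over three years)
```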
Experiences
Describe Your Situation
• In roundtables by area?
• How are you measuring
student growth at your
school?
• What support are you
getting?
• What needs or concerns
do you have?