Turning Results into Action: Using Assessment
Information to Improve Library Performance
OR WHAT DID YOU DO WITH THE DATA?
Steve Hiller
Stephanie Wright
University of Washington Libraries
Data Sources For This Study

Library Assessment SPEC Kit 303 (Dec. 2007)
• Survey sent to ARL Libraries May–June 2007
• 73 respondents (60%), nearly all academic
• Self-reported information

ARL Consultation Service: "Making Library Assessment Work" / "Effective, Sustainable and Practical Library Assessment"
• 35 libraries visited 2005–08 (32 North American, 28 ARL)
• Observed and confirmed information
Library Assessment SPEC Kit Survey Questions
• Impetus for assessment
• Assessment methods used
• Organizational structure for assessment
• Distribution/presentation of assessment results
• Using assessment information (up to 3 actions)
• Professional development needs in assessment
SPEC Survey: Impetus for Assessment
• Desire to know more about your customers: 91%
• Investigation of possible new library services/resources: 71%
• Desire to know more about your processes: 65%
• Desire to identify library performance objectives: 62%
• Need to reallocate library resources: 55%
• Accountability requirements from parent institution: 37%
• Institutional or programmatic accreditation process: 29%
Building Assessment Capability in
Libraries through Consultation Services
• Association of Research Libraries (ARL) project began
in 2005 as “Making Library Assessment Work” (MLAW)
– "Assess the state of assessment efforts in individual research
libraries, identify barriers and facilitators of assessment, and
devise pragmatic approaches to assessment that can flourish in
different local environments"
• Funded by participating libraries
• Conducted by Steve Hiller and Jim Self under the aegis
of Martha Kyrillidou of ARL
• In 2007 name changed to “Effective, Sustainable and
Practical Library Assessment” (ESP) and opened up to
all libraries
MLAW/ESP:
Data Collection Methods
Pre-Visit
• Survey on assessment activities, needs, etc.
• Telephone follow-up
• Mining library and institutional web pages
Visit (1.5 days)
• Presentation on effective assessment
• Group meetings and observation/verification
Follow-up and report
• Pursue leads and additional information
ESP Self-Identified Assessment Needs
(31 NA Libraries)
• Use data effectively: 97%
• Data collection: 90%
• Data analysis: 87%
• Staff data use skills: 80%
• Learning outcomes: 70%
• Build assessment culture: 67%
• Accreditation: 47%
Data Caveats
• Different methodological techniques used
• Information gathered at different times
• ESP confirmed on the "ground"; SPEC self-reported
• Libraries are different (21 of 73 SPEC survey respondents also participated in MLAW/ESP)
• Some bias in libraries self-selecting to participate in ESP and respond to the SPEC survey (likely that more is done in these libraries)
Assessment Methods Used

Method                                  SPEC    ESP
Data collection                         100%   100%
Web usability testing                    80%    80%
LibQUAL+® survey                         75%   100%
Focus groups/Interviews                  75%    80%
Facilities use studies                   55%    60%
Student instruction evaluations          55%    75%
Observation                              50%    65%
Benchmarking and process improvement     50%    50%
Other locally developed surveys          50%    75%
ACTION SCOREBOARDS
(Data will be in paper)

Score scale:
• Widespread
• Occasional to general
• Sometimes to occasional
• Seldom
Action Scoreboard: Websites (SPEC vs. ESP)
• Website redesign
• Change content
• Change library catalog display
Action Scoreboard: Facilities (SPEC vs. ESP)
• Renovate existing space
• Repurpose existing space
• New furniture/equipment
• Environmental (HVAC, lighting)
• Close libraries/service points
• Signage
• Plan new space
Action Scoreboard: Services (SPEC vs. ESP)
• Hours
• Service desk staffing
• Service quality
• Instruction
• Process improvement
Action Scoreboard: Collection Development and Management (SPEC vs. ESP)
• Journal review/decisions
• Going electronic
• Weeding, relocation, storage
• Fund allocations
• Scholarly communication
Action Scoreboard: Organization (SPEC vs. ESP)
• Organizational climate
• Staff training
• Marketing
• Communications (external)
• Collaborations (external)
• Stop doing specific activities
Overall Action Scoreboard (SPEC vs. ESP)
• Web site
• Facilities
• Collection development
• Reference services
• Access services
• Instructional services
• Hours
• Organizational changes
Using Assessment Data: Actions
• Lots of data collected, but actions generally limited to either "low-hanging fruit" or one-time changes:
  – Website improvements (usability)
  – Hours (comments, observation)
  – Collection development/management decisions
  – Facilities (observation, qualitative methods)
• More actions taking place than reported (both to SPEC and MLAW/ESP)
• Little evidence of action in:
  – Instruction/learning outcomes
  – Organizational changes
Organizational Factors That Impede
Turning Data into Actions
• Lack of a tradition of using data for improvement
• No assessment advocate within the organization
• Library staff lack research methodology skills
• Weak analysis and presentation of data
• Inability to identify actionable data
• Library "culture" is skeptical of data
• Leadership does not view assessment as a priority or provide resources
• Library organizational structure is "silo-based"
• Staff do not have sufficient time
Sustainable Assessment and Actions
• Leadership believes in and supports assessment
• Formal assessment program established
• Institutional research agenda tied to strategic priorities
• Staff training in research/assessment methodology
• Staff have time and resources to follow up
• Research balanced with timely decision-making
• Assessment results presented, understood, and acted upon
• Results reported back to the customer community
• Library demonstrates the value it provides to its community