Cross-institutional Repository Assessment:
A Standardized Model for Institutional Research Assessment
Robert H. McDonald
Indiana University
Charles Thomas
Institute for Museum and Library Services
Outline
I. Introduction
II. The Need to Measure & Compare
III. New Candidate Frameworks
IV. Representing New Metrics
V. Future Evolution of Institutional Repositories and Evaluation
Institutional Assessment Needs
"We have found a remarkable shortage of clear,
accessible information about crucial aspects of
American colleges and universities...this lack of useful
data and accountability hinders policymakers and the
public...and prevents higher education from
demonstrating its contribution to the public good."
A Test of Leadership: Charting the Future of U.S. Higher
Education (2006) – U.S. Department of Education
Institutional Repositories: A Silver Bullet?
"An institutional repository concentrates the intellectual product created
by a university's researchers, making it easier to demonstrate its
scientific, social and financial value. Thus, institutional repositories
complement existing metrics for gauging productivity and prestige...this
demonstration of value can translate into tangible benefits, including the
funding...that derives in part from an institution's status and
reputation."
Raym Crow (2002). The Case for Institutional
Repositories.
Repositories Vary In
• what they contain;
• who funds and administers each;
• underlying legal, social and policy
infrastructure for each repository;
• who contributes to the repository; and
• motivations for contributing, whether
they be mandates, disciplinary cultural
norms, or other incentives
Current Repository Categories
• Institutional
• Disciplinary
• Other (preservation, publishers, etc.)
• How do you tell the difference?
• How do you know who contributes what
to which?
Need to Evaluate IRs
• How do we evaluate IRs?
• Institutional, disciplinary, and other repositories exist for different purposes and probably need different evaluative frameworks
Need to Evaluate IRs
• We can’t just measure our IR as a standalone phenomenon
• We need to be able to compare IRs
• We also need to evaluate IRs for their
utility in overall Institutional Assessment
Library Assessment Needs
• 20th Century vs 21st Century
– Moving Beyond Silos of Knowledge
– Facilities are not an adequate measuring
stick
– Qualitative and Quantitative Measurement
Principles are Required
Frameworks for IR Evaluation
• Proudman, V. (2008). The population of repositories.
– Policies;
– Organization;
– Mechanisms and influences for populating repositories;
– Services;
– Advocacy & communication;
– Legal issues
Frameworks for IR Evaluation
• Westell, M. (2006). Institutional repositories: Proposed indicators of success
– Repository mandate;
– Integration with institutional planning;
– Funding model;
– Relationship with digitization centers;
– Interoperation;
– Content measurement;
– Promotion;
– Preservation strategy
Frameworks for IR Evaluation
• Kim, H. H. and Kim, Y. H. (2007). An evaluation model
for the national consortium of institutional
repositories of Korean universities.
– Content (Diversity, Currency, Size, Metadata)
– System and network (Interoperability, Use of help services like FAQ
and Q&A)
– Use, users and submitters (Use ratio, User satisfaction, Submitter
satisfaction, User/Submitter support)
– Management and policy (Budget, Staffing, Library awareness of Open
Access and related issues, Copyright management, IR Marketing,
Institutional support, Policies and procedures in place, Diversity of
archiving methods)
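Frameworks like Kim and Kim's lend themselves to a weighted rubric. A minimal Python sketch follows, assuming hypothetical category weights and a 0-5 scoring scale that are not part of the published model:

```python
# A minimal sketch of encoding an evaluation framework such as Kim & Kim's
# as a weighted rubric. Category names follow the slide; the weights and
# the 0-5 scoring scale are hypothetical placeholders, not the authors'.

RUBRIC_WEIGHTS = {
    "content": 0.30,               # diversity, currency, size, metadata
    "system_and_network": 0.20,    # interoperability, help services
    "use_users_submitters": 0.25,  # use ratio, satisfaction, support
    "management_and_policy": 0.25, # budget, staffing, policies, marketing
}

def composite_score(scores: dict) -> float:
    """Combine per-category scores (0-5 scale) into one weighted number."""
    return sum(RUBRIC_WEIGHTS[cat] * scores[cat] for cat in RUBRIC_WEIGHTS)

# Example: a hypothetical IR as scored by an assessment team.
example_ir = {
    "content": 4.0,
    "system_and_network": 3.0,
    "use_users_submitters": 2.5,
    "management_and_policy": 3.5,
}
print(f"Composite score: {composite_score(example_ir):.2f} / 5.00")
```

Publishing the weights alongside the composite would let institutions compare numbers without hiding the qualitative judgments underneath.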
What Are We Seeing?
• Lots of Case Studies
• Many Qualitative Evaluative Criteria
• Tips, Best Practices for Good Repositories
• Not Much Quantitative Data – Warning: Administrators Love Numbers!
Library Assessment Needs
“Key Aspects of collaborative relations may be
described only in qualitative terms in the future.”*
– Cross-Institutional
 Shared Digital Collections
– Intra-Institutional
 IR Collection Building
 IR Assessment
 Institutional Research Assessment
*From Reshaping ARL Statistics to Capture the New
Environment (2008) – Kyrillidou
So How Do We Mix Qualitative/Quantitative?
The Color Palette Metaphor
Absence of color = absence of the foundations for success, indicating an early-forming or orphan IR
White = the maximum combination of the entire spectrum, an ideal IR with the full suite of necessary support
Shades of gray or other color attributes indicate a rising IR
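The metaphor maps onto a simple computation. A minimal sketch, assuming each of three assessment dimensions (the choice and names of the dimensions here are hypothetical) is normalized to 0.0-1.0 and drives one additive RGB channel:

```python
# A minimal sketch of the color palette metaphor: each assessment dimension
# drives one additive color channel, so an IR with no foundations renders
# black (absence of color) and an ideal IR renders white (the full spectrum).
# The three dimension-to-channel assignments are hypothetical illustrations.

def ir_color(policy: float, content: float, services: float) -> str:
    """Map three normalized (0.0-1.0) dimension scores to a hex RGB color."""
    channels = (policy, content, services)
    rgb = tuple(round(255 * max(0.0, min(1.0, c))) for c in channels)
    return "#{:02X}{:02X}{:02X}".format(*rgb)

print(ir_color(0.0, 0.0, 0.0))  # "#000000" - orphan or early-forming IR
print(ir_color(1.0, 1.0, 1.0))  # "#FFFFFF" - ideal, fully supported IR
print(ir_color(0.5, 0.5, 0.5))  # "#808080" - mid-gray, a rising IR
```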
Future Evolution
• Institutional Measurement
• Institutional Research as a Role for Libraries
• Libraries as Publishers
[Figure from EDUCAUSE Review 43(1)]
Administrative ERP Stack Fusion or Data Mining
• Where does the .EDU stack
come together for analysis?
• Can the library play a role in
this analysis?
• Needed for owned and
leased assets
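One hedged illustration of where the .edu stack could come together: joining IR usage data with an HR roster to report downloads by department. The file names and column layouts below are hypothetical.

```python
# A minimal sketch of fusing two .edu data sources: IR download counts
# per author joined to an HR faculty roster, aggregated by department.
# Both file layouts are invented placeholders for illustration.

import csv
from collections import defaultdict

def downloads_by_department(roster_csv: str, usage_csv: str) -> dict:
    """Aggregate per-author IR downloads up to the department level."""
    # roster.csv columns: author_id, department
    dept_of = {}
    with open(roster_csv, newline="") as f:
        for row in csv.DictReader(f):
            dept_of[row["author_id"]] = row["department"]

    # usage.csv columns: author_id, downloads
    totals = defaultdict(int)
    with open(usage_csv, newline="") as f:
        for row in csv.DictReader(f):
            dept = dept_of.get(row["author_id"], "unaffiliated")
            totals[dept] += int(row["downloads"])
    return dict(totals)

if __name__ == "__main__":
    print(downloads_by_department("roster.csv", "usage.csv"))
```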
Intra-Institutional Assessment
• IRStats
• Digital Measures – Activity Insight
• U Penn Data Farm
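Tools like these sit on top of specific repository platforms; even without them, one crude quantitative metric (total record count) can be harvested from any OAI-PMH-compliant IR. A minimal sketch, assuming a hypothetical endpoint URL; note that completeListSize is optional in the OAI-PMH spec, so not every repository reports it.

```python
# A minimal sketch of pulling one crude quantitative metric - total record
# count - from an OAI-PMH-compliant IR (EPrints, DSpace, Fedora, etc.).
# The endpoint URL is a placeholder; completeListSize is an optional
# attribute in OAI-PMH, so some repositories will return None here.

import urllib.request
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
BASE_URL = "https://repository.example.edu/oai"  # hypothetical endpoint

def record_count(base_url: str):
    """Return the repository's total record count, if it advertises one."""
    url = base_url + "?verb=ListIdentifiers&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    token = tree.find(f".//{OAI_NS}resumptionToken")
    if token is not None and token.get("completeListSize"):
        return int(token.get("completeListSize"))
    return None  # repository does not report a complete list size

count = record_count(BASE_URL)
print(f"Records: {count if count is not None else 'not reported'}")
```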
Layers of Assessment Comparison
• International Comparison
• National Comparison
• National Accreditation
• Regional Accreditation
• State and Regional
Collaboration/Funding
• Internal Collaboration/Funding
Missing Link
• IR Assessment
– Quantitative
– Qualitative
– Viable or Useful Mixed Visualizations
CONTACT INFORMATION
• Robert H. McDonald
– [email protected]
– AIM/[email protected]
– Skype/rhmcdonald
• Chuck Thomas
– [email protected]