
Assessing Library Web Portals:
Usability and Beyond
Yu-Hui Chen
University at Albany, State University of New York
ENY/ACRL 2012 Conference
Mohawk Valley Community College, Utica, New York
May 21, 2012
Web Site Evaluation Methods

Evaluation with user participation
◦ Think aloud
◦ Card sorting
◦ Prototyping (paper/online)
◦ Eye tracking
◦ Focus group discussions
◦ Field study
◦ Log analysis (a log-parsing sketch follows this list)
◦ Web survey

Evaluation without user participation
◦ Cognitive walkthrough
◦ Heuristic evaluation
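Log analysis appears in the list above but gets no slide of its own. As a minimal sketch of the idea, the following counts the most-requested portal pages, assuming a log in Apache combined format and a hypothetical access.log file name; a real analysis would also filter bots and sessionize visits.

import re
from collections import Counter

# Apache combined log format: host, identity, user, [time], "request", status, size, "referer", "user-agent".
LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def top_pages(log_path, n=10):
    """Count successful GET requests per path and return the n most common."""
    hits = Counter()
    with open(log_path) as log:
        for line in log:
            m = LOG_LINE.match(line)
            # Keep successful page views; skip static assets.
            if m and m.group("method") == "GET" and m.group("status") == "200":
                path = m.group("path").split("?")[0]
                if not path.endswith((".css", ".js", ".png", ".gif", ".ico")):
                    hits[path] += 1
    return hits.most_common(n)

# Hypothetical usage: print the ten most-requested portal pages.
for path, count in top_pages("access.log"):
    print(f"{count:6d}  {path}")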
Think Aloud

Users are asked to complete specific tasks

As users attempt each task, they verbally report their thoughts and feelings about their actions

Observers watch, listen, and take notes
Card Sorting

Open card sorting
◦ Give users labels representing the content of the Web site
◦ Users review these labels and then group them into categories
◦ Users assign category names to these groups

Closed card sorting
◦ Provide category names for users
◦ Users sort the labels into categories
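Open card-sort results are commonly analyzed with a co-occurrence matrix: the share of participants who placed each pair of labels in the same group. A minimal sketch, with the sort data and label names purely hypothetical:

from itertools import combinations

# Hypothetical open card-sort results: each participant's groupings of labels.
sorts = {
    "p1": [["Databases", "E-journals"], ["Hours", "Directions"]],
    "p2": [["Databases", "E-journals", "Hours"], ["Directions"]],
    "p3": [["Databases"], ["E-journals", "Hours", "Directions"]],
}

def cooccurrence(sorts):
    """Fraction of participants who put each label pair in the same group."""
    counts = {}
    for groups in sorts.values():
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                counts[(a, b)] = counts.get((a, b), 0) + 1
    return {pair: n / len(sorts) for pair, n in counts.items()}

for pair, share in sorted(cooccurrence(sorts).items(), key=lambda kv: -kv[1]):
    print(f"{share:.2f}  {pair[0]} / {pair[1]}")

Pairs with high co-occurrence are candidates for the same navigation category, and the matrix can also feed hierarchical clustering for a dendrogram view.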
Prototyping (Paper/Online)

Provide users with descriptions and purposes of an intended Web site

Have users brainstorm the design

Have users draw the design

Test the design
Eye Tracking

Setting up a lab

Training users to use the equipment

Giving users tasks

Reviewing the reports
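The reports from an eye tracker typically reduce raw gaze samples to time spent in areas of interest (AOIs). A minimal sketch of that reduction, assuming hypothetical gaze samples and AOI rectangles rather than any vendor's actual export format:

# Hypothetical gaze samples: (milliseconds since start, x, y) screen coordinates.
samples = [(0, 120, 80), (50, 125, 82), (100, 640, 300), (150, 645, 310)]

# Hypothetical AOIs: name -> (left, top, right, bottom) in pixels.
aois = {"search box": (100, 60, 400, 120), "news area": (600, 250, 900, 500)}

def dwell_times(samples, aois, sample_ms=50):
    """Total milliseconds of gaze samples falling inside each AOI."""
    dwell = {name: 0 for name in aois}
    for _, x, y in samples:
        for name, (l, t, r, b) in aois.items():
            if l <= x <= r and t <= y <= b:
                dwell[name] += sample_ms
    return dwell

print(dwell_times(samples, aois))  # {'search box': 100, 'news area': 100}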
Cognitive Walkthrough

The evaluators design specific task scenarios

The user's goals and purpose for each task are defined, and tasks are broken down into relatively small steps

The evaluators role-play the part of the user working with the site, noting problems, paths, and barriers, essentially reviewing how easy the site is to learn
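A walkthrough is easier to run consistently when the task decomposition is written down. One possible structure, sketched with entirely hypothetical task names and findings:

from dataclasses import dataclass, field

@dataclass
class Step:
    action: str                                   # what the user is expected to do
    problems: list = field(default_factory=list)  # barriers noted during the walkthrough

@dataclass
class Scenario:
    goal: str
    steps: list

# Hypothetical scenario: renewing a book through the portal.
renew = Scenario(
    goal="Renew a checked-out book",
    steps=[
        Step("Find the 'My Account' link on the home page"),
        Step("Log in with a campus ID"),
        Step("Locate the loans list and click 'Renew'"),
    ],
)

renew.steps[0].problems.append("Link label 'Patron Services' did not suggest account access")
for step in renew.steps:
    print(f"{step.action}: {len(step.problems)} problem(s) noted")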
Heuristic Evaluation

Have a small group of evaluators (2-5) review the site using usability heuristics (e.g., Nielsen), standards (e.g., ISO), or guidelines (e.g., US Dept. of Health and Human Services)
◦ Inspect the task flow
◦ Inspect details of individual elements

Evaluators review the site independently

Reconvene and discuss findings
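When the evaluators reconvene, their independent findings are often merged by averaging a severity rating per issue (Nielsen's 0-4 severity scale is a common choice). A minimal sketch with hypothetical issues and ratings:

# Hypothetical severity ratings (0 = not a problem ... 4 = usability catastrophe)
# from three independent evaluators, keyed by issue description.
ratings = {
    "Database list has no search box": [3, 4, 3],
    "Hours buried three clicks deep": [2, 3, 2],
    "Inconsistent link colors": [1, 1, 2],
}

# Rank issues by mean severity so discussion starts with the worst ones.
for issue, scores in sorted(ratings.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    mean = sum(scores) / len(scores)
    print(f"{mean:.1f}  {issue}")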
Information Systems Success Model

DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9-30. (p. 24)
Measures of Information Quality
◦ Accuracy
◦ Currency
◦ Sufficiency
◦ Reliability
◦ Relevance
◦ Format options

Measures of System Quality
◦ Accessibility
◦ Ease of use
◦ Flexibility
◦ Response time
◦ Reliability

Measures of Service Quality
◦ Assurance
◦ Empathy
◦ Responsiveness
◦ Reliability

Measures of Use
◦ Frequency of use
◦ Extent of use
◦ Motivation to use

Measures of User Satisfaction
◦ System quality satisfaction
◦ Information quality satisfaction
◦ Service quality satisfaction
◦ Overall satisfaction

Measures of Net Benefits
◦ User productivity
◦ User performance

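In a quantitative assessment these measures typically become Likert items on a web survey, and each construct is scored by averaging its items. A minimal sketch, with item names and responses purely hypothetical:

from statistics import mean

# Hypothetical mapping of IS success constructs to survey item names.
constructs = {
    "information quality": ["accuracy", "currency", "sufficiency"],
    "system quality": ["accessibility", "ease_of_use", "response_time"],
    "service quality": ["assurance", "empathy", "responsiveness"],
}

# Hypothetical 5-point Likert responses from one respondent.
responses = {
    "accuracy": 4, "currency": 5, "sufficiency": 3,
    "accessibility": 4, "ease_of_use": 2, "response_time": 3,
    "assurance": 5, "empathy": 4, "responsiveness": 4,
}

# Score each construct as the mean of its items.
for name, items in constructs.items():
    print(f"{name}: {mean(responses[item] for item in items):.2f}")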
Assessment Approach
◦ Quantitative
◦ Qualitative

Bibliography

DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9-30.

International Organization for Standardization. (1994). Ergonomic requirements for office work with visual display terminals. Part 11: Guidance on usability (ISO DIS 9241-11). London: International Organization for Standardization.

Nielsen, J. (1993). Usability engineering. Boston, MA: Academic Press.

Popp, M. P. (2001). Testing library Web sites: ARL libraries weigh in. Proceedings of the ACRL Tenth National Conference, 277-281.

United States Department of Health and Human Services. (2006). Research-based Web design & usability guidelines. Washington, DC: U.S. Government Printing Office.
Other entertaining resources:

Chen, Y., Germain, C. A., & Yang, H. (2009). An exploration into the practices of library Web usability in ARL academic libraries. Journal of the American Society for Information Science and Technology, 60(5), 953-968.