Lecture 5: Usability Testing
Ch 11 Usability Assessment
Yonglei Tao
School of Computing & Info Systems
GVSU
Common Usability Problems
Reported by IBM Usability Experts
Ambiguous menus and icons
Single-direction movement through a system
Lack of white space
Annoying distractions
Unclear step sequences
Usability Problems (Cont.)
Input and direct manipulation limits
More steps to manage the interface than to perform tasks
Lack of system anticipation and intelligence
Inadequate feedback and confirmation
Inadequate error messages, help, tutorials, and documentation
Case Study
Information retrieval tasks for INTUITIVE
Navigation and exploration
Query formation
Translate the user’s need, expressed via a graphical notation, to SQL internally
Previewing the retrieved data
User’s knowledge about the database is imprecise
Need to show what is in the database and allow the user to select entities for queries
Allow the user to select a subset of retrieved items
Presentation of retrieved items
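The query-formation step above can be pictured as a small translator from the user's graphical selections to SQL. The `GraphicalQuery` structure and `to_sql` function below are a hypothetical sketch, not INTUITIVE's actual notation:

```python
# Hypothetical sketch: translate a graphically built query
# (entity + selected attributes + conditions) into SQL text.
from dataclasses import dataclass, field

@dataclass
class GraphicalQuery:
    entity: str                    # table the user picked from the schema view
    attributes: list               # columns selected for preview
    conditions: dict = field(default_factory=dict)  # attribute -> required value

def to_sql(q: GraphicalQuery) -> str:
    cols = ", ".join(q.attributes) or "*"
    sql = f"SELECT {cols} FROM {q.entity}"
    if q.conditions:
        where = " AND ".join(f"{a} = '{v}'" for a, v in q.conditions.items())
        sql += f" WHERE {where}"
    return sql

q = GraphicalQuery("paintings", ["title", "artist"], {"period": "baroque"})
print(to_sql(q))  # SELECT title, artist FROM paintings WHERE period = 'baroque'
```

A real translator would also have to join multiple entities, which is exactly where the study found users struggling ("only one entity at a time").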
Usability Evaluation
User testing
Users were videotaped and timed when performing increasingly complex tasks
Captured data included time taken, errors, help accessed, and task steps missed
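The captured measures can be pictured as one record per videotaped task. The field names and sample values below are illustrative, not the study's actual coding scheme:

```python
# Illustrative record of the measures captured per videotaped task:
# completion time, error count, help accesses, and missed task steps.
from dataclasses import dataclass

@dataclass
class TaskObservation:
    task: str
    time_taken_s: float   # seconds from start to completion
    errors: int           # incorrect actions observed on tape
    help_accessed: int    # times the user opened help
    steps_missed: int     # required task steps skipped

session = [
    TaskObservation("simple retrieval", 95.0, 1, 0, 0),
    TaskObservation("multi-entity query", 310.0, 4, 2, 1),
]
total_errors = sum(o.errors for o in session)
print(total_errors)  # 5
```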
Expert evaluation
HCI expert evaluators applied Nielsen’s heuristic checklist
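Nielsen's ten heuristics, which form the checklist mentioned above, are listed below; the problem-logging helper around them is only a sketch of how an expert walkthrough might record findings:

```python
# Nielsen's ten usability heuristics, used as the evaluation checklist.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

def record_problem(heuristic: str, description: str, severity: int) -> dict:
    """Log one problem found in a heuristic walkthrough (severity 0-4)."""
    assert heuristic in NIELSEN_HEURISTICS and 0 <= severity <= 4
    return {"heuristic": heuristic, "problem": description, "severity": severity}

p = record_problem("Consistency and standards", "Menu terminology confusing", 3)
print(len(NIELSEN_HEURISTICS))  # 10
```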
Experiment Results
Error count
Cost
Experts identified 86 usability problems
Users identified 38, which were not a subset of those found by the experts
33.5 hours for heuristic evaluation
125 hours for end user testing
However, it is more expensive to hire HCI experts
Ease of problem fixing
HCI experts are better at accurate, thorough reporting and identifying causes
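The hour and problem counts above imply very different per-problem costs; a quick check, under the assumption that the reported hours are directly comparable:

```python
# Cost per problem found, computed from the figures reported above.
expert_hours, expert_problems = 33.5, 86
user_hours, user_problems = 125, 38

cost_expert = round(expert_hours / expert_problems, 2)
cost_user = round(user_hours / user_problems, 2)
print(cost_expert)  # 0.39 hours per problem (heuristic evaluation)
print(cost_user)    # 3.29 hours per problem (end-user testing)
```

On these numbers, heuristic evaluation is roughly an order of magnitude cheaper per problem found, though expert time costs more per hour.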
Experiment Results (Cont.)
Problems identified by end users, not by HCI experts
39% of those identified by users
Some examples
Repeated resubmission of the same queries
Getting stuck while creating a query
Cannot make sense of results
Difficulty in moving windows
Experiment Results (Cont.)
Problems identified by both end users
and HCI experts
More feature-related than performance-related
Some examples
No complete view of the query – only one entity at a time
Menu terminology confusing
Attribute selection unpredictable
Experiment Results (Cont.)
Problems identified by HCI experts, not by end users
40% of those identified by experts
Some examples
No arrows indicating the direction of relations
Not able to cancel the operation after a query is submitted
Slow response time to display results
There should be a clean-up command
Discussions
Heuristic evaluation
Identify an interface error by predicting user problems it will cause
Good at finding poor terminology and lack of clarity
Ten heuristics are inadequate as a guide
A subjective process
Discussions (Cont.)
User testing
Identify the symptom and infer its cause
Good at finding problems while performing real tasks
Task-based
May miss features not encountered in tasks
Users tend to blame themselves rather than the interface
Summary
Both techniques share the same goals, but the actual results are quite different
User testing indicates the symptom of a problem
Heuristic evaluation identifies its cause
Heuristic evaluation helps analyze observed problems, but observation of novices is still vital, as many problems are a consequence of the user’s knowledge, or lack of it
Summary (Cont.)
Usability testing is costly and time-consuming
Necessary to use a variety of techniques
Need to focus on the aspects each one does best