Usability Engineering


Usability & Evaluation
in Visualizing Biological Data
Chris North, Virginia Tech
VizBi
Myths about Usability
- Usability = Voodoo
Science of Usability
[Diagram: Phenomenon → Measurement → Modeling, spanning Engineering and Science]
…analogy to biology
Usability Engineering
1. Analyze requirements
2. Design
3. Develop
4. Evaluate
User-centric and iterative.
Engineering = process to ensure usability goals are met
Myths about Usability
- Usability = Voodoo
- Usability = Learnability
Myths about Usability
- Usability = Voodoo
- Usability = Learnability
- Usability = Simple task performance
Impact on Cognition
Insight gained:
[Bar chart: total value of insights gained per tool, e.g. Spotfire vs. GeneSpring]
Myths about Usability
- Usability = Voodoo
- Usability = Learnability
- Usability = Simple task performance
- Usability = Expensive
http://www.upassoc.org/usability_resources/usability_in_the_real_world/roi_of_usability.html
Usability Engineering (step 1: Analyze Requirements)
Requirements Analysis
Goal = understand the user & tasks
Methods: ethnographic observation, interviews, cognitive task analysis
Challenge: find the hidden problem behind the apparent problem
Analysts’ Process (Pirolli & Card, PARC)
Systems Biology Analysis: beyond read-offs, toward model-based reasoning (Mirel, U. Michigan)
Usability Engineering (step 4: Evaluate)
Why Emphasize Evaluation?
- Many useful guidelines, but…
- Quantity of evidence
- Exploit domain knowledge (Hunter, Tipney, UC-Denver)
Science of Usability
Phenomenon → Measurement → Modeling
Measuring Usability in Visualization
Phenomena and their measurements:
- System, algorithm: frame rate, capacity, …
- Visual: realism, data/ink ratio, …
- Perception, interaction: task time, accuracy, …
- Inference, insight: ?
- Goal, problem solving: market, ?
Note the two kinds of holes (the "?" entries).
Time & Accuracy
Controlled experiments with benchmark tasks; interface conditions: one timepoint, multiple timepoints, multiple small graphs.
Results:
[Bar charts: performance time in minutes and accuracy counts per benchmark task T1–T7; * marks statistically significant differences]
+ Consistent overall
+ Fast for single-node analysis
- Slow and inaccurate for expression across the graph
+ Accurate for comparing timepoints
(p < 0.05)
Cerebral (Munzner, UBC)
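Significance claims like the p < 0.05 above typically come from comparing per-task completion times across interface conditions. A minimal sketch of such a comparison using Welch's t-test; the timing numbers below are illustrative, not the study's data:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(a), len(b)
    se = math.sqrt(variance(a) / na + variance(b) / nb)
    return (mean(a) - mean(b)) / se

# Hypothetical completion times (minutes) for one benchmark task
# under two interface conditions -- made-up values for illustration.
multiple_graphs = [2.1, 2.4, 1.9, 2.2]
single_timepoint = [1.2, 1.0, 1.4, 1.1]

t = welch_t(multiple_graphs, single_timepoint)
# An |t| well above ~2.45 (the two-tailed 0.05 critical value near df = 6)
# would indicate a significant difference in task time.
print(round(t, 2))  # -> 7.24
```

In practice one would also compute the Welch–Satterthwaite degrees of freedom and an exact p-value (e.g. via scipy), but the t statistic above carries the core of the comparison.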
Insight-based Evaluation
Problem: current measurements focus on low-level task performance time and accuracy. What about insight?
Idea: treat tasks as the dependent variable. What do users learn from this visualization?
Method: realistic scenario, open-ended protocol, think-aloud, insight coding.
Information-rich results
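Insight coding yields records like (tool, minute observed, domain value). The metrics reported in the results that follow (count of insights, total value of insights, average time to first insight) can be tallied directly from such records. A minimal sketch; the records below are made up for illustration:

```python
from collections import defaultdict

# Coded insight records: (tool, minutes_elapsed, domain_value).
# All values are illustrative, not the study's data.
records = [
    ("HCE", 4.0, 8), ("HCE", 9.5, 5), ("HCE", 20.0, 3),
    ("Spotfire", 7.0, 6), ("Spotfire", 15.0, 4),
]

count = defaultdict(int)          # count of insights per tool
total_value = defaultdict(int)    # summed domain value per tool
first_insight = {}                # minutes until the first insight

for tool, minutes, value in records:
    count[tool] += 1
    total_value[tool] += value
    first_insight[tool] = min(first_insight.get(tool, minutes), minutes)

print(count["HCE"], total_value["HCE"], first_insight["HCE"])  # -> 3 16 4.0
```

With one record list per participant, averaging `first_insight` across participants gives the "average time to first insight" measure.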
Insight?
Gene expression visualizations compared: GeneSpring, Spotfire, HCE, Cluster/Treeview, TimeSearcher
Results
[Bar charts per tool (HCE, Spotfire, GeneSpring, …): count of insights, total value of insights, and average time to first insight in minutes]
Insight Summary
[Table: insight categories (e.g. time series, viral conditions, lupus screening) by tool: Cluster/Treeview, TimeSearcher, HCE, Spotfire, GeneSpring]
Users’ Estimation
[Bar charts per tool (Cluster/Treeview, TimeSearcher, HCE, Spotfire, GeneSpring): total value of insights vs. users’ estimated insight percentage]
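One way to compare users' self-estimates against the measured insight values is rank correlation across tools. A sketch using Spearman's rho (no-ties formula); the pairing of numbers to tools below is illustrative, not taken from the study:

```python
def ranks(xs):
    """Ascending ranks starting at 1 (assumes no ties)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(xs, ys):
    """Spearman rank correlation via the no-ties formula."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical per-tool pairs: measured total insight value vs.
# users' self-estimated insight percentage (pairing is illustrative).
measured = [66, 48, 51, 40, 34]
estimated = [67, 41, 52, 48, 42]

print(spearman_rho(measured, estimated))  # -> 0.7
```

A high positive rho would suggest users' estimates track the coded insight values; a low one would suggest self-reports are an unreliable substitute for coding.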
Insight Methodology
Difficulties:
- Labor intensive
- Requires a domain expert
- Requires motivated subjects
- Short training and trial time
Opportunities:
- Self-reporting data capture
- Insight trails over long-term usage (insight provenance)
Trend toward Longitudinal Evaluation
Multidimensional in-depth long-term case studies (MILCs): qualitative, ethnographic (Shneiderman, U. Maryland)
GRID: study graphics, find features; ranking guides insight, statistics confirm.
But: not replicable, not comparative.
Onward…
- VAST Challenge: analytic dataset with ground truth (e.g. Goerg & Stasko, Jigsaw study)
- BELIV Workshop: BEyond time and errors: novel evaLuation methods for Information Visualization
Visual Analytics
Visualization vs. Visual Analytics:
- Perception and interaction vs. cognition and sensemaking
- Visualization tasks vs. the whole analytic process
- Visual representations and interaction techniques vs. connection to data mining, statistics, …
- Datatype scenarios vs. real usage scenarios with analysts
Embodied Interaction
1) Cognition is situated.
2) Cognition is time-pressured.
3) We off-load cognitive work onto the environment.
4) The environment is part of the cognitive system.
5) Cognition is for action.
6) Off-line cognition is body-based.
-- Margaret Wilson, UCSC
GigaPixel Display Lab, Virginia Tech
Carpendale, U. Calgary