Human Information Workspace (HI-SPACE): Evaluation of a Class of Projector-Camera Systems
Richard May
Bob Baddeley
Projector-Camera Systems Workshop
June 25
San Diego, CA
What is ‘Direct’ Manipulation?
Direct Manipulation – Shneiderman 1991
  Defined around computer interaction
  Interaction is a closed loop
Cognitive and Functional Seams – Ishii 1994
Directly Mediated Interaction
  Using the best of both worlds
  Supplementing current technology
HI-Space Configuration
Projector & IR sources under table
Projector & mirror orientations optimized to screen
Video camera(s) above the table
Off-the-shelf hardware with custom software
Architecture
[Architecture diagram: C++ libraries with a Java wrapper and window manager; recognizers for pointer/pose, object/shape, and speech; gesture, multimodal, and user-definable recognizers; events; a Windows component; and the HI-SPACE application.]
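The diagram only names the layers, but it implies an event-driven pipeline: recognizers built on the C++ libraries emit events that reach application components. As a rough illustration of that flow, here is a minimal C++ sketch; every class and function name in it is hypothetical, not the actual HI-SPACE API.

    // Hypothetical sketch of the recognizer -> event -> application flow suggested
    // by the architecture diagram. None of these names come from HI-SPACE itself.
    #include <cstdio>
    #include <functional>
    #include <string>
    #include <vector>

    struct Event {                    // e.g. a pointer/pose or object/shape detection
        std::string type;             // "pointer", "shape", "speech", "gesture", ...
        double x, y;                  // table coordinates, when applicable
    };

    class WindowManager {             // routes recognizer events to registered components
    public:
        using Handler = std::function<void(const Event&)>;
        void subscribe(Handler h) { handlers_.push_back(std::move(h)); }
        void dispatch(const Event& e) { for (const auto& h : handlers_) h(e); }
    private:
        std::vector<Handler> handlers_;
    };

    class PointerRecognizer {         // stands in for the pointer/pose recognizer
    public:
        explicit PointerRecognizer(WindowManager& wm) : wm_(wm) {}
        void onCameraFrame(double x, double y) {   // called once per processed frame
            wm_.dispatch(Event{"pointer", x, y});
        }
    private:
        WindowManager& wm_;
    };

    int main() {
        WindowManager wm;
        // An application component subscribes to events, like the diagram's
        // HI-SPACE Application box.
        wm.subscribe([](const Event& e) {
            std::printf("%s event at (%.2f, %.2f)\n", e.type.c_str(), e.x, e.y);
        });
        PointerRecognizer pointer(wm);
        pointer.onCameraFrame(0.42, 0.77);          // simulated detection
        return 0;
    }

A publish/subscribe layer of this kind keeps recognizers decoupled from the application, which is one way to accommodate the user-definable and multimodal recognizers the diagram lists.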
Interface Example
Interactive Scientific Visualization Techniques Project
Performance Studies
Does using a DMI environment perform any differently than using a mouse on a desktop system for selecting stationary targets?
Decided to run Fitts’ studies
ID = log2(2A / W)   (bits)
Index of performance reported in bits/sec
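The ID formula above, together with the bits/sec unit used for the index of performance in the later result tables, points to the standard Fitts' law throughput calculation. Below is a minimal C++ sketch, assuming the conventional definition IP = ID / MT; the amplitude and width in main are made-up example values, not parameters from the study.

    // Fitts' law quantities: index of difficulty (from the slide's formula) and
    // index of performance, assumed here to be ID divided by movement time.
    #include <cmath>
    #include <cstdio>

    // Index of difficulty in bits for target amplitude A and width W (same units).
    double index_of_difficulty(double A, double W) {
        return std::log2(2.0 * A / W);
    }

    // Index of performance (bits/s) for one trial that took mt_ms milliseconds.
    double index_of_performance(double A, double W, double mt_ms) {
        return index_of_difficulty(A, W) / (mt_ms / 1000.0);
    }

    int main() {
        const double A = 256.0, W = 16.0;            // example values only
        std::printf("ID = %.2f bits\n", index_of_difficulty(A, W));        // 5.00
        std::printf("IP at 500 ms = %.2f bits/s\n",
                    index_of_performance(A, W, 500.0));                    // 10.00
        return 0;
    }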
Fitts’ Model Study Example
Discrete tapping task
Left / right motion only
Subjects:
  Under 40 years old
  Right handed
Used a 12-inch hand-held stylus
Direct Comparison Study
[Plot: Mean Time (ms) vs. Index of Difficulty (bits) for Mouse and HSE.]

The mouse was slower and had a worse IP. No difference in error rates. However …

Interaction Type    Mean Time (ms)    Error Rate    Index of Performance (bits/s)    Y Intercept (ms)
Mouse               667               4.2%          7.1                              45
HSE                 505               3.2%          10.4                             81

Percent misses by target width (pixels):

Width (pixels)    14        28       56      112
Mouse             9.38%     4.37%    2.39%   0.51%
HSE               10.85%    1.51%    0.34%   0.00%
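The IP and Y intercept columns are the kind of numbers a Fitts' study usually extracts from a straight-line fit of mean time against index of difficulty (MT = a + b·ID, with IP = 1/b). The sketch below only illustrates that convention with made-up data points; it is an assumption about the analysis, not the authors' code.

    // Least-squares fit of MT = a + b*ID; reports a as the Y intercept (ms) and
    // 1/b as the index of performance (bits/s). Data points below are made up.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct FittsFit { double intercept_ms; double ip_bits_per_s; };

    FittsFit fit_fitts(const std::vector<double>& id_bits,
                       const std::vector<double>& mt_ms) {
        const double n = static_cast<double>(id_bits.size());
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (std::size_t i = 0; i < id_bits.size(); ++i) {
            sx  += id_bits[i];
            sy  += mt_ms[i];
            sxx += id_bits[i] * id_bits[i];
            sxy += id_bits[i] * mt_ms[i];
        }
        const double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);  // ms per bit
        const double intercept = (sy - slope * sx) / n;                  // ms
        return { intercept, 1000.0 / slope };                            // bits per second
    }

    int main() {
        const std::vector<double> id = {2, 3, 4, 5, 6};            // hypothetical IDs (bits)
        const std::vector<double> mt = {250, 350, 450, 550, 650};  // hypothetical means (ms)
        const FittsFit f = fit_fitts(id, mt);
        std::printf("Y intercept = %.0f ms, IP = %.1f bits/s\n",
                    f.intercept_ms, f.ip_bits_per_s);
        return 0;
    }

For the example points this prints "Y intercept = 50 ms, IP = 10.0 bits/s".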
Total System Lag Determination
[Apparatus diagram: photodiode, timer unit, and armature.]
Based on a technical paper from UNC [Mine 1993].
Total System Lag Results

10 fps / 30 fps

Condition                         Set      Mean Time (ms)    Standard Deviation (ms)    95% Mean Error (ms)
Normal                            Set 1    71.13             11.059                     1.533
Normal                            Set 2    70.39             11.038                     1.530
Normal                            Set 3    71.48             11.025                     1.528
Reduced Image Processing          Set 1    215.20            31.105                     4.311
Reduced Application Processing    Set 1    116.67            32.280                     4.474
Skipped Frames                    Set 1    99.87             28.607                     3.965

All sets consist of 200 trials.
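The "95% Mean Error" column matches the half-width of a 95% confidence interval on the mean: with the Normal condition's 11.06 ms standard deviation and 200 trials, 1.96 × 11.06 / √200 ≈ 1.53 ms, as in the table. Below is a minimal sketch of those summary statistics; the z-value of 1.96 and the sample values in main are assumptions.

    // Summary statistics over a set of lag trials: mean, sample standard
    // deviation, and the half-width of a 95% confidence interval on the mean
    // (using z = 1.96, which is effectively the t-value for n = 200).
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct LagSummary { double mean_ms; double sd_ms; double ci95_ms; };

    LagSummary summarize(const std::vector<double>& lag_ms) {
        const double n = static_cast<double>(lag_ms.size());
        double sum = 0.0;
        for (double v : lag_ms) sum += v;
        const double mean = sum / n;
        double ss = 0.0;
        for (double v : lag_ms) ss += (v - mean) * (v - mean);
        const double sd = std::sqrt(ss / (n - 1.0));     // sample standard deviation
        const double ci95 = 1.96 * sd / std::sqrt(n);    // 95% CI half-width of the mean
        return { mean, sd, ci95 };
    }

    int main() {
        // A handful of made-up lag measurements (ms) just to exercise the code.
        const std::vector<double> lags = {70.2, 72.5, 69.8, 74.1, 71.0, 68.9};
        const LagSummary s = summarize(lags);
        std::printf("mean = %.2f ms, sd = %.3f ms, 95%% mean error = %.3f ms\n",
                    s.mean_ms, s.sd_ms, s.ci95_ms);
        return 0;
    }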
Lag Study
[Plot: Mean Time (ms) vs. Index of Difficulty (bits) at lags of 71 ms, 237 ms, and 371 ms.]

No significant performance change caused by lag.

Lag (ms)                         71      237     371
Mean Time (ms)                   433     448     448
Error Rate                       4.4%    3.5%    4.0%
Index of Performance (bits/s)    10.8    9.9     10.6
Y Intercept (ms)                 24      5       32

Input no longer dependent on timely output.
Could allow users to function when the system is slow.
Feedback Study
[Plot: Mean Time (ms) vs. Index of Difficulty (bits) for the Audio, None, and Cursor feedback conditions.]

The cursor and combined conditions took more time (~6%) but resulted in fewer errors (~30%, for high ID only).

[Bar chart: number of votes for best and worst feedback mechanism (Cursor, Audio, Both, None).]

Interface design could reduce the role of the cursor for some tasks.
Conclusion and Future Work
DMI has very good performance characteristics for selecting stationary targets.
Understanding performance is important to expanding use.
Expand studies to look at more complex interactions.
Improve accuracy and reliability of the DMI system.
Thank You
Q&A