Ubiquitous Computing
Max Mühlhäuser, Iryna Gurevych (Editors)
Part V : Ease of Use
Chapter 19: Advanced Hands-and-Eyes Interaction
Michael Weber, Marc Hermann
Introduction
• Interaction in the real world
– major principle: combined use of
• the human visual system
• the motor system
• Hand-eye coordination dominant in computer use today
– visual display with action elements visualized
– keyboard to enter text and symbols
– pointing device (mouse)
• place interaction focus
• click on / activate interaction elements
Advanced Hands-and-Eyes Interaction :
2
CMN Model Human Processor
• Card, Moran and Newell (1983)
• Simplified model of the human as an information processor
• Hands-and-eyes interaction:
– visual output seen with the eyes
– physical light and color stimuli enter the perceptual processor
– short-term visual memory stores impressions
– content interpreted by the cognitive processor
– decision on storing or reacting to stimuli
– instruct the motor system
– move the hand (e.g. position the mouse pointer)
Speed of Cognitive System
• Empirical studies
– quantify capacities of the cognitive system
– measure durations
• Durations
– perceptual system -> short-term memory: 100 ms
– interpreting the impression in the visual cortex: 170 ms
– informing the motor system and moving the hand: 230 ms
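Chaining the three stage durations above gives a rough lower bound for one perceive-decide-act cycle; a minimal sketch (the 100/170/230 ms figures are taken from this slide):

```python
# Approximate stage durations from the CMN Model Human Processor (ms),
# as quoted on this slide.
PERCEPTUAL_MS = 100  # stimulus -> short-term visual memory
COGNITIVE_MS = 170   # interpreting the impression in the visual cortex
MOTOR_MS = 230       # informing the motor system and moving the hand

def simple_reaction_time_ms():
    """Rough lower bound for a complete perceive-decide-act cycle."""
    return PERCEPTUAL_MS + COGNITIVE_MS + MOTOR_MS

print(simple_reaction_time_ms())  # -> 500
```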
Fitts’ Law
• Observations of pointing tasks with varying movement distances
• Experiments on the time needed to move the hand to the target
– distance D
– target size S
• Results formulated as a linear function for the movement time (MT)
– a and b are experimentally derived values of human performance
MT = a + b * log2(2D / S)

where Id = log2(2D / S) is the index of difficulty
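The linear form of Fitts' law is easy to evaluate directly; a small sketch (the constants a and b are hypothetical placeholder values, since they must be measured per device and user):

```python
import math

def fitts_mt(D, S, a=0.1, b=0.1):
    """Fitts' law: MT = a + b * log2(2D / S).

    D: movement distance to the target
    S: target size
    a, b: experimentally derived constants (hypothetical values here, in seconds)
    """
    index_of_difficulty = math.log2(2 * D / S)
    return a + b * index_of_difficulty

# Doubling the distance (or halving the target size) raises the index
# of difficulty by 1, i.e. adds exactly b to the movement time:
mt_near = fitts_mt(D=8, S=2)   # Id = log2(8) = 3
mt_far = fitts_mt(D=16, S=2)   # Id = log2(16) = 4
```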
Human Interaction Cycle (Norman)
Process- and action-oriented view
1. Forming the goal
2. Forming the intention
3. Specifying an action
4. Executing an action
5. Perceiving the state of the world
6. Interpreting the state of the world
7. Evaluating the outcome
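The seven stages form a closed loop: after evaluating the outcome, the user forms the next goal. A minimal illustrative sketch (the cyclic indexing is the only assumption added here):

```python
# Norman's seven stages of action, in execution/evaluation order.
STAGES = [
    "Forming the goal",
    "Forming the intention",
    "Specifying an action",
    "Executing an action",
    "Perceiving the state of the world",
    "Interpreting the state of the world",
    "Evaluating the outcome",
]

def next_stage(i):
    """After evaluating the outcome, the cycle starts over at the goal."""
    return (i + 1) % len(STAGES)

print(STAGES[next_stage(6)])  # -> Forming the goal
```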
Traditional desktop computer interaction
• Keyboard
– major input device for text and symbols
• Mouse
– pointing device
– other pointing devices also in use (joystick, touch pads, styluses)
• Screen
– output device
-> Interaction obeys Fitts' Law
-> Developer needs to find adequate affordances, mappings and constraints
Towards Ubiquity
Diversification of devices <-> Convergence of functionality
• Everything happens in physical 3-D world
• Need for
– enhanced mobility
– flexibility
– context and situation dependence
– new interaction metaphors
• Shift from explicit interaction towards implicit interaction
– context of use and situation
– more natural interaction
Multiscale Output
• Visual displays
– present information to the user
– all sizes possible in ubiquitous computing
– characteristics specific to the ubiquitous computing case
• Device types
– Monitors
– Projective displays
– Surround-screen displays
– Head-mounted displays
– Virtual retinal displays
– Handheld displays
– Haptic displays
– Auditory displays
– Peripheral displays
Visual display characteristics
• Field of regard
– visual angle of the display surrounding the user
• Field of view
– angle seen by the user at one instant in time
• Spatial resolution
– function of pixel size
– depends on number of pixels, screen size and viewing distance
• Light transfer
– front projection, rear projection or head-mounted display?
• Display mobility
– is the display mobile or stationary (depending e.g. on its size)?
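Spatial resolution as a function of pixel size and viewing distance can be made concrete as the visual angle one pixel subtends; a sketch (the screen dimensions below are hypothetical examples, not from the slides):

```python
import math

def pixel_visual_angle_arcmin(screen_width_m, pixels_across, viewing_distance_m):
    """Visual angle subtended by one pixel, in arc minutes.

    Smaller values mean higher effective spatial resolution.
    """
    pixel_size = screen_width_m / pixels_across
    angle_rad = 2 * math.atan(pixel_size / (2 * viewing_distance_m))
    return math.degrees(angle_rad) * 60

# A 0.5 m wide, 1920-pixel monitor viewed from 0.6 m:
monitor = pixel_visual_angle_arcmin(0.5, 1920, 0.6)
# The same pixel count projected 3 m wide, viewed from 2 m:
projection = pixel_visual_angle_arcmin(3.0, 1920, 2.0)
# The projection's pixels subtend a larger angle -> lower spatial
# resolution, matching the "pixels physically large" remark for
# projective displays.
```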
Projective Displays
• Visual output on large areas
– field of regard and field of view extended
– spatial resolution limited (pixels physically large)
• Front projection (a)
– projector on same side as viewer
– shadows can be cast by viewer
• Rear projection (b)
– projector behind screen
– no shadows
Surround-screen Displays
• Three or more projection-based screens
• Typically rear-projected
• Large field of regard and field of view
• Immersive perception
Head-mounted displays (HMDs)
• Attached to the viewer's head
• Fully virtual display (a)
– blocks out the real world
– virtual, computer-generated environment
• Video see-through display (b)
– captures the real-world scene
– the captured real-world scene is combined with the virtual scene
• Optical see-through display (c)
– semi-transparent mirrors
– real world seen through the mirrors
– virtual scene displayed on the mirrors
Output device types - Summary
• Monitors: stationary, high resolution
• Projective displays: stationary, limited spatial resolution
• Surround-screen displays: stationary, large field of regard and field of view
• Head-mounted displays: portable
a) fully virtual view: field of view restricted to the virtual environment
b) video see-through: vision restricted by the quality of the video
c) optical see-through: field of view and peripheral vision not much restricted
• Haptic displays: perceive surface textures or temperature; wearable
• Auditory displays: 360° field of regard with 3-D sound
• Peripheral displays: perceivable without being in the field of view
Input device characteristics
• degrees of freedom (DOF)
– number of independent parameters provided
– e.g. mouse: x and y values = 2 DOF
• discrete vs. continuous input
– frequency with which data values are provided
– triggering of when data values are provided
– discrete: single data value when an action is performed
– continuous: stream of data, mostly without a specific action
• active vs. passive input
– active: user actively operates the input device
– passive: delivers captured data values all the time
Input device types - Summary
• Keyboards: discrete, active
• 2-D mice and pointers: discrete and continuous components
• Tracking devices: position and orientation detection
• 3-D mice: hybrid of 2-D mice and tracking devices
Integration of physical and virtual world
• Virtuality continuum (Milgram and Kishino)
– real world with real objects (Real Environment)
– computer-generated virtual environments (Virtual Reality)
– mixture of both = Mixed Reality
– real world augmented with some virtual objects = Augmented Reality
– virtual world augmented with some real objects (e.g. users walking in the virtual world) = Augmented Virtuality
Augmented Reality
• Combination of real and virtual
• Interactive in real time
• Registered in 3-D (virtual objects registered in the real world)
• Tracking
– head tracking (head-mounted devices on persons)
– object tracking (position and orientation of real objects)
Characterization of Tracking Techniques
• refresh rate and latency
– 25 frames/s (threshold of human perception)
– high latencies cause imprecise correlation
• accuracy
– different resolution depending on the application
• reliability
– accumulated errors destroy the coordinate correlation
• degrees of freedom (DOF)
– 6 DOF for 3-dimensional space
• multi-object capability
– one tracking system tracks several objects and persons at once
– statically installed in rooms (e.g. virtual environments)
• portability
– limited in size and weight
– energy consumption
– distance range of sensors
– environmental conditions (light, magnetic fields, occlusions, etc.)
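The 25 frames/s figure translates directly into a per-update latency budget for the tracker; a small sketch:

```python
def frame_budget_ms(refresh_hz):
    """Time available per tracking update before the next frame is due."""
    return 1000.0 / refresh_hz

# At the 25 frames/s the slide cites as the perceptual threshold,
# each pose update must arrive within 40 ms:
budget = frame_budget_ms(25)  # -> 40.0
```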
Towards Ubiquitous Computing
• mobile augmented reality systems
– pedestrian navigation support (Mahler et al.)
– game applications (Wagner et al.)
– input and output devices small and light-weight (portability)
• future directions
– wearable computing: active input devices in our clothes
• acceptance by users doubtful
– PDAs and smart phones
• enhanced by surrounding equipment
• shared input/output devices in environment
• Weiser's dream
Attention Detection
• Where does the user look?
– head and eye position
• clustering skin-colored areas in video
• Theis et al.
– detecting the eye position
Attention Detection
• Morimoto et al. 2000
– Camera with infrared LEDs
– Quick detection of multiple pairs of eyes
– Red-eye effect (a) and black-eye effect (b) images
– Position of the corneal reflection reveals the gaze direction
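The bright-pupil ("red-eye") and dark-pupil images can be subtracted so the pupils stand out; a minimal sketch of that differencing step on synthetic arrays (illustrative only, not Morimoto et al.'s actual pipeline):

```python
import numpy as np

def pupil_candidates(bright_img, dark_img, threshold=100):
    """Pixels much brighter in the on-axis (red-eye) image than in the
    off-axis (dark-eye) image are pupil candidates."""
    # Signed arithmetic so the subtraction cannot wrap around.
    diff = bright_img.astype(np.int16) - dark_img.astype(np.int16)
    return diff > threshold

# Synthetic 5x5 frames: one 'pupil' pixel glows only under on-axis IR light.
bright = np.full((5, 5), 20, dtype=np.uint8)
dark = bright.copy()
bright[2, 3] = 200  # bright-pupil response at (2, 3)

mask = pupil_candidates(bright, dark)
# mask is True only at the pupil location
```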
Attention-Awareness
• Reingold et al.
– Gaze-Contingent Display
– image with blurred regions
– high-resolution region at the user's focus
– region of high resolution follows the gaze
• Barth et al.
– guidance of eye movement by a red dot
– additionally:
• measuring eye movement
• several options anticipated
• chance of one option increased by the red dot
• other options reduced by blurring
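A gaze-contingent display only needs the distance of each screen region from the current gaze point to pick a blur level; a toy sketch (the pixel coordinates and radius are invented for illustration):

```python
import math

def blur_level(region_center, gaze, sharp_radius=100.0):
    """0 = full resolution near the gaze point; higher values = more blur.

    Blur grows by one level per sharp_radius of distance from the gaze,
    approximating a high-resolution region that follows the eyes.
    """
    dx = region_center[0] - gaze[0]
    dy = region_center[1] - gaze[1]
    dist = math.hypot(dx, dy)
    return int(dist // sharp_radius)

gaze = (400, 300)
focus = blur_level((420, 310), gaze)      # near the focus: rendered sharp
periphery = blur_level((900, 300), gaze)  # far from the focus: blurred
```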
Peripheral awareness
• Move information to periphery
– reduce information overload
– easily monitor information
• Classification of most common terms (Pousman)
– Notification system
• multitasking scenario, multiple output devices
– Peripheral display
• design purpose: information in secondary task
• nearly all peripheral displays are notification systems
– Ambient display
• subclass of peripheral displays
• aesthetic design
• few intuitive states of information
Affection
• Goal: factoring emotions into the design process of user interfaces
• Three levels of the brain (Norman 2004)
Level      | Part of brain                | Characteristics of UI
Visceral   | Automatic, prewired          | Appealing appearance (visual, sound, haptics, taste...)
Behavioral | Control of everyday behavior | Effective (enjoyment of usage)
Reflective | Contemplative, reflective    | Personal relationship or challenge
Tangible Interaction
• digital information represented with real objects
– no mouse and keyboard needed for
• direct manipulation
• usage of information
– behavioral (and in many cases visceral) design
• early example: the Marble Answering Machine
– designed by TUI pioneer Durell Bishop
– telephone answering machine
– phone-call messages are associated with marbles
– haptic interaction process for functions like replay, call-back, storage, deletion
Marble Answering Machine
Tangible User Interfaces
• universal description by Fishkin:
– a physical manipulation input event is performed by the user
– the event is sensed by the computer system, altering its state
– the system gives physical feedback
• Toolkit: Papier-Mâché
– free Java toolkit for prototyping TUIs
– input: camera, scanner, RFID reader
• any object can be used
– output: acoustic, on-screen
– event types
• phobAdded: new object detected
• phobUpdated: change of object attributes
• phobRemoved: loss of contact with an object
– events can be interpreted by any application
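Papier-Mâché itself is a Java toolkit; the phob event model can be sketched language-independently. A Python sketch of the same dispatch pattern (the event names mirror the slide; the class and method names are invented for illustration and are not the toolkit's actual API):

```python
class PhobDispatcher:
    """Minimal event dispatcher in the style of Papier-Mâché's
    phobAdded / phobUpdated / phobRemoved events (illustrative only)."""

    def __init__(self):
        self.listeners = []

    def subscribe(self, listener):
        """Register a callback taking (event_type, obj)."""
        self.listeners.append(listener)

    def emit(self, event_type, obj):
        # event_type is one of "phobAdded", "phobUpdated", "phobRemoved"
        for listener in self.listeners:
            listener(event_type, obj)

# An application interprets the events however it likes:
log = []
dispatcher = PhobDispatcher()
dispatcher.subscribe(lambda ev, obj: log.append((ev, obj)))
dispatcher.emit("phobAdded", "marble-1")    # new object detected
dispatcher.emit("phobRemoved", "marble-1")  # loss of contact with the object
```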
Gesture Interaction
• Body language = additional interaction technique
• No physical device needed by user
• Examples:
– Fails et al.
• controls projected onto surfaces like armchair, desk etc.
• hand movement/gestures detected by two cameras
– Hardenberg et al.
• finger-finding and hand-posture recognition algorithm
– Ståhl et al.
• model of movement and emotions
– Shape: 3D body movement
– Effort: dynamics of movement – time, space, weight, flow
– Valence: degree of pleasure
• better learning of gestures
• computer can react to the user's emotions
Summary
• hands and eyes interaction
– basics from pre-ubiquitous computing
– one desktop computer with mouse and keyboard
• input and output devices
– types
– characteristics
– beyond desktop metaphor
• new interaction techniques
– attention
– affection
Future Research Directions
• Calm technology
– reduce interruption by notifications
• e.g. e-mails, SMS, online-chat
• less distraction from primary task
– direct attention to where it was meant to be
• where designers of system intended to have it
• system knows where attention is and can react
• Perceivable risk
– analyzing the user and his behavior -> the user can be influenced
• e.g. by the advertising industry
• need for an option to switch off (cf. television or Java popups)
• Goals
– more intuitive and fun-to-use interfaces
– appealing design
– useful functions
– no disturbance of the user in the main task