CareyPresentationQ1


The Implementation of a
Glove-Based User Interface
Chris Carey
Abstract
• Advantages of multi-touch interfaces are being utilized in numerous applications
• This project aims to go a step further
• A glove-based interface provides the utility of a multi-touch interface without the proximity restriction
• A more natural human-computer interaction can also improve efficiency with complicated tasks
Background
• Why now?
  – Accessibility of Technology
  – Increased Application Sophistication
  – Usage in Restrictive Environments
• Why not?
Past and Current Systems
• Glove Systems
  – Haptic Gloves and VR Systems
  – Full Motion Capture Glove Systems
  – Basic Wiimote Glove Systems
• Non-Glove Systems
  – Neural Network Hand Gesture Recognition
  – 3D Model Reconstruction Gesture Recognition
Project Goals
• Focuses:
  – Speed
  – Accuracy
  – Task Simplification
  – Improved User Experience
Hardware Implementation
• Logitech Webcam
  – IR-blocking filter removed
  – Visible-light blocking filter added
• IR LED Glove
  – 3 IR LEDs
  – Two 1.5 V button cell batteries
Software Implementation
• Java and Java Media Framework
• Custom LED Detection
• LED Tracking
• Gesture Recognition
• Command Execution
LED Detection
• Binary Rasterization
• Brightness Threshold Determination
• Blob Comparison
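The steps above can be sketched in Java. This is a minimal illustration of the idea, not the project's actual code: the frame is modeled as a 2D brightness array, and the threshold value and class/method names are assumptions.

```java
// Sketch of brightness thresholding + blob counting on a grayscale frame.
// The 2D int array stands in for a captured webcam frame; the threshold
// value and names here are illustrative, not the project's actual code.
public class LedDetector {
    // Binary rasterization: true where pixel brightness exceeds the threshold.
    static boolean[][] toBinary(int[][] gray, int threshold) {
        boolean[][] bin = new boolean[gray.length][gray[0].length];
        for (int y = 0; y < gray.length; y++)
            for (int x = 0; x < gray[0].length; x++)
                bin[y][x] = gray[y][x] > threshold;
        return bin;
    }

    // Blob comparison starts from connected bright regions; here we just
    // count them with a simple 4-connected flood fill.
    static int countBlobs(boolean[][] bin) {
        int blobs = 0;
        for (int y = 0; y < bin.length; y++)
            for (int x = 0; x < bin[0].length; x++)
                if (bin[y][x]) { blobs++; fill(bin, y, x); }
        return blobs;
    }

    static void fill(boolean[][] bin, int y, int x) {
        if (y < 0 || y >= bin.length || x < 0 || x >= bin[0].length || !bin[y][x]) return;
        bin[y][x] = false;  // mark visited
        fill(bin, y + 1, x); fill(bin, y - 1, x);
        fill(bin, y, x + 1); fill(bin, y, x - 1);
    }
}
```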
LED Tracking
• LED object class
  – Records previous positions and velocities
  – Predicts next position for faster location
• Balance between detected LEDs and tracked LEDs
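The prediction idea can be sketched as follows. Field and method names are illustrative assumptions, not the project's actual LED class:

```java
// Sketch of the LED-tracking idea: record the last observed positions,
// derive a velocity, and predict where to search next. Names here are
// illustrative, not the project's actual LED object class.
public class TrackedLed {
    private int x, y;    // last observed position
    private int vx, vy;  // velocity from the last two observations

    public TrackedLed(int x, int y) { this.x = x; this.y = y; }

    // Record a new detection and update the velocity estimate.
    public void observe(int nx, int ny) {
        vx = nx - x; vy = ny - y;
        x = nx; y = ny;
    }

    // Predict the next position, so detection can search a small window
    // around it instead of the whole frame.
    public int predictedX() { return x + vx; }
    public int predictedY() { return y + vy; }
}
```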
Gesture Recognition
• Static Gestures
  – Do not depend on absolute location
  – Performed and executed once
• Dynamic Gestures
  – Do depend on absolute location
  – Performed and executed continuously
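The static/dynamic split suggests a dispatch pattern like the one below: a static gesture executes once per recognition, while a dynamic gesture re-executes every frame while its condition holds. The `Gesture` type and its update loop are assumptions for illustration, not the project's actual design:

```java
// Sketch of the static/dynamic distinction: a static gesture fires once
// when recognized; a dynamic one is re-executed each frame while it holds.
// The Gesture type and dispatch logic are illustrative assumptions.
abstract class Gesture {
    abstract boolean recognized();          // condition on current LED data
    abstract void execute();                // mapped command
    boolean continuous() { return false; }  // dynamic gestures override to true
    private boolean fired = false;

    // Called once per processed frame.
    void update() {
        if (!recognized()) { fired = false; return; }
        if (continuous() || !fired) { execute(); fired = true; }
    }
}
```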
Static Gesture: Minimize
• Triggered by a decreasing distance between three LEDs
• The java.awt.Robot class executes the keystroke sequence ALT+SPACE, then 'n'
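A minimal sketch of that command, using the real `java.awt.Robot` API; the class name and the 100 ms delay are assumptions. ALT+SPACE opens the Windows system menu and 'n' selects Minimize, so this sequence is Windows-specific:

```java
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.KeyEvent;

// Synthesizes ALT+SPACE followed by 'n': the Windows system-menu shortcut
// for minimizing the active window. The delay gives the menu time to open;
// its length is an illustrative assumption.
public class MinimizeCommand {
    public static void run() throws AWTException {
        Robot robot = new Robot();
        robot.keyPress(KeyEvent.VK_ALT);
        robot.keyPress(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_SPACE);
        robot.keyRelease(KeyEvent.VK_ALT);
        robot.delay(100);               // let the system menu appear
        robot.keyPress(KeyEvent.VK_N);
        robot.keyRelease(KeyEvent.VK_N);
    }
}
```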
Dynamic Gesture: Mouse Pointer
• Tracks the LED with the greatest y-value
• Executed when no other gesture is recognized
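A sketch of the two pieces this gesture needs, selecting the topmost LED and mapping camera coordinates to screen coordinates; the resolutions and names are assumptions:

```java
// Sketch of the mouse-pointer gesture: pick the LED with the greatest
// y-value, then scale its camera coordinates to screen coordinates.
// Class/method names and resolutions are illustrative assumptions.
public class PointerGesture {
    // Returns the index of the LED with the greatest y-value.
    static int topLed(int[][] leds) {   // leds[i] = {x, y}
        int best = 0;
        for (int i = 1; i < leds.length; i++)
            if (leds[i][1] > leds[best][1]) best = i;
        return best;
    }

    // Scale camera coordinates (e.g. 640x480) to screen coordinates.
    // A Robot would then issue: new Robot().mouseMove(sx, sy);
    static int[] toScreen(int camX, int camY, int camW, int camH,
                          int scrW, int scrH) {
        return new int[] { camX * scrW / camW, camY * scrH / camH };
    }
}
```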
Dynamic Gesture: Drag and Drop
• Triggered when the distance between two LEDs decreases at a minimum velocity
• DRAG: the minimum distance between the LEDs is maintained
• DROP: the distance between the LEDs exceeds the minimum distance
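The drag/drop logic amounts to a small state machine over the per-frame LED separation. This is a sketch under assumed thresholds, not the project's actual values or class:

```java
// Sketch of the drag/drop state machine: a fast pinch (two LEDs closing
// faster than a velocity threshold) starts a drag; the drag ends when the
// separation exceeds the drop distance. Thresholds are illustrative.
public class DragDrop {
    private boolean dragging = false;
    private double prevDist = Double.NaN;
    private final double closeVelocity; // min closing speed (px/frame)
    private final double dropDist;      // separation that releases

    public DragDrop(double closeVelocity, double dropDist) {
        this.closeVelocity = closeVelocity;
        this.dropDist = dropDist;
    }

    // Feed the distance between the two LEDs once per frame;
    // returns whether a drag is currently active.
    public boolean update(double dist) {
        if (!dragging && !Double.isNaN(prevDist)
                && prevDist - dist >= closeVelocity) {
            dragging = true;              // DRAG: fast pinch detected
        } else if (dragging && dist > dropDist) {
            dragging = false;             // DROP: LEDs separated again
        }
        prevDist = dist;
        return dragging;
    }
}
```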
Planned Gestures
• Single Hand Gestures
  – Mouse Click
  – Mouse Scroll
  – Window Maximize/Restore
• Two Handed Gestures
  – Window Selection
  – Object Resize
  – Object Zoom
  – Object Rotate
Preliminary Analysis
• Speed
  – Mode length of time for each iteration: 47 ms
  – Slower than the 24 fps (≈41 ms per frame) required for motion to appear continuous to human perception
• Accuracy
  – Poor LED detection has led to poor gesture recognition
  – Brighter LEDs or a more sensitive camera necessary
Possible Solutions
• Brighter IR LEDs
• LED pulse driving circuit
• Webcam with night vision
• IR narrow band-pass filter
Work Remaining
• Improved hardware
• Refined LED detection/tracking
• Quicker processing
• Increased gesture support
• Application control
Conclusions
• Speed and accuracy are still an issue
• The minimize static gesture simplifies the task compared to a mouse interface
• The glove interface constantly receives IR light
• Multi-touch gestures activate only when IR light is triggered
• The requirement of direct contact in multi-touch interfaces ensures consistency