Visual Exploration and Analysis of
Human-Robot Interaction Rules
Hui Zhang and Mike Boyles
Presenter: Hui Zhang
[email protected]
Presented at SPIE EI Conference, 4 February 2013
Available from: http://www.cs.indiana.edu/~huizhang
© Trustees of Indiana University
Released under a Creative Commons 3.0 Unported license; license terms on the last slide.
This paper is about:
• How a visual language interface can help behavioral-data researchers design and manage event- (signal-) driven interactions
– Human-robot interaction (HRI) context
• How to use visualization to simulate interacting social agents
– Signals/events -> color-coded ROIs
– Time sequences
– Joint attention
• Visual mining of multi-stream multimodal data
– Typical duration of events
– Typical "lead-to" relationships among events
Background
• Use HRI to understand the ground truth of social interaction
• Systematically manipulate the robot/avatar's responsive behavior (with programmed patterns)
• Investigate how human partners adapt their behaviors when interacting with robot/avatar agents
Motivation
• A novel interaction paradigm to design and simulate HRI studies and to investigate empirical data with visual analysis techniques
– A visual programming interface to manipulate the triggering among signals and events
– Simulation of interacting agents with information visualization
– Data mining and visual analysis to examine empirical data
Implementation Details
• Event, Action, and Trigger represented as a link-node diagram
– Event. Events are rendered as interactive visual forms representing sensory signals or timer-expiration signals that can be captured in the robot's perceptual interface.
– Action. Actions have a visual syntax that allows manipulation through their combination of textual and pictorial forms.
– Trigger. Users explicitly draw directed links to define the triggering relationship between an event and an action (see the sketch below).
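A minimal sketch of this Event/Action/Trigger rule graph in Python; the class and event names here are illustrative assumptions, not the authors' implementation:

```python
# Minimal sketch of the Event/Action/Trigger rule graph (assumed names,
# not the authors' implementation): events and actions are nodes, and a
# trigger is a directed link from an event to an action.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    name: str          # e.g. a sensory signal or a timer-expiration signal

@dataclass(frozen=True)
class Action:
    name: str          # e.g. a robot behavior such as shifting gaze

@dataclass
class RuleGraph:
    triggers: list = field(default_factory=list)   # directed (event, action) links

    def add_trigger(self, event: Event, action: Action) -> None:
        self.triggers.append((event, action))

    def fire(self, event: Event) -> list:
        """Return all actions triggered by a captured event."""
        return [a for (e, a) in self.triggers if e == event]

# Example: draw a link from a gaze event to a robot response.
graph = RuleGraph()
graph.add_trigger(Event("gaze_on_ROI_1"), Action("look_at_ROI_1"))
print([a.name for a in graph.fire(Event("gaze_on_ROI_1"))])  # ['look_at_ROI_1']
```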
Implementation Details
• Example
Implementation Details
• Primitive events provided by the toolbox can be combined with others to generate customized, more complex events (a sketch follows)
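One hedged way to picture this composition, assuming primitive events are boolean predicates over the current signal state (the predicate names below are hypothetical, not the toolbox's API):

```python
# Hypothetical sketch of event composition: primitive events as boolean
# predicates over the current signal state, combined into complex events
# with all_of/any_of. These names are assumptions, not the toolbox's API.
def gaze_on(roi):
    return lambda state: state.get("gaze") == roi

def holding(obj):
    return lambda state: obj in state.get("in_hand", ())

def all_of(*events):   # composite event: fires only when every part fires
    return lambda state: all(e(state) for e in events)

def any_of(*events):   # composite event: fires when any part fires
    return lambda state: any(e(state) for e in events)

# "Gazing at object A while holding it" as a customized composite event.
joint_attention_on_A = all_of(gaze_on("A"), holding("A"))
print(joint_attention_on_A({"gaze": "A", "in_hand": {"A"}}))  # True
```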
Implementation Details
• Visual debugging by walking through the trigger-relation graph (sketched below)
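One way such a walk-through can work, sketched under assumed data structures (the dictionaries and names below are hypothetical): inject an event, follow trigger links step by step, and report each event-to-action hop so the designer can verify every rule.

```python
# Sketch of a walk-through over the trigger-relation graph (hypothetical
# data structures): inject an event, follow trigger links breadth-first,
# and print each event -> action hop for the designer to verify.
from collections import deque

triggers = {"gaze_on_ROI_1": ["look_at_ROI_1"]}      # event -> actions
emits = {"look_at_ROI_1": ["robot_gaze_shifted"]}    # action -> new events

def walk(start_event, max_steps=10):
    queue, step = deque([start_event]), 0
    while queue and step < max_steps:
        event = queue.popleft()
        for action in triggers.get(event, []):
            print(f"step {step}: {event} -> {action}")
            queue.extend(emits.get(action, []))      # actions may emit events
        step += 1

walk("gaze_on_ROI_1")
# step 0: gaze_on_ROI_1 -> look_at_ROI_1
```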
Implementation Details
• Visual analysis of multi-stream multimodal data
Implementation Details
• Visual mining
– Sequential relationships between events (see the sketch below)
• Which event typically leads to which?
• Which events are typically grouped together?
– Quantitative timing information
• What is the typical interval value for this event?
• What is the typical length of event A when followed by B?
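A toy illustration of the "lead-to" question, assuming the input is a time-ordered list of event labels (illustrative data, not the study's):

```python
# Toy illustration of the "lead-to" question (assumed input format: a
# time-ordered list of event labels): count how often each event is
# immediately followed by each other event.
from collections import Counter
from itertools import pairwise   # Python 3.10+

stream = ["gaze_A", "reach_A", "gaze_partner", "gaze_A", "reach_A"]
lead_to = Counter(pairwise(stream))
for (a, b), n in lead_to.most_common():
    print(f"{a} leads to {b}: {n} time(s)")
# gaze_A leads to reach_A: 2 time(s), ...
```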
Implementation Details
• Visual mining
– Quantitative temporal mining with the QTempIntMiner algorithm (sketched below)
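QTempIntMiner mines temporal patterns together with quantitative interval information. The sketch below only illustrates the kind of input/output involved, with events as (type, start, end) intervals and a simple delay statistic; it is not the algorithm itself, and the data is made up:

```python
# Toy sketch of the kind of input/output QTempIntMiner works with: events
# as (type, start, end) intervals, summarized here by the mean start-to-
# start delay of "A then B". This is NOT the algorithm itself.
from statistics import mean

episodes = [
    [("A", 0.0, 1.2), ("B", 1.5, 2.0)],
    [("A", 0.0, 0.9), ("B", 1.1, 1.8)],
]

delays = [b_start - a_start
          for ep in episodes
          for (ta, a_start, _), (tb, b_start, _) in zip(ep, ep[1:])
          if (ta, tb) == ("A", "B")]
print(f"'A then B' mean start-to-start delay: {mean(delays):.2f}s")  # 1.30s
```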
Implementation Details
• Visual mining
– Event timing information (a sketch follows)
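A minimal sketch of such a timing summary, assuming (type, start, end) event records (illustrative data, not the study's):

```python
# Sketch of an event-timing summary, assuming (type, start, end) records
# (illustrative data): per event type, report a typical (median) duration.
from collections import defaultdict
from statistics import median

records = [("gaze_A", 0.0, 1.2), ("gaze_A", 3.0, 4.5), ("reach_A", 1.2, 2.9)]
durations = defaultdict(list)
for etype, start, end in records:
    durations[etype].append(end - start)
for etype, ds in sorted(durations.items()):
    print(f"{etype}: median duration {median(ds):.2f}s over {len(ds)} event(s)")
```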
Implementation Details
• Visual mining
– A sample of sequential adaptive behavior
Summary
• Transform HRI designs into a meta-level user interface
• A family of visual analysis techniques to simulate, debug, and investigate trigger relationships in HRI
• An integrated visual mining tool to understand the real data
• Together, these form an informative closed loop
Thank you!
Questions?