
2008 Symposium on Digital Interactive Technology and Industry Applications
Technologies and Trends of Robots with Interactive Facial Expressions
Approaches to Interactive Emotional Robots
謝銘原
Robotics Research Center, Southern Taiwan University, Taiwan
Approaches to Interactive Emotional Robots, IEEE IECON 2007
Outline

- Introduction to Interactive Emotional Robots
- Famous Research on Emotional Robots
  - KISMET
  - Hanson Robotics
- Discussions on related technologies
- Conclusions
Introduction

- Interactive Emotional Robots
  - Anthropomorphic robots display a unique, artificial subconscious, partly due to their
    - cognitive understanding of language-based interactive speech,
    - conversational capabilities and genuine eye contact,
    - coupled with a full range of human facial expressions.
- The key technologies consist of
  - anthropomorphic artificial musculature and skin
  - A.I. software
Famous Research on Emotional Robots 1/2

- Kismet (MIT)
  - She can engage people in natural and expressive face-to-face interaction.
- Reddy (RoboMotio)
  - By combining facial expressions with arm movements, it can easily express a wide range of emotions such as joy, anger, sadness, surprise, or disgust.
Famous Research on Emotional Robots 2/2

- Hanson Robotics
  - The Albert Hubo: a collaboration of East and West
    - The Albert Hubo is the first-ever walking robot with realistic, humanlike expressions.
  - Eva: the best of all worlds
    - Eva is a humanlike robot of universal beauty, achieved by incorporating a mixture of ethnic characteristics.
  - Zeno: the smartest and coolest robot
    - Zeno is the first of his kind.
    - Zeno lives in the "Inventing Academy" in the year 2027 with a whole group of other robot kids, learning and fighting to save humanity.
Kismet – 1/5

- The Hardware Design
  - the high-level perception system,
  - the motivation system,
  - the behavior system,
  - the motor skill system,
  - the face motor system
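As a rough sketch, the five systems above can be wired into a single perceive-act cycle. All class, method, and drive names below are illustrative assumptions for exposition, not Kismet's actual implementation:

```python
# Illustrative sketch of Kismet's five-system architecture as one cycle:
# perception -> motivation -> behavior -> motor skills -> face motors.
# Names and numeric values are assumptions, not the MIT code.

class PerceptionSystem:
    def perceive(self, stimulus):
        # Classify a raw stimulus into a high-level percept.
        return {"kind": stimulus, "intensity": 1.0}

class MotivationSystem:
    def __init__(self):
        # Drive values model urgency; higher means more pressing.
        self.drives = {"social": 0.5, "stimulation": 0.5, "fatigue": 0.0}

    def update(self, percept):
        # Seeing a face satiates the social drive; a toy, the stimulation drive.
        if percept["kind"] == "face":
            self.drives["social"] = max(0.0, self.drives["social"] - 0.2)
        elif percept["kind"] == "toy":
            self.drives["stimulation"] = max(0.0, self.drives["stimulation"] - 0.2)
        return self.drives

class BehaviorSystem:
    def select(self, drives):
        # Pick the behavior serving the most pressing drive.
        urgent = max(drives, key=drives.get)
        return {"social": "engage_person",
                "stimulation": "play_with_toy",
                "fatigue": "sleep"}[urgent]

class MotorSkillSystem:
    def sequence(self, behavior):
        # Expand a behavior into motor-skill primitives.
        return [behavior + "_orient", behavior + "_express"]

class FaceMotorSystem:
    def execute(self, skills):
        # Drive the face actuators for each primitive (stubbed as strings).
        return ["servo:" + s for s in skills]

def kismet_cycle(stimulus):
    percept = PerceptionSystem().perceive(stimulus)
    drives = MotivationSystem().update(percept)
    behavior = BehaviorSystem().select(drives)
    skills = MotorSkillSystem().sequence(behavior)
    return FaceMotorSystem().execute(skills)
```

The point of the layering is that each system only consumes the previous one's output, which matches the modular decomposition listed on the slide.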
Kismet – 2/5

- Vision System
- Auditory System
- Expressive Motor System
  - a 15-DoF face
- Vocalization System
Kismet – 3/5
Kismet – 4/5

- Social Amplification
Kismet – 5/5

- Facial expressions
- Looking at
- Searching for the objective
- Affective responses
Hanson Robotics – 1/5

Each HumanKind robot is


individually hand-crafted
built to perform in a wide variety
of applications, including




Entertainment
Research
Animation
Consumer households
Hanson Robotics – 2/5

- All HumanKind robots have the following capabilities:
  - Emulate over 62 facial and neck muscles
    - providing anthropomorphic facial expression
  - Embedded micro-cameras
    - providing vision recognition
  - A.I. software technology for
    - face and speech recognition,
    - eye tracking, and
    - conversational operations
Hanson Robotics – 3/5

- Capable of local/remote puppeteering
- Portable, and can run on as little as 1/20 of the power required for comparable products
- Interfaces with standard computers (included with robot)
- Can function in a variety of research environments, including
  - computer vision,
  - computational interaction, and
  - speech perception
Hanson Robotics – 4/5

Zeno robot – 17” tall, weigh 6 lbs.
battery power
 learns through artificial intelligence
 A character robot that can see, hear, talk and
remembers who you are
 Wirelessly controlled by a PC

He can view a 3D mental image of his environment to
determine and control physical action and reactions, much
like we do as humans.

He then has the ability to navigate, make facial expressions and
move his body based on what he sees around him.
Hanson Robotics – 5/5

- A character engine with speech recognition and conversational AI for language reasoning, so that
  - Zeno can recognize and remember both speech and faces, and interact accordingly
Discussions on related technologies

- The key technologies for emotional expression consist of
  - anthropomorphic artificial musculature and skin
  - A.I. technologies (e.g., the Character Engine software in Hanson Robotics' HumanKind robots)
Human musculature of the face
Artificial musculature

[US Patent 7,113,848, Sep. 26, 2006]

- The head of a typical HumanKind™ robot has 32 DOF to simulate human musculature in the face and the neck:
  (1) 4 DOF in the neck (turn, tilt, nod-upper, nod-lower)
  (2) 3 DOF in the eyes (left eye turn, right eye turn, eyes up and down)
  (3) 1 DOF for the jaw
  (4) 3 DOF for the eyelids (2 upper eyelids, coupled lower eyelids)
  (5) 21 DOF servos in the face: smile left, smile right, frown left, frown right, "ee" left+right, lower lip center up+out, upper lip center, lower lip ¾ left+right, upper lip ¾ left+right, sneers, eye-scrunches left+right, outer brows left+right, inner brows left+right, and brow center
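The five-group breakdown above can be captured as a simple lookup table, which also lets one check that the group counts really sum to the stated 32 DOF. The dict layout is my own; the group names and counts come from the slide:

```python
# 32-DOF head breakdown from US Patent 7,113,848 as listed on the slide.
# The data structure is illustrative; counts follow the patent's grouping.
HEAD_DOF = {
    "neck": 4,      # turn, tilt, nod-upper, nod-lower
    "eyes": 3,      # left eye turn, right eye turn, eyes up/down
    "jaw": 1,       # jaw open/close
    "eyelids": 3,   # 2 upper eyelids, coupled lower eyelids
    "face": 21,     # smile, frown, "ee", lip, sneer, scrunch, and brow servos
}

# Sanity check: the groups must add up to the 32 DOF the patent states.
assert sum(HEAD_DOF.values()) == 32
```

A table like this is a natural starting point for a servo controller, since each group maps onto a bank of actuator channels.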
Artificial skin

- How is anthropomorphic skin achieved? [US Patent 7,113,848, Sep. 26, 2006]
Artificial musculature and skin

- To simulate musculature and skin [US Patent 7,113,848, Sep. 26, 2006]
Artificial mouth and eyes

- The dynamic actions of the artificial lips, the artificial eyes, and the artificial eyelids [US Patent 7,113,848, Sep. 26, 2006]
Artificial Intelligence of the Emotional System

- Emotional expression
  - happy, sad, afraid, disgusted, angry, surprised, contemplative, confused, …
- The A.I. system integrates these techniques:
  - computer vision,
  - face detection and identification,
  - speech recognition,
  - natural language processing,
  - speech synthesis, and
  - motion control
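To illustrate how the motion-control end of such a pipeline might connect a recognized emotion to face servos, here is a minimal sketch. The emotion labels come from the slide; the servo names, pose values, and the `express` function are hypothetical:

```python
# Hypothetical mapping from a recognized emotion to face-servo targets.
# Servo names and target values (0.0-1.0) are illustrative assumptions.
EXPRESSION_POSES = {
    "happy":    {"smile_left": 0.9, "smile_right": 0.9, "inner_brow": 0.3},
    "sad":      {"smile_left": 0.1, "smile_right": 0.1, "inner_brow": 0.8},
    "surprise": {"smile_left": 0.5, "smile_right": 0.5, "inner_brow": 1.0},
}

def express(emotion, current, step=0.5):
    """Move each named servo a fraction `step` of the way toward the
    target pose, so expressions blend smoothly rather than snapping."""
    target = EXPRESSION_POSES[emotion]
    return {name: current.get(name, 0.5) + step * (pos - current.get(name, 0.5))
            for name, pos in target.items()}

# Starting from a neutral face, one update moves halfway to "happy".
neutral = {"smile_left": 0.5, "smile_right": 0.5, "inner_brow": 0.5}
pose = express("happy", neutral)
```

Interpolating toward a target pose, rather than jumping to it, is one common way to keep servo motion looking natural; repeated calls converge on the full expression.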
The future of emotional robots

- The challenges
  - More anthropopathic facial expression with real emotions
    - emotion detection and recognition
    - micro servo actuators and their controls
    - more DOFs
  - More anthropopathic skin
    - bionic material
    - sensory and reflective: sensitive to force, pressure, and temperature
  - More intelligent
    - integration of multiple perception and recognition capabilities
    - learning and emulating capabilities
    - simple but powerful algorithms
Conclusions

- Interactive emotional robots need to be developed with
  - Learning of social behaviors during human-robot play
    - Kismet: to imitate an infant and acquire intelligence through natural behavior
  - A flexible facial framework to display anthropopathic expressions
    - Hanson Robotics: to emulate human facial musculature and skin, and to provide a sufficient set of preprogrammed basic facial expressions (e.g., ZENO)
Thank you for your attention