A Novel Human Centric CPS to Improve Motor/Cognitive
Assessment and Enable Adaptive Rehabilitation
PI: Fillia Makedon; Co-PIs: Vassilis Athitsos, Heng Huang, Dan Popa
Computer Science and Engineering department, University of Texas at Arlington
Abstract
Cerebral Palsy (CP) is a group of disorders that can involve brain and
nervous system functions, such as movement, learning, hearing,
seeing, and thinking.
This project will research new methods and tools for motor/cognitive
assessment of young children (5-8 years old) with Cerebral Palsy (CP).
It will develop a multimodal adaptive game system called CPLAY that
integrates multiple views of cyber and physical components and provides
both rehabilitation and an assessment mechanism for rehabilitation
progression through game activity monitoring.
User-Friendly Robotic Actuators
• Novel Human-Robot Interaction Systems to promote engagement
• The robot can interact with children in three modes (a minimal sketch follows this list):
• Interactive Mode: the robot responds to the user's actions
• Teaching Mode: the robot leads the child through exercises
• Reward Mode: the robot rewards appropriate actions
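The sketch below shows the three modes as a simple mode dispatcher in Python. It is illustrative only, not the project's actual robot controller; the class and method names are assumptions.

# Minimal sketch (not the project's actual controller): the three CPLAY robot
# interaction modes as a simple mode dispatcher. Names are illustrative.
from enum import Enum, auto


class RobotMode(Enum):
    INTERACTIVE = auto()   # robot responds to the child's actions
    TEACHING = auto()      # robot leads the child through an exercise
    REWARD = auto()        # robot rewards an appropriate action


class RobotSession:
    def __init__(self, mode: RobotMode = RobotMode.INTERACTIVE):
        self.mode = mode

    def on_child_action(self, action: str) -> str:
        """Return a robot behavior label for the given child action."""
        if self.mode is RobotMode.INTERACTIVE:
            return f"mirror:{action}"           # respond to the user's action
        if self.mode is RobotMode.TEACHING:
            return "demonstrate_next_exercise"  # robot leads the exercise
        return "play_reward_animation"          # robot rewards appropriate action


print(RobotSession(RobotMode.TEACHING).on_child_action("wave"))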
Sample Games and Metrics
Bubble Popping: a basic game in which balloons float across a touch
screen from the bottom left to the upper right. The user tries to
catch/pop as many balloons as possible by touching the screen. While
the subject is playing, a range of metrics is extracted from the
generated events: Score, the number of points won; Delay of response;
Accuracy/Precision, how close the touch is to the balloon's center;
and Positioning, the reach of the user, used to identify screen areas
with a higher success rate.
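The Python sketch below illustrates how these metrics could be computed from logged touch events. The event fields, the 1920-pixel screen-width default, and the left/right split used for Positioning are assumptions for illustration, not the actual CPLAY implementation.

# Illustrative sketch of the Bubble Popping metrics; the event fields and the
# screen-midpoint split used for Positioning are assumptions, not CPLAY code.
from dataclasses import dataclass
from math import hypot
from statistics import mean


@dataclass
class TouchEvent:
    t_spawn: float      # time the balloon appeared (s)
    t_touch: float      # time the user touched the screen (s)
    touch_xy: tuple     # touch position in pixels
    bubble_xy: tuple    # balloon center at touch time, in pixels
    hit: bool           # whether the balloon was popped


def extract_metrics(events, screen_width_px: int = 1920):
    hits = [e for e in events if e.hit]
    score = len(hits)                                            # points won
    delay = mean(e.t_touch - e.t_spawn for e in hits) if hits else None
    accuracy = (mean(hypot(e.touch_xy[0] - e.bubble_xy[0],
                           e.touch_xy[1] - e.bubble_xy[1]) for e in hits)
                if hits else None)                               # px from center
    # Positioning: success rate per screen half, to flag regions of higher success.
    mid = screen_width_px / 2
    left = [e.hit for e in events if e.touch_xy[0] < mid]
    right = [e.hit for e in events if e.touch_xy[0] >= mid]
    positioning = {"left_success": mean(left) if left else None,
                   "right_success": mean(right) if right else None}
    return {"score": score, "delay_s": delay,
            "accuracy_px": accuracy, "positioning": positioning}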
Other sample games:
• Baking Game
• Maze exploration
• Gesture mimicking (data glove)
Selected Publications
"Modeling the Effect of Attention Deficit in Game-Based Motor Ability Assessment of Cerebral Palsy Patients" ,
M. Gardner, V. Metsis, E. Becker and F. Makedon, To appear in the proceedings of the 6th Workshop on Affect
and Behaviour Related Assistance, in PETRA 2013.
"Hands-Free Human Computer Interaction Framework with a Dialogue System", Georgios Galatas, Alexandros
Papangelis, Fillia Makedon, MHCI 2013
"Application of Data Mining Techniques to Determine Patient Satisfaction", Georgios Galatas, Dimitrios Zikos,
Fillia Makedon, PETRA 2013.
"Robust multi-modal speech recognition in two languages utilizing video and distance information from the
Kinect", Georgios Galatas, Gerasimos Potamianos, Fillia Makedon, International Conference on Human
Computer Interaction - HCI 2013.
Robot Zeno and a child waving; marker-based motion capture for kinematic data.
Therapist-controlled interaction with Zeno.
Motion Tracking
Modeling of attention deficit in
children with CP and its effect
Cerebral Palsy is frequently associated with diagnosed Attention
Deficit (Hyperactivity) Disorder (ADD/ADHD), and affected children are
easily distracted. We perform attention-deficit simulation experiments
with able-bodied users playing three rehabilitative games, as well as
with similar computer-generated data, and propose a methodology to
model and eliminate the effects of attention deficit or distraction
from the scoring scheme used to evaluate the patient's motor abilities
and progress over time.
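As a rough illustration of the idea (not the published model), if distraction time can be measured, the raw game score can be normalized by the fraction of attentive time so that motor ability is not under-estimated when the child is distracted. The formula below is an assumption for illustration only.

# Illustrative normalization only; not the model from the publication above.
def attention_corrected_score(raw_score: float,
                              attentive_time_s: float,
                              distracted_time_s: float) -> float:
    """Scale the raw score toward an estimate of the fully attentive score."""
    total = attentive_time_s + distracted_time_s
    if attentive_time_s <= 0 or total <= 0:
        raise ValueError("attentive time must be positive")
    return raw_score / (attentive_time_s / total)   # divide by % attention


# Example: 40 s attentive out of 60 s total turns a raw score of 300 into 450.
print(attention_corrected_score(300, 40, 20))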
Bubble game: matrix plot of % Attention, Starts, Attention Time, Distraction Time, and Score, by level (1-3).
Control and Visualization Interface (CVI)
Interactive games designed using the Kinect RGB-D sensor
• Low-cost sensor allows for real-time 3D skeletal tracking, gesture
recognition
• Game activities can be customized to encourage therapeutic exercises
as prescribed by therapists (see the sketch after this list)
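The sketch below illustrates how a prescribed exercise could be checked against Kinect skeletal data by counting arm-raise repetitions from per-frame joint positions. The joint dictionary stands in for the skeleton a Kinect SDK would provide; the joint names and lift threshold are assumptions.

# Hedged sketch: checking a therapist-prescribed arm-raise exercise against
# skeletal data. The per-frame `joints` dict stands in for the skeleton a
# Kinect SDK would provide; joint names and the lift threshold are assumptions.
def arm_raised(joints: dict, side: str = "right", min_lift_m: float = 0.20) -> bool:
    """Return True if the hand is at least min_lift_m above the shoulder."""
    hand_y = joints[f"{side}_hand"][1]          # (x, y, z) in meters, y is up
    shoulder_y = joints[f"{side}_shoulder"][1]
    return hand_y - shoulder_y >= min_lift_m


def count_repetitions(frames, side: str = "right") -> int:
    """Count raise-and-lower repetitions over a sequence of skeleton frames."""
    reps, raised = 0, False
    for joints in frames:
        up = arm_raised(joints, side)
        if up and not raised:
            reps += 1                           # rising edge: target height reached
        raised = up
    return reps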
Eye-Tracking Data
• Low-cost head-mounted eye-tracking devices map the user's gaze vector
to 3D points in space, which are clustered to identify objects of
interest (see the sketch below).
• Eye-tracking data is collected to measure user attention, which can
be utilized as a data modality in interactive games.
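The sketch below illustrates the gaze-to-object idea: 3D gaze points, obtained by intersecting the tracked gaze vector with the RGB-D point cloud, are clustered to propose objects of interest. The use of DBSCAN and its parameters are assumptions, not the project's actual pipeline.

# Illustrative sketch (not the project's pipeline): 3D gaze points, obtained by
# intersecting the tracked gaze vector with the RGB-D point cloud, are clustered
# to propose objects of interest. The DBSCAN parameters are assumptions.
import numpy as np
from sklearn.cluster import DBSCAN


def objects_of_interest(gaze_points_3d: np.ndarray,
                        eps_m: float = 0.05,
                        min_samples: int = 10):
    """Cluster 3D gaze points (N x 3, meters) and return cluster centroids."""
    labels = DBSCAN(eps=eps_m, min_samples=min_samples).fit_predict(gaze_points_3d)
    return [gaze_points_3d[labels == k].mean(axis=0)
            for k in sorted(set(labels)) if k != -1]   # label -1 marks noise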
"Multi-modal object of interest detection using eye gaze and RGB-D cameras," C. McMurrough, J. Rich, C.
Conly, V. Athitsos, and F. Makedon, in Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human
Machine Interaction - Gaze-In '12, 20121-6.
"Efficient Sparse Representation Using Adaptive Clustering." Soheil Shafiee, Farhad Kamangar, Vassilis
Athitsos, and Junzhou Huang, International Conference on Image Processing, Computer Vision, and Pattern
Recognition, July 2013.
"Emotion Detection via Discriminant Laplacian Embedding", Hua Wang, Heng Huang, Fillia
Makedon. International Journal Universal Access in the Information Society, Springer, pp. 1-9, 2013.
Website : http://heracleia.uta.edu/projects/cplay/
Data Analysis and Visualization
Eye tracking & object
detection device.
Point of Gaze and
surrounding environment.
Point Cloud Model.
Detected object.
CPLAY CVI is a remotely accessible, web-based interface that provides
different access rights to different user types. For example, it enables a
therapist to choose the rehabilitation plan, the type of rehabilitation
device or protocol used, and the game activity; view a visual summary and
analysis of the person's performance and behavior so far; search for how the
person performed before or after medication, or during different times of
the day; search for similar cases; get help from an expert in making
decisions; choose the set of human sensing data to collect; choose an avatar
type to enable remote therapy; and choose secure communication facilities.
For the patients/users of the system, CVI is a visual interface module that
can display examples of prescribed exercises, charts with past scores,
rehabilitation progress, etc. CVI will also provide access to
administrators/researchers to visualize, download, and share collected data,
and to upload new software modules of the system and new applications such
as games.
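A minimal sketch of the role-based access idea behind CVI is given below. The role names follow the description above, but the permission strings and the check function are illustrative assumptions rather than the deployed implementation.

# Minimal sketch of role-based access rights for CVI; the permission strings
# and this check function are illustrative assumptions.
PERMISSIONS = {
    "therapist": {"choose_rehab_plan", "choose_device_or_protocol", "choose_game",
                  "view_performance_summary", "search_similar_cases",
                  "configure_sensing_data", "choose_avatar", "remote_therapy"},
    "patient": {"view_prescribed_exercises", "view_scores", "view_progress"},
    "admin_researcher": {"visualize_data", "download_data", "share_data",
                         "upload_module", "upload_game"},
}


def can(role: str, action: str) -> bool:
    """Return True if the given user role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())


print(can("therapist", "choose_rehab_plan"))   # True
print(can("patient", "upload_module"))         # False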