Cornell Cup Mid-Review: Intel Autobot Project

Team: ASU TechPriests
Members: Garret Walliman, Samantha Axtell,
Riky Ringer, Hien Nguyen, Austin Noel
Sponsor: Dr. Yinong Chen
Arizona State University
Project Abstract
Our goal is to improve the robot’s existing services and
create new ones, so that the robot’s services can be exposed
on the web and connected to specially designed
“algorithm-centric” user interfaces through which
young students can easily program the robot.
Challenge Definition
The Problem:
Today’s best tools for introducing students to programming are
not good enough.
• Too much conceptual / technical overhead.
• Simple programming exercises are rarely interesting or enthralling.
• First programming assignments focus on the language rather than the algorithm.
Challenge Definition
The Solution:
Create a new, simple UI used to program a robot, which:
• Eliminates all technical / language overhead.
• Focuses solely on the algorithm design!
• Provides immediate, interesting feedback (by controlling a large robot).
The Intel Autobot: Hardware
The Intel Autobot: Software
The Autobot’s main functionality can be broken down into
various services:
• Motor / Motor Commands
• Sonar
• Directional Sensor
• Kinect Depth Tracker (our service!)
These services will be made available on the web and
accessed through a simulator and algorithm-centric UI.
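For illustration only, a client's view of that service split might look roughly like the sketch below; every class and method name here is a hypothetical placeholder, not the robot's actual DSS interface.

```python
# Hypothetical sketch of the Autobot's service decomposition as a
# client might see it once the services are exposed on the web.
# None of these names come from the actual robot code.

class MotorService:
    def drive(self, left_power: float, right_power: float) -> None:
        """Send a motor command (e.g. wheel power levels)."""

class SonarService:
    def read_distance_cm(self) -> float:
        """Return the latest sonar range reading."""

class DirectionalSensorService:
    def read_heading_degrees(self) -> float:
        """Return the robot's current heading."""

class KinectDepthTrackerService:
    def read_section_depths(self, sections: int = 3) -> list:
        """Return one averaged depth value per field-of-view section."""
```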
What is an Algorithm-Centric UI?
“An ‘algorithm-centric UI’ is a high-level programming
environment where the act of programming is reduced to
manipulating predefined conditions and outcomes to
accomplish some simple – although non-obvious – task”.
This UI will allow students to remotely create an algorithm to
control the robot.
It can be used either as a simulator or to control the actual
robot, which makes it flexible enough for classroom use.
http://venus.eas.asu.edu/WSRepository/RaaS/MazeNav/
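As a hypothetical illustration of "manipulating predefined conditions and outcomes", a student's program in such a UI might boil down to a small rule table like the one below; the condition and action names are invented for this sketch, not the UI's actual vocabulary.

```python
# Hypothetical "algorithm-centric" program: the student never writes
# syntax, they only pair predefined conditions with predefined
# outcomes.  The UI (or simulator) evaluates the rules in order and
# performs the first matching action.

rules = [
    # (condition,              action)
    ("wall_closer_than_50cm",  "turn_left"),
    ("no_wall_on_right",       "turn_right"),
    ("otherwise",              "go_forward"),
]

def step(sensors: dict) -> str:
    """Pick the action for the current sensor snapshot."""
    for condition, action in rules:
        if condition == "otherwise" or sensors.get(condition, False):
            return action
    return "stop"

# Example: the simulator reports a wall straight ahead.
print(step({"wall_closer_than_50cm": True}))   # -> "turn_left"
```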
What Have We Done?
1. Robot code cleanup – documented the API for the robot’s current code.
   1. Investigated and documented legacy code.
   2. Improved the legacy code we needed; jettisoned what we did not.
2. Developed the Kinect Depth Tracker Service.
   1. Added a Microsoft Kinect to the robot.
   2. Created a depth tracking service to improve pathfinding ability.
3. Developed a patrolling algorithm (see the sketch below).
   1. A simple wall-following maze algorithm utilizing the Kinect and sonar sensors.
   2. Developed as a proof-of-concept of the Kinect.
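A rough sketch of the kind of wall-following patrol loop described in item 3, assuming a right-side sonar reading and a forward Kinect depth value; the thresholds and function names are assumptions, not the robot's real API.

```python
# Hypothetical sketch of a right-hand wall-following patrol step.
# The sensor getters and motor commands are passed in as callables,
# standing in for the robot's real service calls.

SIDE_TARGET_CM = 40    # desired distance from the wall on the right
FRONT_LIMIT_CM = 60    # turn away before hitting a wall ahead

def patrol_step(get_right_sonar_cm, get_front_depth_cm,
                turn_left, turn_right, go_forward):
    """One iteration of the wall follower; call repeatedly."""
    if get_front_depth_cm() < FRONT_LIMIT_CM:
        turn_left()                    # wall ahead: turn away from it
    elif get_right_sonar_cm() > SIDE_TARGET_CM * 2:
        turn_right()                   # lost the wall: steer back toward it
    else:
        go_forward()                   # hug the wall and keep moving

# Example with fake sensor readings: a wall 30 cm ahead makes it turn left.
patrol_step(lambda: 45, lambda: 30,
            lambda: print("turn left"),
            lambda: print("turn right"),
            lambda: print("go forward"))
```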
Kinect Depth Tracker Service
As our first service, we created this to:
1. Improve the Autobot’s capability.
2. Familiarize ourselves with the robot and with SOA programming overall.
How does it work?
• The service splits the Kinect field-of-view into N sections (configurable).
• Each section’s depth values are continuously averaged.
• The averages are rounded and returned multiple times per second.
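A minimal sketch of that sectioning-and-averaging step, assuming a NumPy depth frame; the real service is a .NET/DSS component, so this Python version is only illustrative, and the frame shape and section count are assumptions.

```python
import numpy as np

def section_depths(depth_frame: np.ndarray, sections: int = 3) -> list:
    """Split a Kinect depth frame into N vertical sections and return
    one rounded average depth per section."""
    width = depth_frame.shape[1]
    averages = []
    for i in range(sections):
        left = i * width // sections
        right = (i + 1) * width // sections
        section = depth_frame[:, left:right]
        valid = section[section > 0]          # ignore pixels with no depth data
        averages.append(int(round(valid.mean())) if valid.size else 0)
    return averages

# Example with a fake 480x640 frame; the service would do this several
# times per second on live Kinect frames.
frame = np.random.randint(500, 4000, size=(480, 640))
print(section_depths(frame, sections=3))
```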
Kinect Depth Tracker output (screenshots): Far Right, Middle, Far Left
Current Project Status
1. The team has a thorough understanding of the robot’s code and functionality, and a
decent understanding of the SOA architecture / SOC paradigm.
2. The robot functions well mechanically, and its services are running bug-free.
3. The simulator / UI code has been written.
The team is now ready to proceed with linking the robot
to the UI.
Project Timeline: January - February
January:
• Convert the current DSS services to web services.
• Improve the existing Motor / Motor Commands / DepthTracker services.
• Set up Windows Server 2008 on the web server.
  • We will use the server to store the simulator UI.
February:
• Document how to use the server / SOAP.
  • Gather knowledge for linking the simulator with the robot.
• Clean up / improve the simulator code.
  • Gain a fuller understanding of how the simulator works.
• Write and test code for controlling the robot via web services (see the sketch below).
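A heavily hedged sketch of what controlling the robot over a SOAP web service might look like from the client side; the endpoint URL and operation name are placeholders, not the project's actual interface.

```python
# Hypothetical SOAP client sketch using the zeep library.
# The WSDL URL and the SetMotorSpeed operation are invented;
# the real names would come from the project's own WSDL.
from zeep import Client

client = Client("http://example.asu.edu/AutobotService?wsdl")
# Ask the (hypothetical) motor service to drive both wheels forward.
client.service.SetMotorSpeed(left=0.5, right=0.5)
```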
Project Timeline: March - April
March:
• Connect web services to simulator and test.
  • Critical step!
• Develop a small website to wrap the simulator in.
April:
• Develop new simulators with new tasks.
• Performance-test the simulator with students.
  • We hope to use our primary audience (elementary or middle school students) for this,
but if that is not possible we will use classmates.
Conclusion
We hope to use our project to fill a void in CS education:
• Simple introductory projects are not interesting.
• Interesting projects are too hard.
Our robot will aim to be both simple to use and highly
interesting to young students.
We have all but finished the hardware / service side of things.
We now need only to:
• Link up the UI with the robot.
• Create new simulators / algorithm UIs to use with the robot.
• Test!
Question and Answer