01 February 2014

Science Fair and Texas Torque, FIRST Robotics World Champions

I spent today at a local science fair event. There are more of them spread over the next few weekends. Today's had an elementary school festival and science / math bowl competitions. Next week is the Junior High and, the week after, the Senior High Engineering Design Competition. Both of those involve building robots.

There was also a robotics demonstration today and I volunteered for it, naturally. The main activity was the Texas Torque team demonstrating their robot, which won last year's FIRST World Championship. The team was also one of those leading the Macy's Thanksgiving Day parade last November. Last year's challenge was to throw Frisbees through rectangular openings at one end of the arena.

Not only is the robot impressive but so are the team members. I spoke with Robert and Matthew, mainly, but a couple others approached me to see if I had questions. This competition is for high-school students and, to be blunt, you have to be impressed by their poise and ability to engage with adults and children about their team and robot.

I also spoke with one of the fathers, and he related how these young people had to learn to work together. They are all very smart and quite used to having the correct answers. Obviously, when building a robot like this, all of their answers cannot be correct, so a big part of the learning experience is compromising and considering other people's ideas.

Participating in this team provided experience in teamwork that most students only get through organized sports in high school. Unfortunately, in Texas that also comes with adulation that is not always beneficial. This team experience is more beneficial, I believe.

Now on to the robot...

24 January 2014

Team Waterloo Research Paper on SRR

Team Waterloo has published a paper about their work on a robot for the 2012 and 2013 NASA Sample Return Robot Centennial Challenges:
Mapping, Planning, and Sample Detection Strategies for Autonomous Exploration
This paper presents algorithmic advances and field trial results for autonomous exploration and proposes a solution to perform simultaneous localization and mapping (SLAM), complete coverage, and object detection without relying on GPS or magnetometer data. We demonstrate an integrated approach to the exploration problem, and we make specific contributions in terms of mapping, planning, and sample detection strategies that run in real-time on our custom platform. Field tests demonstrate reliable performance for each of these three main components of the system individually, and high-fidelity simulation based on recorded data playback demonstrates the viability of the complete solution as applied to the 2013 NASA Sample Return Robot Challenge.
The paper is in the Journal of Field Robotics in the Wiley Online Library. I could, and probably will, spend a lot of time with the back issues of the Journal.

14 January 2014

Accelerating SRR Development While Gyrating Wildly

Decoding the title, I am experimenting with the Phidgets 1042 spatial sensor, also known as an Inertial Measurement Unit (IMU). This IMU contains an accelerometer, a gyroscope, and a compass. The compass is not allowed for competing in the SRR so it is being ignored.

I worked with this IMU for the 2013 SRR but could not get the results I needed, so I put it aside. Since the first of the year, as I get more serious about the 2014 SRR, I have been working with it again.

As I did last year, I began working with code I found that fuses the accelerometer and gyroscope data into a single reading of the robot's global pose. The results never came out correct. The main problem was that the bearing reading, which is based primarily on the gyroscope data, was inaccurate. I set up a servo to rotate the IMU through 90 degrees (or a fairly close approximation), but the results usually came out lower, somewhere in the mid-80 degree range.
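For reference, a common way to do this kind of fusion is a complementary filter: trust the gyro over short intervals and let the accelerometer's gravity reading correct the drift over time. The sketch below only illustrates the idea, it is not the code I found; the function name, the 0.98 blend factor, and the sample values are arbitrary choices, and the Phidgets-specific reading code is left out.

import math

ALPHA = 0.98  # trust the gyro short-term, the accelerometer long-term

def update_attitude(roll, pitch, accel, gyro, dt):
    """Fuse one accelerometer/gyroscope sample into roll and pitch (degrees)."""
    ax, ay, az = accel            # accelerations in g
    gx, gy, gz = gyro             # angular rates in degrees/second

    # Attitude implied by gravity alone: noisy but does not drift.
    accel_roll = math.degrees(math.atan2(ay, az))
    accel_pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

    # Attitude from integrating the gyro rates: smooth but drifts over time.
    gyro_roll = roll + gx * dt
    gyro_pitch = pitch + gy * dt

    # Blend the two estimates.
    new_roll = ALPHA * gyro_roll + (1.0 - ALPHA) * accel_roll
    new_pitch = ALPHA * gyro_pitch + (1.0 - ALPHA) * accel_pitch
    return new_roll, new_pitch

# One sample with made-up numbers: flat and level, rotating slowly about z.
roll, pitch = update_attitude(0.0, 0.0, (0.0, 0.0, 1.0), (0.0, 0.0, 5.0), 0.01)

Note that gravity says nothing about bearing, so without the compass the yaw estimate comes from integrating the z-axis gyro alone. That is why an error there shows up so plainly.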

After fussing with the code I decided to try a really basic test. First, some background information.

A gyroscope of this nature reports the rate of angular change over a period of time. Specifically, this IMU reports degrees per second and the amount of time between readings. Multiplying the rate by the elapsed time gives the actual rotational movement for that period. Summing those results provides the current angular position of the IMU. Thus:
\[
\theta_{t} = \sum \omega_{t}\, dt
\]
where \( \theta_{t} \) is the angular position at time \( t \) and \( \omega_{t} \) is the angular rate reported by the gyro.
The test is simple. Set up the servo to rotate 90 degrees with the IMU lying flat. The z-axis (up and down) of the gyro should then report a total rotation of 90 degrees, following the equation above.
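Here is a minimal sketch of that integration in code, assuming the gyro samples arrive as (omega_z, dt) pairs with omega_z in degrees per second and dt in seconds; the sweep values are made up for illustration.

def integrate_yaw(samples):
    """Sum omega * dt over the sweep; returns total z-axis rotation in degrees."""
    theta = 0.0
    for omega_z, dt in samples:
        theta += omega_z * dt
    return theta

# A 90 degree sweep at a steady 90 deg/s, sampled every 4 ms for one second,
# should integrate back to 90 degrees.
sweep = [(90.0, 0.004)] * 250
print(integrate_yaw(sweep))  # prints approximately 90.0

If the hardware consistently reports a total in the mid-80s for a true 90 degree sweep, that points at the readings or their timing rather than the arithmetic.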

14 December 2013

Revising the SRR Web Material

I have been fighting fatigue since the June SRR competition. I finally seem to be overcoming it with some medications, better sleep habits, and who knows what else that may be helping. As a result I am reworking and rethinking the Sample Return Challenge material on my website. New material is under the 2014 Table of Contents. Pages I am still working on have WIP (work in progress) in their title. Comments and suggestions are appreciated here, via email, or on Facebook.

Happy holidays to all of you. May the robot of your desires be under the tree.

07 August 2013

Programming Languages Used for SRR

I asked at the SRR Challenge about the languages and vision processing used by each team. Here is what I found:

Team            Language                    Vision Processing
Intrepid        C++ / Matlab
Kuukulgur       C++                         OpenCV
Mystic          C++                         RoboRealm
SpacePride      RoboRealm state machine     RoboRealm
Survey          C++, Python                 OpenCV
Middleman       LabVIEW                     LabVIEW
UCSC            C/C++                       OpenCV
Waterloo        C++, Python
WPI             C++
Wunderkammer    Python                      ROS vision packages

Here is a rough synopsis of how the teams fared:

Team Intrepid was the first to leave and return to the platform. It thought it picked up the sample but actually did not.

Team Kuukulgur (it means Moon or Lunar Rover), a demonstration team from Estonia, was the first to pick up the sample but did not make it back to the starting platform. They had the slickest-looking robots, but then three of the team members are mechanical engineers. They brought a swarm of four robots, but one failed, so only three took the field.

Team Waterloo, a demonstration team from Canada, also picked up the sample and was the first to return it to the starting platform, but the sample landed just outside the 1.5 meter square area. That did not hurt them financially since, as a demonstration team, they were ineligible for the NASA money. They did receive $500 from WPI for picking up the sample.

Team Survey won the Phase I competition this year and will take home $6,000 for that effort. ($5,000 from NASA and $1,000 from WPI.)

Team Mystic Lake, which is just me, did not do that well, but I consider it a "building year", to borrow from sports terminology. Mystic Two traveled the furthest distance of any robot in the SRR to date. It just kept trekking. I proved out much of my code and the ability of my very small rovers to handle the terrain.

SpacePride fielded two rovers but was unable to accomplish much. Their software developer dropped out near the end, so they had to scramble to get something working via a state machine in RoboRealm.

I will update the table if more information becomes available.

Just after I returned from the challenge, an email on a robotics mailing list asked for advice on languages to use for robots. Since I had almost all of the information posted above I put it into a reply and received a nice thanks in return. Hopefully someone will find this interesting.

(Updated UCSC - UC Santa Cruz from comment. Thanks.)
(Updated Middleman aka RoBear from comment. Thanks.)



18 June 2013

Linux Sucks

One of the to-do items from the SRR challenge was to learn Linux so I could investigate the Robot Operating System (ROS) and then use OpenCV for vision processing.

Linux does not make this easy and I am rapidly getting frustrated with it as I always do when I try it.

I downloaded and installed Debian Wheezy to get the latest version. It is up and running.

I used the package manager to install Eclipse CDT. Except it is a version of Indigo (3.8), not Juno. Grump, grump, grump. I tried to update Eclipse the other day on another box. There are NO clear instructions on the web for how to install the new version over the old one and get it into the menu system.

At the same time I wanted to install Mozilla Thunderbird for email. Nope, it is not in the package manager, and the download from the website is an archive, not a package. Grump, grump, grump.

Google Chrome downloaded okay and with only one false start I got the package manager to install it.

Even if someone walked me through getting those applications installed, I shudder to think of what it is going to take to (1) get ROS and OpenCV set up and (2) get Debian running on all the rovers' PCs and running my code.

17 June 2013

Rover Names

When I got the 4 computers for the rovers I still had not come up with good names for them. While loading Windows XP I simply called the first one Mystic One. That led to the others being Two, Three, and Four. The more I used that name, the more I liked it, and I started referring to them collectively as The Mystics. That is their name now and I will use it for the team name next year: Team Mystic.

The name obviously comes from Mystic Lake Software, my DBA. After Shari and I decided I was retired - I promised to have a warm supper on the table for her every night - I wanted to create a DBA just in case I did pick up some kind of work. Behind our house - and across a street - is a small park with a lake - Mystic Lake. That felt like a good name, so I used it, simply adding "Software". It conveys that aura of mystery that pervades software.
