
Programming Languages Used for SRR

I asked at the SRR Challenge about the languages and vision processing used by each team. Here is what I found:

Team           Language                   Vision Processing
Intrepid       C++ / MATLAB
Kuukulgur      C++                        OpenCV
Mystic         C++                        RoboRealm
SpacePride     RoboRealm state machine    RoboRealm
Survey         C++, Python                OpenCV
Middleman      LabVIEW                    LabVIEW
UCSC           C/C++                      OpenCV
Waterloo       C++, Python
WPI            C++
Wunderkammer   Python                     ROS vision packages
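
As the table shows, the most common vision pairing was C++ or Python with OpenCV. Purely as an illustration (my own sketch, not any team's actual code), a minimal color-threshold sample detector in Python with OpenCV might look something like this:

```python
# Hypothetical sketch: find a brightly colored sample in a camera frame
# by HSV color thresholding. Illustrative only; not any team's actual code.
import cv2
import numpy as np

def find_sample(frame_bgr):
    """Return the (x, y) pixel center of the largest matching blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Placeholder hue range; a real detector would be tuned to the sample's color.
    lower = np.array([20, 100, 100])
    upper = np.array([35, 255, 255])
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small speckles before looking for blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    # OpenCV 4.x returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # any camera works for trying this out
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(find_sample(frame))
```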

Here is a rough synopsis of how the teams fared:

Team Intrepid was the first to leave and return to the platform. The rover thought it had picked up the sample but actually had not.

Team Kuukulgur (the name means Moon or Lunar Rover), a demonstration team from Estonia, was the first to pick up the sample but did not make it back to the starting platform. They had the slickest-looking robots, but then three of the team members are mechanical engineers. They brought a swarm of four rovers, but one failed, so only three took the field.

Team Waterloo, a demonstration team from Canada, also picked up the sample and was the first to return it to the starting platform, but the sample ended up just outside the 1.5-meter square area. That did not hurt them financially since, as a demonstration team, they were ineligible for the NASA prize money. They did receive $500 from WPI for picking up the sample.

Team Survey won the Phase I competition this year and will take home $6,000 for that effort. ($5,000 from NASA and $1,000 from WPI.)

Team Mystic Lake, my own entry, did not do that well, but I consider it a "building year," to borrow from sports terminology. Mystic Two traveled the farthest distance of any robot in the SRR to date. It just kept trekking. I proved out much of my code and the ability of my very small rovers to handle the terrain.

SpacePride fielded two rovers but was unable to accomplish much. Their software developer dropped out near the end, so they had to scramble to get something working with a state machine in RoboRealm.

I will update the table if more information becomes available.

Just after I returned from the challenge, an email on a robotics mailing list asked for advice on which languages to use for robots. Since I had almost all of the information posted above, I put it into a reply and received a nice thanks in return. Hopefully someone else will find it interesting.

(Updated UCSC - UC Santa Cruz from comment. Thanks.)
(Updated Middleman aka RoBear from comment. Thanks.)


