Team Waterloo Research Paper on SRR

Team Waterloo published a paper about their robot for the 2012 and 2013 NASA Sample Return Robot Centennial Challenges.
Mapping, Planning, and Sample Detection Strategies for Autonomous Exploration
This paper presents algorithmic advances and field trial results for autonomous exploration and proposes a solution to perform simultaneous localization and mapping (SLAM), complete coverage, and object detection without relying on GPS or magnetometer data. We demonstrate an integrated approach to the exploration problem, and we make specific contributions in terms of mapping, planning, and sample detection strategies that run in real-time on our custom platform. Field tests demonstrate reliable performance for each of these three main components of the system individually, and high-fidelity simulation based on recorded data playback demonstrates the viability of the complete solution as applied to the 2013 NASA Sample Return Robot Challenge.
The paper appears in the Journal of Field Robotics in the Wiley Online Library. I could, and probably will, spend a lot of time with the back issues of the Journal.

I was blown away by the paper. Calling it a paper is misleading: it is 32 pages of detail on mapping, planning, and vision processing. In all my web searches I have not found anything that compares to this material. It is intensive and extensive.

Can you tell I liked it? It is just incredible. Reading it boosted my ego for attempting the competition. Everyone who competes in the SRR should give themselves a big smile of congratulations the next time they are in front of a mirror.

This work is not just abstract. It provided tangible results during the competition.

They were the first team to successfully collect the Phase 1 pre-cached sample and return it to the starting platform. This was on the second Phase 1 day of the 2013 SRR. The only thing they missed was stopping in time: the sample ended up just beyond the rear of the platform, so it did not qualify. (Fortunately, in one sense, they are a demonstration team since they are Canadian. It would have been heartbreaking to miss by, in "honour" of Canada, centimeters and not collect the $5000 from NASA.) It is kind of amusing that, for all the work reported in the paper, they did not report on how the robot determined when to stop.

The robot was guided by a suitcase placed on the Home Beacon. The suitcase was a clever idea, since it added no extra weight to transport and they would have a number of them available. If the suitcase had been positioned just a few centimeters closer to the platform, the sample would have been properly placed.

Excellent work all-around by Team Waterloo.
