Subsumption Architecture - Introduction II

The last post left off with me wondering why the Trapped behavior needed to be called even when a higher-priority behavior was active. Here is why:

Remember that I ported the code from the Command Module (CM) version to Windows on the Fit PC Slim? Windows has a clock; the CM does not. In the newer version I can set up a timeout by taking the current clock time and adding a delay. When the clock exceeds that value, the timer has expired. For Trapped this means the robot has not moved for over 10 seconds, despite other behaviors trying to make it move.

In the CM version, with no clock available, Trapped itself had to update its own "clock" value on every pass through the loop in order to tell whether the timeout had been exceeded. That is why it had to be called even when a higher-priority behavior was active.
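The difference between the two versions can be sketched roughly like this. This is only an illustration, not the actual robot code; the function and variable names are my own, and I'm assuming a fixed loop period on the CM side:

```c
#include <stdbool.h>
#include <time.h>

/* Windows / Fit PC version: a real clock exists, so the timeout is
 * armed once and later compared against the current time.  Nothing
 * needs to run in between. */
static time_t trapped_deadline;

void arm_trapped_timer(int seconds) {
    trapped_deadline = time(NULL) + seconds;   /* now + delay */
}

bool trapped_expired(void) {
    return time(NULL) >= trapped_deadline;     /* has the clock passed it? */
}

/* Command Module version: no clock, so Trapped must be called on every
 * pass through the behavior loop to advance its own tick counter --
 * even while a higher-priority behavior has control of the robot. */
static unsigned int trapped_ticks;

void trapped_tick(unsigned int loop_ms, unsigned int timeout_ms,
                  bool *expired) {
    trapped_ticks += loop_ms;                  /* maintain our own "clock" */
    *expired = (trapped_ticks >= timeout_ms);
}
```

In the clock version, skipping Trapped while another behavior is active costs nothing; in the tick-counter version, skipping it silently stops the "clock", which is exactly the bug described above.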

I'd say this falls under checking your assumptions. In the CM the assumption was that no clock existed so Trapped had to maintain its own. I didn't catch that the assumption was no longer valid in the Windows version.

I've been re-reading many of the papers on subsumption, along with the Jones book Robot Programming. I find that my implementation isn't really subsumption as originally presented by Rodney Brooks. Brooks is credited with the concept of programming robots by behaviors; his technique for implementing behaviors was subsumption.

My implementation is more in line with Jones, and also derives from David Anderson. David has an article about two of his robots, SR04 and jBot, which is accessible from the Dallas Personal Robotics Group.

I continue the discussion in a later post. I also want to take some time to reabsorb the subsumption and behavior programming concepts so I can elaborate on them.
