25 January 2010

Create - Initialization Processing

Developing software is as much a research project as an engineering process. In part this is because a developer is not usually an expert in the problem domain, say, accounting. The developer thus has to learn a fair amount about the domain in order to proceed, and some of that learning only happens as the project proceeds. You have to learn what you don't know.

The first lesson is that the Create doesn't provide unswitched power sufficient to run the Fit PC Slim. If the Create is turned on there is sufficient power, but that means the Slim can't turn the Create's power off, because the Slim would lose power too.

Ideally the Create can stay on all the time. It has a docking station - its home base - for recharging. The built-in processing can find the base and drive onto it to charge, or my software could replicate that processing.

The first minor glitch is that the Create stops responding to commands from the Slim when it docks. It turns out there is an undocumented soft reset command (a '7') that puts it back into a mode where it will accept commands. Once back in this mode the Slim can monitor the charging process and determine when it is safe to leave the dock.
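Sending that reset is just a one-byte write on the serial port. Here is a minimal POSIX-style sketch, assuming the command is the raw byte value 7 rather than the ASCII character, and using the Create's default 57600 baud; the device path is only an example.

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

int main() {
    // Device path is an example; the Slim's port may differ.
    int fd = open("/dev/ttyS0", O_RDWR | O_NOCTTY);
    if (fd < 0) return 1;

    termios tio{};
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);                 // raw 8N1, no line processing
    cfsetispeed(&tio, B57600);       // Create's default baud rate
    cfsetospeed(&tio, B57600);
    tcsetattr(fd, TCSANOW, &tio);

    unsigned char resetCmd = 7;      // undocumented soft reset opcode
    write(fd, &resetCmd, 1);
    close(fd);
    return 0;
}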

While the Create is charging is a good time to do work on the interface board or the Slim. But what happens when you reconnect everything to a charging Create?

The (re)learned lesson is that startup and shutdown processing are often the most challenging parts of a software project. Once everything is up and running a software process is usually straightforward, albeit with a lot of details to chase. You just nibble away at them one at a time until they are all resolved. Then the process is just doing the same thing over and over again.

Startup is a big discontinuity. What was the robot doing before it shutdown? What has changed since then? What is the current state now?

This all was triggered when I realized the robot had to determine whether it was charging or in the wild when it started up. It takes different actions depending on where it is: if docked, it continues until charged, backs off the dock, and switches to the "wild" mode of operation.
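Here is a C++ sketch of that startup check. The serial helpers (sendByte, readByte) are hypothetical stand-ins for my routines, and the Open Interface opcode (142, Sensors) and packet ID (34, charging sources available) come from my reading of the spec, so treat them as assumptions to verify.

#include <cstdio>

// Hypothetical serial helpers - stand-ins for the real routines.
void sendByte(unsigned char) {}               // stub
unsigned char readByte() { return 0x02; }     // stub: pretend we're docked

enum StartMode { CHARGING, WILD };

// Decide at startup whether the robot woke up on the dock or in the wild.
StartMode determineStartMode() {
    sendByte(142);    // Sensors command, per my reading of the OI spec
    sendByte(34);     // packet 34: charging sources available (assumed)
    unsigned char sources = readByte();
    bool onHomeBase = (sources & 0x02) != 0;  // bit 1: home base present
    return onHomeBase ? CHARGING : WILD;
}

int main() {
    std::printf("start mode: %s\n",
                determineStartMode() == CHARGING ? "charging" : "wild");
    return 0;
}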

Okay, who said, "Rud, as an experienced developer you should have dealt with this already." Guilty as charged, but this has been a casual, hobby project up until now. This was my wakeup call to start applying a more formal approach and thinking through some of these issues. I'm still not going to go fully formal since I'm more interested in having fun. So don't expect a six-week hiatus while I produce a formal analysis and design. I am going to do some of my thinking in blog posts, using them as my documentation.

Android - Not Abandoned

I'm not abandoning the Android work. I still want to get Galactic Guardian on the market in both a free lite version and a paid version. I'll just bounce between the robot and Android projects as the spirit moves me.

The report from the ADC2 was that 50% of the testers in phase one liked the game, so it seems well worth the effort to submit it to the market.

Shifting Gears - iRobot Create

I'm shifting gears to robotics. A while ago I got an iRobot Create. It's basically a Roomba vacuum cleaner with the guts removed to make a cargo area. In this area is a 25-pin connector that provides power, a TTL serial port, and digital and analog I/O.

I also got a Command Module (CM), which fits onto the connector. The CM is an ATmega168 processor that adds some additional I/O. It can be programmed to control the Create. I did so and basically reproduced the wandering behavior of the Create: it moves around, bumps into things, and turns away from what it hit. I added some additional behaviors, such as extracting itself if it got trapped, i.e., stuck in the same place for 10 seconds.

I want to do more with robots, such as entering a RoboMagellan contest. That requires an outdoor-capable robot that does a lot more than bump into things. A key component to me is vision. Maybe I could do that with the CM and another processor (like the CMUCam) but I really didn't want to learn YAPE (yet another programming environment).

Around the time I got to thinking seriously about this I looked at ITX boards. Then the Fit PC computers became available, specifically the Fit PC Slim. The PC form factor and wireless sold me on trying to use it. The one drawback might be the processor speed when trying to do vision processing. That is acceptable because the Create with the Slim is a testbed for RoboMagellan, where an entirely new, slightly larger platform will be used. By going with a PC as the base there are a large number of possibilities, including laptops and netbooks. If the processor is too slow for vision, the Create simply won't move as quickly or smoothly.

I have the Slim hooked up to the Create, drawing power, and running most of the behaviors previously implemented with the CM. Once I got the basic threading, serial communications, and Create interface working, the behaviors started working within minutes since they ported easily from the CM versions. All the code is C++. The threading and serial port routines are all from previous projects, so it all came together with a few days' work.

19 January 2010

Subsumption Architecture - Introduction

The brain of a robot is the software. The software has to take in the sensor data, interpret it, and generate commands to the actuators. One architecture for robot software is called subsumption. It came out of MIT and Prof. Rodney Brooks, a founder of iRobot, which makes my Create robot. The idea is to break the robot's activities into small pieces. Let me build up the concept by example.

A fundamental activity of a robot is to cruise around. If nothing else is to be done, just let the robot drive straight ahead. So we create an activity called Drive. It simply actuates the motors to go straight at a safe speed. It is easy to write and test.

After driving straight ahead for a while the robot bumps into something. And continues to try to go straight ahead. This is not good for the robot, or for the cat or furniture it bumped into. So we write a Bump activity using the sensors on the robot - the Create has a right and a left bump sensor on the front that wrap around a bit toward the sides. Bump determines whether a bump sensor was triggered; if so, the robot should stop.

How do Bump and Drive get put together in the software? This is the subsumption architecture part. Initially it's easy. First call Bump. If it doesn't activate, call Drive. If Bump does activate, don't call Drive.

Let this run. The robot goes merrily off, bumps into a cat, and stops. The cat gets up, the robot continues straight ahead, hits a chair, stops, and stops and stops and stops. Not very interesting. What we'd like is for the robot to back up a little bit, turn, and then continue straight ahead. Hopefully that will clear the obstacle, and it will if the robot just brushed a wall. But even if it doesn't, repeating that behavior will eventually turn the robot toward an open area. (Well, many times it will. More later...)

This new behavior is somewhat different from Bump and Drive because we want it to do the back up and turn without interruption. The term used for this is a ballistic behavior. Do we add this to Bump or Drive, or create a new behavior? The texts I've read added it to Bump. But based on my experience I created a new behavior called Flee.

Flee works with what could be called an internal sensor. This internal sensor tells Flee how much to back up and turn. So Bump sets this internal sensor to back up a little bit (40 mm) and turn a little (20 degrees). Since Bump can tell whether the left or right bump sensor (or both) was hit, it also sets the direction of the turn so the robot will turn away from the bump.

Now the activities are called in the order: Flee, Bump, Drive. Remember that if Flee is active the later activities aren't called, and if Bump is active, Drive is not called. So the robot Drives ahead, Bumps into something, the internal flee sensor is set, and Flee backs up and turns the robot. Then, with both Flee and Bump inactive, Drive engages and the robot moves ahead.

Just for completeness, I have another activity called Trapped. It is added between Flee and Bump. Every time Trapped is called it records the time and distance moved. If the robot has not moved very far (80 mm) in a certain period of time (10 seconds) then Trapped sets the flee sensor to back up a little bit and turn 180 degrees. The idea is that by turning 180 degrees the robot can get out of a bad situation, such as the legs of a rolling desk chair, or a corner. With these behaviors my Create wanders around the house pretty well.

The actual implementation needs a couple more details.
Here is some pseudo code:

preempt = false;
preempt = Flee(preempt);
preempt = Trapped(preempt);
preempt = Bump(preempt);
preempt = Drive(preempt);

The variable preempt is set to true by an activity if it is active. If Bump senses a bump it sets preempt. When Drive sees preempt is set it does not activate, i.e. does not drive forward. If Bump sees preempt is set it does not bother checking the bump sensors, because presumably Flee is now handling the bump condition.

Why bother calling activities at all if they are preempted? Look back at how Trapped works. It is monitoring the distance traveled. If it is not called because Flee is active... And there I'm going to let it hang, because I don't remember why. But I'm going to publish this now, leave it as is, and resume in another posting. This is software development as it is, folks. (A fuller C++ sketch of the loop appears after the list below.) There are a few possibilities here:
  • I simply don't recall the reason so have to remember it or rethink it. That is why you should document things.
  • There was a valid reason that is no longer valid. Boy, that happens all the time in development. A good habit to develop is to revisit assumptions regularly to see how they've changed.
  • I simply blew it when writing the code many months ago.
  • ...or some totally different situation that I can't think of right now.
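
To make the arbitration concrete, here is a minimal C++ sketch of the whole loop, reconstructed from the description above rather than copied from my actual code. The hardware calls (backUp, turn, driveStraight, bumpLeft, bumpRight, distanceMoved) are hypothetical stand-ins for my serial-port routines, and I've written Trapped so it updates its bookkeeping even when preempted, which is one plausible reading of the question above, not necessarily the original reason.

#include <cstdio>
#include <cstdlib>
#include <ctime>

// Hypothetical stand-ins for the real serial-port motor/sensor calls.
void backUp(int mm)   { std::printf("back up %d mm\n", mm); }
void turn(int deg)    { std::printf("turn %d deg\n", deg); }
void driveStraight()  { std::printf("drive straight\n"); }
bool bumpLeft()       { return false; }  // stub
bool bumpRight()      { return false; }  // stub
int  distanceMoved()  { return 0; }      // stub odometry, total mm

// The "internal sensor" that Bump and Trapped set for Flee.
struct FleeRequest {
    bool active;
    int  backupMm;
    int  turnDeg;    // sign selects the turn direction
};
FleeRequest fleeRequest = { false, 0, 0 };

bool Flee(bool preempt) {
    if (preempt || !fleeRequest.active) return preempt;
    backUp(fleeRequest.backupMm);  // ballistic: runs to completion
    turn(fleeRequest.turnDeg);     // (blocking here for simplicity)
    fleeRequest.active = false;
    return true;
}

bool Trapped(bool preempt) {
    static std::time_t lastCheck = std::time(nullptr);
    static int lastDistance = distanceMoved();
    // Bookkeeping happens even when preempted - this is why preempted
    // activities still get called in this sketch.
    std::time_t now = std::time(nullptr);
    if (now - lastCheck < 10) return preempt;     // 10 second window
    int moved = std::abs(distanceMoved() - lastDistance);
    lastCheck = now;
    lastDistance = distanceMoved();
    if (!preempt && moved < 80) {                 // barely moved: escape
        fleeRequest.active = true;
        fleeRequest.backupMm = 40;
        fleeRequest.turnDeg = 180;
        return true;
    }
    return preempt;
}

bool Bump(bool preempt) {
    if (preempt) return preempt;
    bool left = bumpLeft();
    bool right = bumpRight();
    if (!left && !right) return preempt;
    fleeRequest.active = true;
    fleeRequest.backupMm = 40;                // back up a little
    fleeRequest.turnDeg = left ? -20 : 20;    // turn away from the hit side
    return true;
}

bool Drive(bool preempt) {
    if (!preempt) driveStraight();
    return preempt;
}

int main() {
    for (;;) {   // one arbitration pass per control cycle
        bool preempt = false;
        preempt = Flee(preempt);
        preempt = Trapped(preempt);
        preempt = Bump(preempt);
        preempt = Drive(preempt);
    }
}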

Those are the basics of subsumption, though. A good book on robot programming that covers subsumption is "Robot Programming: A Practical Guide to Behavior-Based Robotics" by Joseph L. Jones.

...sine die

03 September 2009

ADC2 - Package Name

One of the last minute requirements for submission to ADC2 was that the Java package name be different from the name used in the Android Market. My package originally was com.mysticlake.galacticguardian.

As you know I'm not a Java guru so I wondered what I could do that would be easy. I finally changed the 'com' to 'adc2'. Eclipse readily made the change and all was good.

Except for version control. I use Subversion. The repository lives on my laptop computer. (Then I always have everything with me.) Subversion did not care for changing the top directory name. For all my years of development experience I've not worked with version control systems except in the most rudimentary fashion. Guess I need to figure out branching and merging.

Game Submitted to ADC2 - Galactic Guardian: Zap GPS

I finished my game and submitted it 6 hours before the deadline of 31 Aug 11:59:59 PT. No death march to the last minute here!

It was a bit of a crunch because a week ago my throat started feeling gluncky and by Wednesday I knew I was not well. (Or at least less well than usual.) Today is the first day I feel okay. But I persevered. My wife, Shari, was really supportive, taking over the cooking and a couple of other household tasks I usually perform. I am retired but she still works as a college professor. Fortunately, it was the first week of classes so she didn't have much grading or other work to do. Not that cooking was a big deal, because all I wanted were BLTs and cereal most of the time. We did manage a Crosby, Stills and Nash concert last Friday evening, which was a good time away from everything.

You can see the material that appears in the game Info at Mystic Lake Software. I need to do more with the web site but wanted to get that page up quickly. I'll be using the web page for describing the game for users and, possibly, some of my thinking about the game itself. In this blog I'll discuss the technical development. There is bound to be some overlap but I'll keep that to a minimum.

The game came together pretty well. I went to the doctor Thursday morning for my annual physical, scheduled a few weeks ago, and took the Android with me. While waiting in the examining room I played the game and found it more interesting than I had realized. Of course, all the time I had played it previously I was watching for problems or checking whether something I'd just implemented worked. That was the first time I'd just played it.

The weekend before, my granddaughter Meg had her 10th birthday. I showed her the game as it was then and she found it interesting. She went over to my mother and described it to her quite excitedly. That was encouraging.

I doubt that the game will have enough pizazz for the developers challenge, but you never know. If I didn't enter it I surely couldn't win, so I might as well try. Plus it gave me a deadline to work toward and an explanation to others about why I was busy.

Next task is to make some minor changes and submit it to the Android Market so it will be directly accessible to all, not just those who will judge. Then on to making some improvements... and maybe making it a paid application.

20 July 2009

Sensor - Accelerometer & Magnetics II

A quick update on the code I presented in the previous post.

I'm not greatly experienced in Java, having mainly worked with C++ for the last 15-plus years. I did have a company-provided seminar on Java and have a half-dozen Java books on the shelf. But I'm still on the learning curve.

In the code I obtained the magnetic, accelerometer, and orientation sensor arrays by simply assigning their values to another array. I then used those arrays across subsequent calls to onSensorChanged. That is not the proper way of doing this: I should have cloned the arrays, which makes a copy, because otherwise my variable still references the framework's array, which gets reused. For example:


case Sensor.TYPE_MAGNETIC_FIELD:
mags = s_ev.values;
isReady = true;
break;


should be:

case Sensor.TYPE_MAGNETIC_FIELD:
mags = s_ev.values.clone();
isReady = true;
break;


I found this when I output all the sensor values to LogCat each time any one of them was updated. I noticed that the values for the accelerometer changed from their previous values when only the orientation sensor had updated.
