Robot Components

Time to explain the components of the robot a bit more. The diagram provides an overview.

The main platform is the iRobot Create. It is an autonomous robot by itself, but it also allows external control through a serial port connection using a protocol called the Open Interface (OI). Through the OI a program can read the Create's sensors and control its actuators.
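To make the OI a bit more concrete, here is a minimal sketch of driving the Create and reading a bump sensor from a script. It assumes pyserial, that the USB serial adapter shows up as /dev/ttyUSB0, and the Create's default 57600 baud; the opcodes used (128 Start, 131 Safe, 137 Drive, 142 Sensors) and the bump bit layout are as I read them in the OI specification, so treat the details as a sketch rather than tested code.

import serial
import struct
import time

# Open the OI serial link (USB-serial adapter on the Interface Board).
port = serial.Serial("/dev/ttyUSB0", baudrate=57600, timeout=1)

def send(*data):
    port.write(bytes(data))

send(128)                  # Start: put the OI into Passive mode
send(131)                  # Safe: allow actuator control with safety checks
time.sleep(0.2)

# Drive straight at 200 mm/s: velocity and radius are 16-bit big-endian;
# a radius of 0x8000 means "drive straight".
send(137, *struct.pack(">hH", 200, 0x8000))
time.sleep(2)
send(137, *struct.pack(">hH", 0, 0x8000))    # stop

# Ask for sensor packet 7 (bumps and wheel drops) and read the single byte.
send(142, 7)
reply = port.read(1)
if reply:
    bumps = reply[0]
    print("bump left:", bool(bumps & 0x02), "bump right:", bool(bumps & 0x01))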

The Fit PC Slim is a compact, low-power PC with 3 USB ports and WiFi, plus the usual PC components. It is powered from the Create through a voltage regulator on the Interface Board (IB). The IB also carries the USB interfaces for the serial port and I2C.

I2C is a standard 2-wire bus for controlling actuators and reading sensors. I'm not totally sure what is going to be on the bus. I expect a compass module, at least, to provide orientation. I have sonar and IR distance sensors working on I2C but am not sure which to use. These would serve as a backup to vision processing for detecting obstacles, since a main goal is for the robot to move around without bumping into things. I also have a digital I/O board that could be used to drive LED indicators showing what the robot is doing.
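For the sensor side, here is a sketch of what polling a compass heading over I2C from Linux on the Slim might look like. It assumes the python-smbus bindings; the address (0x21), the command byte, and the tenths-of-a-degree heading format are placeholders loosely modeled on common hobby compass modules, so check the data sheet of whatever module actually ends up on the bus.

import time
import smbus

COMPASS_ADDR = 0x21     # placeholder 7-bit I2C address
GET_HEADING  = 0x41     # placeholder "get heading" command byte

# The bus number depends on how the USB-I2C adapter is exposed by its driver.
bus = smbus.SMBus(0)

def read_heading():
    bus.write_byte(COMPASS_ADDR, GET_HEADING)   # ask for a measurement
    time.sleep(0.01)                            # give the module time to respond
    hi, lo = bus.read_i2c_block_data(COMPASS_ADDR, GET_HEADING, 2)
    return ((hi << 8) | lo) / 10.0              # heading in degrees

while True:
    print("heading: %.1f deg" % read_heading())
    time.sleep(0.5)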

The reason for the WiFi on the Slim is to download software and to allow monitoring from a desktop or laptop, especially in the field.

RoboRealm (RR) is a software package whose main purpose is vision processing. It also has a lot of robot control capability, including a plug-in for the Create. I decided not to use that plug-in after some trouble figuring out exactly how it worked; that may have been a mistake. My other concern was latency: sensor information would be collected by RR and then collected again from RR by the control program. RR will be used to handle the camera and vision processing.
