
Accelerating SRR Development While Gyrating Wildly

Decoding the title: I am experimenting with the Phidgets 1042 spatial sensor, also known as an Inertial Measurement Unit (IMU). This IMU contains an accelerometer, a gyroscope, and a compass. The compass is not allowed in the SRR competition, so it is being ignored.

I worked with this IMU for the 2013 SRR but could not get the results I needed, so I put it aside. Since the first of the year, as I have gotten more serious about the 2014 SRR, I have been working with it again.

As I did last year, I began with code I found that fuses the accelerometer and gyroscope data into a single reading of the robot's global pose. The results never came out correct. The main problem was that the bearing, which is based primarily on the gyroscope data, was inaccurate. I set up a servo to rotate the IMU through 90 degrees (or a fairly close approximation), but the reported rotation was usually less, somewhere in the mid-80 degree range.

After fussing with the code I decided to try a really basic test. First, some background information.

A gyroscope of this nature reports the angular rate over a period of time. Specifically, this IMU reports degrees/second along with the amount of time between readings. Multiplying the rate by the elapsed time gives the actual rotational movement for that period. Integrating (summing) those results provides the current angular position of the IMU. Thus:
\[ \theta_t = \sum \omega_t \, dt \]
where \( \theta_t \) is the angular position at time \( t \), \( \omega_t \) is the angular rate reported by the gyro, and \( dt \) is the time between readings.
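The accumulation can be sketched as a simple loop. This is only an illustration; the struct and function names here are my own, not the Phidgets API:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Accumulate gyro samples into a heading. Each sample pairs an angular rate
// (deg/s) with the elapsed time since the previous reading (s).
struct GyroSample {
    double rateDegPerSec;
    double dtSec;
};

double integrateHeading(const std::vector<GyroSample>& samples) {
    double thetaDeg = 0.0;
    for (const auto& s : samples) {
        thetaDeg += s.rateDegPerSec * s.dtSec;  // omega * dt, summed
    }
    return thetaDeg;
}
```

One hundred samples at 90 deg/s spaced 10 ms apart should accumulate to 90 degrees.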
The test is simple: set up the servo to rotate 90 degrees with the IMU flat. The z-axis (up and down) of the gyro should report a rotation of 90 degrees following the equation above.

The first challenge for the test is determining the servo settings for a 90 degree rotation. The first pass was done by eyeballing the rotation of the IMU against the right angle of a drafting triangle. That was close enough to try.

The resulting rotation reading was in the mid-80s. I was fairly sure the actual rotation was close to 90 but decided to make sure.

Besides eyeballing it, I was also using a sample program from Phidgets that reports all the information from the IMU. The program displays some nice graphics showing the IMU data on circles, plus the values in text form. Its results for the gyroscope were better than mine, though still not quite correct. Why not use its code? It is in C# and does some sensor fusion, not a simple accumulation of the gyro data. I could translate the C# to C++, but I would lose the simple accumulator I want. Still, it was reporting accumulated numbers that were better than mine.

I wanted to eliminate the possibility that the actual rotation was not 90 degrees. For this I turned to the other sensor in the IMU, the accelerometer.

The accelerometer's gravity reading points toward the center of mass of the Earth. The convention is that the z-axis points up and down, while the x- and y-axes lie in the horizontal plane. The orientation I use for the IMU follows the right-hand rule: the thumb points up as the z-axis, the index finger points straight out as the y-axis, and the middle finger points to the left as the x-axis.

With this orientation the z-axis reads 1 g, indicating the IMU is being pulled down with the force of one gravity. The other axes read 0. A real-world sensor always reports some noise, but it is easy to position the IMU so the horizontal readings are off by less than 0.010 g.
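That positioning tolerance can be expressed as a small predicate. This is only a sketch of the check described above; the function name and the z-axis tolerance are my own choices:

```cpp
#include <cassert>
#include <cmath>

// The IMU is treated as level when both horizontal axes are within the
// observed noise band of 0.010 g and the z-axis reads close to 1 g.
bool isLevel(double ax, double ay, double az,
             double horizTol = 0.010, double vertTol = 0.05) {
    return std::fabs(ax) < horizTol &&
           std::fabs(ay) < horizTol &&
           std::fabs(az - 1.0) < vertTol;
}
```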

Lying flat, the IMU's accelerometer provides no information about rotation around the z-axis. (This is a fundamental problem with using only inertial information to determine and track bearing.) If you turn the unit up on its edge, all three axes contribute information. Now a rotation around the z-axis is reflected in the values from the other axes.

If the z-axis is horizontal it reports 0, and the other axes report their angle from the vertical, downward pull of gravity. A rotation of 90 degrees can be determined using the measurements from one, or both, of those axes.

My first thought was to start with the x-axis reading 1 g and rotate until it read 0. (The y-axis would go from 0 to 1 g.) That should be a precise measurement of a 90 degree rotation.

The problem I had was obtaining a stable and repeatable 1 g: a large range of servo values produced a reading around 1 g. I then recalled that an accelerometer's sensitivity is lowest when the axis is aligned with the gravity vector. The readings are most sensitive when the axis is perpendicular to gravity, i.e. around 0 g. One data sheet I checked showed a difference of two orders of magnitude, with the overall sensitivity measured in milli-g. That difference was apparently enough to make positioning with the servo difficult.
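The reason falls out of the math: an axis reads cos(θ) g, where θ is its angle from vertical, so its sensitivity to tilt is the derivative, sin(θ). Readings change slowest near 1 g (θ near 0) and fastest near 0 g (θ near 90 degrees). A small sketch, expressed in milli-g per degree of tilt (the function name is my own):

```cpp
#include <cassert>
#include <cmath>

// Change in an axis reading per degree of tilt, in milli-g. The reading is
// cos(theta), so the per-degree change is sin(theta) * (pi/180) * 1000.
double sensitivityMgPerDeg(double thetaDeg) {
    const double kPi = 3.14159265358979323846;
    double thetaRad = thetaDeg * kPi / 180.0;
    return std::sin(thetaRad) * (kPi / 180.0) * 1000.0;
}
```

Near perpendicular (θ = 90°) a degree of tilt moves the reading about 17 mg; near alignment (θ = 1°) it moves it about 0.3 mg, which is why the servo could not hold a repeatable 1 g.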

I switched to having the x-axis read 0.707 g, which is 45 degrees from vertical (cos 45° ≈ 0.707). I then rotated until it read -0.707 g, again 45 degrees from vertical on the other side. This is a 90 degree rotation, and the servo setting is much more precise.
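The rotation between the two endpoint readings can be recovered directly. With the z-axis horizontal, (x, y) is gravity's direction in the sensor plane, so atan2 of the two endpoints gives the angle swept. A sketch with my own function name:

```cpp
#include <cassert>
#include <cmath>

// Angle swept between two on-edge accelerometer readings, in degrees,
// normalized to (-180, 180].
double rotationDeg(double x1, double y1, double x2, double y2) {
    const double kPi = 3.14159265358979323846;
    double d = (std::atan2(y2, x2) - std::atan2(y1, x1)) * 180.0 / kPi;
    while (d > 180.0) d -= 360.0;
    while (d <= -180.0) d += 360.0;
    return d;
}
```

Going from (0.707, 0.707) to (-0.707, 0.707), i.e. x sweeping from +0.707 g to -0.707 g, comes out as exactly 90 degrees.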

With this all arranged I ran the test. The gyro z-axis was still reading less than 90 degrees when rotated.

But now I knew, as positively as possible, that the rotation was 90 degrees. I had also determined that a 1.02 millisecond change in the servo timing caused a 90 degree rotation. Knowing how to rotate the servo 90 degrees, I could test with the IMU horizontal.
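That calibration gives a pulse-width-to-angle mapping: 1.02 ms per 90 degrees is about 88.2 degrees per millisecond. A sketch; the neutral (center) pulse width here is my own placeholder, not a measured value:

```cpp
#include <cassert>
#include <cmath>

// Servo angle from pulse width, using the measured 90 deg per 1.02 ms.
// kNeutralMs is an assumed center pulse width for illustration only.
double servoAngleDeg(double pulseMs) {
    const double kNeutralMs = 1.5;
    const double kDegPerMs = 90.0 / 1.02;
    return (pulseMs - kNeutralMs) * kDegPerMs;
}
```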

I returned to my code and worked up from data acquisition to the calculations. The Phidgets library provides a callback mechanism that invokes a user routine when data is available. In that routine I set a semaphore event. The processing code waits on the event and then reads the data through a class member function I wrote last year. In my test program I used that member to read the data for processing. I forgot that it already waited for the event, and put the same wait on the semaphore in my test code. When I removed the second wait, the gyro reading became 90 degrees.

Why, though, did the double wait only cause problems intermittently? Most of the time, apparently, the wait in my test code had no effect. If it had, the result would have been considerably less than the mid-80s, since every other reading would have been lost.

This is not one of my better performances. With some confidence that the gyro readings are correct, and being obtained properly, I can now work on the sensor fusion. 

