
Accelerating SRR Development While Gyrating Wildly

Decoding the title: I am experimenting with the Phidgets 1042 spatial sensor, also known as an Inertial Measurement Unit (IMU). This IMU contains an accelerometer, a gyroscope, and a compass. The compass is not allowed in SRR competition, so it is being ignored.

I worked with this IMU for the 2013 SRR but could not get the results I needed, so I put it aside. Since the first of the year, as I have gotten more serious about the 2014 SRR, I have been working with it again.

As I did last year, I began with code I found that fuses the accelerometer and gyroscope data into a single reading of the global pose of the robot. The results never came out correct. The main problem was that the bearing reading, based primarily on the gyroscope data, was inaccurate. I set up a servo to rotate the IMU through 90 degrees (or a fairly close approximation), but the reported results were usually less, somewhere in the mid-80-degree range.

After fussing with the code I decided to try a really basic test. First, some background information.

A gyroscope of this nature reports the angular change during a period of time. Specifically, this IMU reports degrees / second and the amount of time between readings. Multiplying the reading by the amount of time tells you the actual rotational movement for that period. Integrating those results provides the current angular position of the IMU. Thus:
\[ \theta_{t} = \sum \omega_{t} \, dt \]
where \( \theta_{t} \) is the angular movement at time \( t \), \( \omega_{t} \) is the reported angular rate, and \( dt \) is the time between readings.
The test is simple. Set up the servo to rotate 90 degrees with the IMU flat. The z-axis (up and down) of the gyro should report a rotation of 90 degrees, following the equation above.

The first challenge for the test is determining the servo settings for a 90 degree rotation. The first pass was done just eye-balling the rotation of the IMU using the right angle of a drafting triangle. That was close enough to try. 

The resulting rotation was in the mid-80s. I was pretty sure the rotation was close to 90 but decided to make sure. 

Besides my eyeball, I was also using a sample program from Phidgets that reports all the information from the IMU. The program displays some nice graphics of the IMU data, on circles and in text form. Its results for the gyroscope were better than mine, but still not totally correct. Why not use its code? It is in C#, and it does some sensor fusion, not a simple reporting of the gyro data. I could translate from C# to C++, but I would not have the simple accumulator I want. Still, it was reporting raw accumulated numbers that were better than mine.

I wanted to eliminate the possibility that the actual rotation was not 90 degrees. For this I turned to the other sensor in the IMU, the accelerometer.

The accelerometer data points toward the Earth's center of mass. The convention is that the z-axis points up and down, while the x- and y-axes lie in the horizontal plane. The orientation I use for the IMU follows the right-hand rule: the thumb points up as the z-axis, the pointer finger points straight out for the y-axis, and the middle finger points to the left for the x-axis.

With this orientation the z-axis reads 1g, indicating the IMU is being pulled down with the force of one gravity. The other axes read 0. A real-world sensor always reports some noise, but it is easy to position the IMU so the horizontal readings are off by less than 0.010g.

Lying flat, the IMU does not provide any accelerometer information about rotation around the z-axis. (This is a fundamental problem with using only inertial information to determine and track bearing.) If you turn the unit up on its edge, all three accelerometer axes contribute information. Now a rotation around the z-axis is reflected in the values from the other axes.

If the z-axis is horizontal it reports 0 and the other axes report their angle from the vertical, downward pull of gravity. A rotation of 90 degrees can be determined using the measurements from one, or both, of these axes. 

My first thought was to start with the x-axis reading 1g and rotate it until it read 0. (The y-axis would be going from 0 to 1.) That should be a precise measurement of a 90 degree rotation.

The problem I had was obtaining a stable and repeatable 1g reading. There was a large range of servo values that produced a reading around 1g. I then recalled that accelerometer sensitivity is lowest when the axis is aligned with the gravity vector; the most sensitive readings come when the axis is perpendicular to gravity, i.e., around 0. One data sheet I checked showed a difference of two orders of magnitude, with the overall sensitivity measured in milli-gs. That apparently was enough of a difference to make it difficult to position with the servo.

I switched to having the x-axis read 0.707g, which corresponds to 45 degrees from vertical. I then rotated it until it read -0.707g, again 45 degrees from vertical on the other side. This is a 90-degree rotation, and the servo setting is more precise.

With this all arranged I ran the test. The gyro z-axis was still reading less than 90 degrees when rotated.

But now I knew, as positively as possible, that the rotation was 90 degrees. I had also determined that a 1.02 millisecond change in the servo timing caused a 90-degree rotation. Knowing how to rotate the servo 90 degrees, I could test with the IMU horizontal.

I returned to my code and worked up from data acquisition to the calculations. The Phidgets code provides a callback mechanism that invokes a user routine when data is available. In that routine I set a semaphore event. The processing code, a class member I wrote last year, waits on the event before reading the data. In my test program I used that member to read the data for processing. I forgot that it already waited for the event and put the same wait on the semaphore in my test code. When I removed the second wait, the gyro reading became 90 degrees.

Why, though, did the double wait only cause problems intermittently? Most of the time, apparently, the wait in my test code had no effect. If it had always blocked, the result would have been considerably less, since every other reading would have been lost.

This is not one of my better performances. With some confidence that the gyro readings are correct, and being obtained properly, I can now work on the sensor fusion. 
