
Sensor - Accelerometer & Magnetics

Just as I was finishing my first look at the accelerometer and magnetic field sensors, a couple of threads cropped up on the Android Developers group.

I had the basic code working, so I dug a little deeper into the rotation routines and the timing. I posted responses on the threads but want to dig into the details more here.

First some observations applicable to my G1:

  • The sensors report approximately every 20, 40, and 220 msec for the FAST, GAME, and NORMAL rates.

  • A sample may be missed for a specific sensor, though usually at least one of the sensors reports; occasionally all of them are missed.

  • The magnetic field sensor is the most reliable, with only a few dropped samples. The other sensors drop samples considerably more often.

One caveat in all this: the way I set up the sensor handling may make a difference. I have a single onSensorChanged routine that handles all three sensors. It is possible that having three separate routines would produce different results.
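For reference, the single-listener setup described above looks roughly like this. This is a sketch, not the app's actual registration code; mSensorManager and listener are assumed names, and I'm using the GAME rate to match the ~40 msec timing mentioned earlier.

```java
// Register ONE listener for all three sensors (the setup used in these tests).
// mSensorManager is obtained in the activity; OrientationListener is the
// SensorEventListener shown in the listing below (name assumed here).
SensorManager mSensorManager =
        (SensorManager) getSystemService(Context.SENSOR_SERVICE);
SensorEventListener listener = new OrientationListener();

mSensorManager.registerListener(listener,
        mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(listener,
        mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_GAME);
mSensorManager.registerListener(listener,
        mSensorManager.getDefaultSensor(Sensor.TYPE_ORIENTATION),
        SensorManager.SENSOR_DELAY_GAME);
```

With this arrangement every event from every sensor funnels through one onSensorChanged, which is why the dispatch on sensor type in the listing below is needed.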

One of the messages in the threads mentioned writing data to a file. I was concerned that writing to a file might cause delays in responding to the sensors, so I collected my data by writing to LogCat. I then cut and pasted the output into an editor, formatted the columns as CSV, and loaded the results into a spreadsheet for analysis.
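The logging itself is just one line per sample. A minimal sketch of the kind of CSV formatting I mean (the csvLine helper name is mine, not from the app):

```java
// Build one CSV row per sensor event: timestamp followed by the values.
// In onSensorChanged this string would be passed to Log.d(TAG, ...) and
// later pasted from LogCat into a spreadsheet.
static String csvLine(long timestampMs, float[] values) {
    StringBuilder sb = new StringBuilder();
    sb.append(timestampMs);
    for (float v : values) {
        sb.append(',').append(v);
    }
    return sb.toString();
}
```

Each sensor type gets its own rows, so sorting by the timestamp column in the spreadsheet reconstructs the interleaved event stream.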

Here is the code for capturing sensor information and performing the rotations.

// ================================================================================================================
private class OrientationListener implements SensorEventListener {
    final int matrix_size = 16;
    float[] R = new float[matrix_size];
    float[] outR = new float[matrix_size];
    float[] I = new float[matrix_size];
    float[] values = new float[3];
    boolean isReady = false;

    DigitalFilter[] filter =
            { new DigitalFilter(), new DigitalFilter(), new DigitalFilter(), new DigitalFilter(),
              new DigitalFilter(), new DigitalFilter() };
    private long lastMagsTime;
    private long lastAccelsTime;
    private long lastOrientsTime;

    // ------------------------------------------------------------------------------------------------------------
    public void onSensorChanged(SensorEvent s_ev) {
        Sensor sensor = s_ev.sensor;

        int type = sensor.getType();

        // mags, accels, and orients are float[] fields of the enclosing class.
        switch (type) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = s_ev.values.clone(); // clone: the framework reuses the event's array
            isReady = true;
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = s_ev.values.clone();
            break;
        case Sensor.TYPE_ORIENTATION:
            orients = s_ev.values.clone();
            Exp.mText04.setText("" + (int) orients[0]);
            Exp.mText05.setText("" + (int) orients[1]);
            Exp.mText06.setText("" + (int) orients[2]);
            break;
        }

        if (mags != null && accels != null && isReady) {
            isReady = false;

            SensorManager.getRotationMatrix(R, I, accels, mags);

            // Remap to the landscape coordinate system: bearing through the
            // screen and out the back of the phone.
            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, values);
            int[] v = new int[3];

            v[0] = filter[0].average(values[0] * 100);
            v[1] = filter[1].average(values[1] * 100);
            v[2] = filter[2].average(values[2] * 100);

            Exp.mText01.setText("" + v[0]);
            Exp.mText02.setText("" + v[1]);
            Exp.mText03.setText("" + v[2]);
        }
    }

    // ----------------------------------------------------------------------------------------------------------------
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}
I had a couple of requests for the DigitalFilter class. It is below, although it is called DigitalAverage. I took it from a later version of the code where I changed the name to better indicate its actual operation. I originally called it 'Filter' because I thought I might get more complex than a simple average.

No, I'm not going to explain how to integrate the two pieces of code. That is left as an exercise for the reader.

// ================================================================================================================
private class DigitalAverage {

    final int history_len = 4;
    double[] mLocHistory = new double[history_len];
    int mLocPos = 0;

    // ------------------------------------------------------------------------------------------------------------
    int average(double d) {
        float avg = 0;

        mLocHistory[mLocPos] = d;
        mLocPos++; // advance the circular buffer position

        if (mLocPos > mLocHistory.length - 1) {
            mLocPos = 0;
        }

        for (double h : mLocHistory) {
            avg += h;
        }
        avg /= mLocHistory.length;

        return (int) avg;
    }
}

  1. Hi, I've been working on this for a while and I have it working, but my readings are really the opposite of smooth.

    I suppose it's due to the DigitalFilter class used in the example, but I'm not sure.

    Can you please share that class in order to test the full example?

    Thanks in advance


  2. Could you please explain what the DigitalFilter class is.

    I have omitted this last section where you use filter[0].average etc

    and all I get back is zero values.


  3. The digital filter just takes an average of the last set of readings. I am using 4 readings presently, but that is just a wild guess at how many to use. The tradeoff is that too few readings result in poor smoothing, while too many make the response to a move too sluggish.

  4. Hi! Thanks for the posts, they are very helpful! Could you please share the implementation details of that digital filter class?

  5. I added the digital filter class to the entry.

  6. Using a similar filter, I get bogus average values when the sensor orientation jumps from 180 to -180, which is actually the same angle.

    A proper way to average angles is:

    avg_in_radians = Math.atan2(sumY, sumX);

    where sumY is the sum of the sines, and sumX the sum of cosines.

  7. @Olivier - Thanks for the tip. I will try that. In landscape mode the compass direction _is_ twitchy when pointing south. I've done some looking into why but didn't consider the averaging as a source of the problem.

  8. No problem. By the way, do you understand why rotating slowly from portrait to landscape affects all three angles returned by a Sensor.TYPE_ORIENTATION event, and not only the roll angle (orients[2])?

  9. Hello, I just tested your code and have some questions. What is the difference between the data obtained by the orientation sensor and the combination of the magnetic sensor and accelerometer? What I wanted was to get the direction the phone is pointing (btw I am using a Motorola Dext) using the azimuth value from the orientation readings, but the magnetic sensor values vary widely even though the phone is not moving.

    Excuse my English, the translation is not very good, hehe.

  10. @Oropher

    First, the combined sensors are supposed to be more accurate according to the documentation. Second, the combination sensors allow you to change the coordinate system with respect to the phone. I wanted the landscape coordinate system for my game with the compass bearing through the screen and out the back of the phone.

    I have a test program that puts both the orientation and the combined-sensor-derived orientation on the display at the same time. The values are comparable. So unless you want to change the coordinate system, the orientation sensor seems sufficient.

    I have a G1 and the magnetic sensor is not completely stable but it is usable.

  11. Another question: are the values returned by the combined sensors on the same scale as the orientation sensor? I put both values on the screen but they are not the same; there is too much variation. I think that this is probably because the sensor is not very well calibrated, but I'm not sure. I have to do more tests =P

  12. btw, thanks a lot for the quick answer =)

  13. This code is very helpful, but I must admit I'm lost in the basics.
    I'm trying to hook this up to an OpenGL camera for Augmented Reality (using - which is a really nice 3D engine).

    My questions are basically;

    The averaged v[x] values, what are they? Degrees? What range?
    Which one is roll, pitch, and yaw?

    I've tried to work this out by blitting them to the screen, but I'm getting much confusion because no matter how carefully I rotate the phone in one axis *all* the values change by quite a lot :(

  14. Rud, thanks for explaining orientation sensing!

    I have done some further work based on your code. If you or others find it interesting, you can find the report here:

  15. @Markus - thanks for the citation in your discussion. I'll read through it carefully soon.

    @darkflame - Sorry for the delay in releasing your post.

    One general point is that I pulled the averaging from the routines. First, it was confusing things, and second, it really isn't good OO to have it in the measurement class. Any averaging / filtering should be done with the values the class generates.

    What are the values - I don't remember and don't have time at the moment to look at the code. Sorry. My best memory is that they are radians.

  16. Rather old thread, but allow me a question, please. Doesn't the obtained bearing need to be corrected by the magnetic derivation value obtained by GeoField.getDeclination() before being used as "true" heading? Otherwise the heading is always relative to the magnetic north, isn't it?

  17. @neil,

    I think you would need to add in the magnetic deviation.

    I would suggest that this is a correction that should be applied outside of this class for good OO.

  18. Thanks, Rud, that's what I supposed. Thanks also for correcting the typo ;) Deviation of course. All in all this matches with the behavior of the iOS. There you also have to enable location services in order to obtain "true" heading instead of magnetic heading.
    Kind regards
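Olivier's point in comment 6 about averaging angles is worth spelling out in code. A sketch of the sine/cosine approach in Java (the method name is mine):

```java
// Average a set of angles by summing sines and cosines, as suggested in
// comment 6. Values near the 180/-180 wrap (e.g. 179 and -179) then
// average to 180 rather than the bogus arithmetic mean of 0.
static double averageDegrees(double[] degrees) {
    double sumY = 0; // sum of sines
    double sumX = 0; // sum of cosines
    for (double d : degrees) {
        sumY += Math.sin(Math.toRadians(d));
        sumX += Math.cos(Math.toRadians(d));
    }
    return Math.toDegrees(Math.atan2(sumY, sumX));
}
```

Dropping this into DigitalAverage in place of the plain mean should cure the twitchiness when the compass points south, at the cost of a few trig calls per sample.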

