http://groups.google.com/group/android-developers/browse_frm/thread/1b42c48ce47cb1c9/720c6f4f8a40fc67#720c6f4f8a40fc67
http://groups.google.com/group/android-developers/browse_frm/thread/2e14272d72b7ab4f#
I had the basic code working, so I dug a little deeper into the rotation routines and the timing. I posted responses on the threads, but here I want to dig into the details more.
First some observations applicable to my G1:
- The sensors report approximately every 20, 40, and 220 msec for FAST, GAME, and NORMAL.
- A sample may be missed for a specific sensor, but usually one of the others will still be generated - although sometimes all of them are missed.
- The magnetic field sensor is the most reliable, with only a few drops. The other sensors drop samples considerably more often.
A caveat in all this is that the way I set up the sensor handling may make a difference. I have a single onSensorChanged routine which handles all three sensors. It is possible that having three separate routines would produce different results.
One of the messages in the threads mentioned writing data to a file. I was concerned that writing to a file might cause delays in responding to the sensors, so I collected my data by writing to LogCat. I then cut and pasted into an editor, formatted the columns as CSV, and loaded the results into a spreadsheet for analysis.
Here is the code for capturing sensor information and performing the rotations.
// ================================================================================================================
private class OrientationListener implements SensorEventListener {
    final int matrix_size = 16;
    float[] R = new float[matrix_size];
    float[] outR = new float[matrix_size];
    float[] I = new float[matrix_size];
    float[] values = new float[3];
    float[] mags;
    float[] accels;
    float[] orients;
    boolean isReady = false;
    DigitalFilter[] filter =
            { new DigitalFilter(), new DigitalFilter(), new DigitalFilter(),
              new DigitalFilter(), new DigitalFilter(), new DigitalFilter() };
    private long lastMagsTime;
    private long lastAccelsTime;
    private long lastOrientsTime;

    // ------------------------------------------------------------------------------------------------------------
    public void onSensorChanged(SensorEvent s_ev) {
        Sensor sensor = s_ev.sensor;
        int type = sensor.getType();
        switch (type) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            // Clone the values: the event's array may be reused for later events.
            mags = s_ev.values.clone();
            isReady = true;
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = s_ev.values.clone();
            break;
        case Sensor.TYPE_ORIENTATION:
            orients = s_ev.values.clone();
            Exp.mText04.setText("" + (int) orients[0]);
            Exp.mText05.setText("" + (int) orients[1]);
            Exp.mText06.setText("" + (int) orients[2]);
            break;
        }
        if (mags != null && accels != null && isReady) {
            isReady = false;
            SensorManager.getRotationMatrix(R, I, accels, mags);
            // Remap to the landscape coordinate system with the compass
            // bearing through the screen and out the back of the phone.
            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, values);
            int[] v = new int[3];
            // getOrientation() returns radians; scale by 100 to keep some
            // resolution through the integer averaging filter.
            v[0] = filter[0].average(values[0] * 100);
            v[1] = filter[1].average(values[1] * 100);
            v[2] = filter[2].average(values[2] * 100);
            Exp.mText01.setText("" + v[0]);
            Exp.mText02.setText("" + v[1]);
            Exp.mText03.setText("" + v[2]);
        }
    }

    // ----------------------------------------------------------------------------------------------------------------
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}
Update
I had a couple of requests for the DigitalFilter class. It is below, although there it is called DigitalAverage. I took it from a later version of the code where I changed the name to better indicate its actual operation. I originally called it 'Filter' because I thought it might get more complex than a simple average.
No, I'm not going to explain how to integrate the two pieces of code. That is left as an exercise for the reader.
// ================================================================================================================
private class DigitalAverage {
    final int history_len = 4;
    double[] mLocHistory = new double[history_len];
    int mLocPos = 0;

    // ------------------------------------------------------------------------------------------------------------
    int average(double d) {
        double avg = 0;
        // Store the new reading in a circular buffer of the last history_len values.
        mLocHistory[mLocPos] = d;
        mLocPos++;
        if (mLocPos > mLocHistory.length - 1) {
            mLocPos = 0;
        }
        // Return the mean of the buffer, truncated to an int.
        for (double h : mLocHistory) {
            avg += h;
        }
        avg /= mLocHistory.length;
        return (int) avg;
    }
}
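To illustrate the tradeoff with the 4-entry history: once the buffer fills with equal readings the average equals the reading, and a sudden jump only shows up gradually. The standalone harness below is mine, just for illustration; the averaging logic is copied from the class above.

```java
public class AverageDemo {
    // Same logic as the DigitalAverage class in the post.
    static class DigitalAverage {
        final int history_len = 4;
        double[] mLocHistory = new double[history_len];
        int mLocPos = 0;

        int average(double d) {
            double avg = 0;
            mLocHistory[mLocPos] = d;
            mLocPos++;
            if (mLocPos > mLocHistory.length - 1) {
                mLocPos = 0;
            }
            for (double h : mLocHistory) {
                avg += h;
            }
            avg /= mLocHistory.length;
            return (int) avg;
        }
    }

    // Feed a series of readings and return the last smoothed value.
    static int smoothLast(double[] readings) {
        DigitalAverage f = new DigitalAverage();
        int last = 0;
        for (double r : readings) {
            last = f.average(r);
        }
        return last;
    }

    public static void main(String[] args) {
        // A steady reading passes through unchanged...
        System.out.println(smoothLast(new double[] {120, 120, 120, 120}));      // 120
        // ...but a jump to 200 is initially diluted: (200 + 120*3) / 4 = 140.
        System.out.println(smoothLast(new double[] {120, 120, 120, 120, 200})); // 140
    }
}
```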
Hi, I've been working on this for a while and I have it working, but my readings are really the opposite of SMOOTH.
I suppose it's due to the DigitalFilter class used in the example, but I'm not sure.
Can you please share that class in order to test the full example?
Thanks in advance
David
Could you please explain what the DigitalFilter class is?
I have omitted the last section where you use filter[0].average etc.,
and all I get back is zero values.
Thanks!
The digital filter just takes an average of the last set of readings. I am presently using 4 readings, but that is just a wild guess at how many to use. The tradeoff is that too few readings result in poor smoothing, while too many make the response to a move sluggish.
Hi! Thanks for the posts, they are very helpful! Could you please share the implementation details of that digital filter class?
I added the digital filter class to the entry.
Using a similar filter, I get bogus average values when the sensor orientation jumps from 180 to -180, which is actually the same angle.
A proper way to average angles is:
avg_in_radians = Math.atan2(sumY, sumX);
where sumY is the sum of the sines, and sumX the sum of cosines.
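Olivier's suggestion, sketched out in plain Java (degrees in and out; the method name is mine). Summing sines and cosines means 179 and -179 average to 180-ish rather than to 0:

```java
public class AngleAverage {
    // Circular mean of a set of angles in degrees: sum the unit vectors
    // (cos, sin) for each angle, then take atan2 of the summed components.
    static double averageDegrees(double[] angles) {
        double sumX = 0, sumY = 0;
        for (double a : angles) {
            double r = Math.toRadians(a);
            sumX += Math.cos(r);
            sumY += Math.sin(r);
        }
        return Math.toDegrees(Math.atan2(sumY, sumX));
    }

    public static void main(String[] args) {
        // A naive arithmetic mean of 179 and -179 gives 0, which is the
        // opposite direction; the circular mean gives ~180 (or ~-180,
        // the same heading).
        System.out.println(averageDegrees(new double[] {179, -179}));
        // For angles that don't straddle the wrap, it matches the
        // ordinary midpoint, e.g. 10 and 20 average to ~15.
        System.out.println(averageDegrees(new double[] {10, 20}));
    }
}
```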
@Olivier - Thanks for the tip. I will try that. In landscape mode the compass direction _is_ twitchy when pointing south. I've done some looking into why but didn't consider the averaging as a source of the problem.
No problem. By the way, do you understand why rotating slowly from portrait to landscape affects all three angles returned by a Sensor.TYPE_ORIENTATION event, and not only the roll angle (orients[2])?
Hello, I just tested your code and have some questions. What is the difference between the data obtained by the orientation sensor and the combination of the magnetic sensor and accelerometer? What I wanted was to get the direction the phone is pointing (btw, I am using a Motorola Dext). With the orientation sensor I get azimuth values in degrees, but the magnetic sensor values vary widely even when the phone is not moving.
Sorry, my English translation is not very good, hehe.
@Oropher
First, the combined sensors are supposed to be more accurate according to the documentation. Second, the combined sensors allow you to change the coordinate system with respect to the phone. I wanted the landscape coordinate system for my game, with the compass bearing through the screen and out the back of the phone.
I have a test program that puts both the orientation sensor and the combined-sensor-derived orientation on the display at the same time. The values are comparable. So unless you want to change the coordinate system, the orientation sensor seems sufficient.
I have a G1 and the magnetic sensor is not completely stable but it is usable.
Another question: are the values returned by the combined sensors on the same scale as the orientation sensor? I put both values on the screen but they are not the same; there is too much variation. I think this is probably because the sensor is not very well calibrated, but I'm not sure. I have to do more tests =P
btw, thanks a lot for the quick answer =)
This code is very helpful, but I must admit I'm lost in the basics.
I'm trying to hook this up to an OpenGL camera for Augmented Reality (using http://www.jpct.net - which is a really nice 3D engine).
My questions are basically:
The averaged v[x] values, what are they? Degrees? What range?
Which one is roll, pitch, and yaw?
I've tried to work this out by blitting them to the screen, but I'm getting much confusion because no matter how carefully I rotate the phone in one axis *all* the values change by quite a lot :(
Rud, thanks for explaining orientation sensing!
I have done some further work based on your code. If you or others find it interesting, you can find the report here: http://photosaround.mandreasson.com/2010/05/technical-orientation-sensing-in-photos.html
@Markus - thanks for the citation in your discussion. I'll read through it carefully soon.
@darkflame - Sorry for the delay in releasing your post.
One general point is that I pulled the averaging out of these routines. First, it was confusing things, and second, it really isn't good OO to have it in the measurement class. Any averaging / filtering should be done with the values the class generates.
What are the values? I don't remember, and don't have time at the moment to look at the code. Sorry. My best memory is that they are radians.
Rather old thread, but allow me a question, please. Doesn't the obtained bearing need to be corrected by the magnetic declination value obtained from GeomagneticField.getDeclination() before being used as the "true" heading? Otherwise the heading is always relative to magnetic north, isn't it?
Regards
@neil,
I think you would need to add in the magnetic declination.
I would suggest that this is a correction that should be applied outside of this class for good OO.
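The correction itself is just an addition plus wrap-around normalization. On a device, the declination would come from android.hardware.GeomagneticField built from a latitude/longitude/altitude fix; the standalone sketch below (names are mine) hard-codes an example value instead:

```java
public class TrueHeading {
    // Convert a magnetic heading to a true heading given the local
    // declination (both in degrees), normalizing the result to [0, 360).
    static double toTrueHeading(double magneticHeading, double declinationDeg) {
        double trueHeading = (magneticHeading + declinationDeg) % 360.0;
        if (trueHeading < 0) {
            trueHeading += 360.0;
        }
        return trueHeading;
    }

    public static void main(String[] args) {
        // On a real device, roughly:
        //   declination = new GeomagneticField(lat, lon, alt, time).getDeclination();
        double declination = 2.5; // example value, degrees east of true north
        System.out.println(toTrueHeading(358.0, declination)); // wraps past 360 to 0.5
        System.out.println(toTrueHeading(10.0, -15.0));        // wraps below 0 to 355.0
    }
}
```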
Thanks, Rud, that's what I supposed. Thanks also for correcting the typo ;) Declination, of course. All in all this matches the behavior of iOS, where you also have to enable location services in order to obtain the "true" heading instead of the magnetic heading.
Kind regards