20 July 2009

Sensor - Accelerometer & Magnetics II

A quick update on the code I presented in the previous post.

I'm not greatly experienced in Java, having mainly worked with C++ for the last 15-plus years. I did have a company-provided seminar on Java and have a half-dozen Java books on the shelf. But I'm still on the learning curve.

In the code I obtained the magnetic, accelerometer, and orientation sensor arrays by simply assigning their values to another array. I then used that array in subsequent calls to onSensorChanged(). That is not the proper way to do this. I should have cloned the arrays, which makes a copy. For example:


case Sensor.TYPE_MAGNETIC_FIELD:
    mags = s_ev.values;
    isReady = true;
    break;


should be:

case Sensor.TYPE_MAGNETIC_FIELD:
    mags = s_ev.values.clone();
    isReady = true;
    break;


I found this when I output all of the sensor values to LogCat each time any one of the sensors was updated. I noticed that the accelerometer values changed from their previous values when only the orientation sensor had updated.
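
For anyone who wants to reproduce the check, below is a minimal sketch of the kind of LogCat dump I mean; dump() is a hypothetical helper called at the end of onSensorChanged(), and mags, accels, and orients are the arrays saved by the listener.

import java.util.Arrays;
import android.util.Log;
// ...
// Hypothetical helper: log all three arrays on every sensor event. If the
// arrays alias SensorEvent.values, the accelerometer line will change even
// on events where only the orientation sensor fired.
private void dump(String firedBy) {
    Log.d("SensorCheck", firedBy
            + " mags=" + Arrays.toString(mags)
            + " accels=" + Arrays.toString(accels)
            + " orients=" + Arrays.toString(orients));
}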

13 July 2009

Sensor - Accelerometer & Magnetics

Just as I was finishing my first look at the accelerometer and magnetic field sensors, a couple of threads cropped up in the Android Developers group:

http://groups.google.com/group/android-developers/browse_frm/thread/1b42c48ce47cb1c9/720c6f4f8a40fc67#720c6f4f8a40fc67

http://groups.google.com/group/android-developers/browse_frm/thread/2e14272d72b7ab4f#

I had the basic code working, so I dug a little deeper into the rotation routines and the timing. I posted responses in the threads but want to dig into the details more here.

First some observations applicable to my G1:

  • The sensors report approximately every 20, 40, and 220 msec for the FASTEST, GAME, and NORMAL delay settings.

  • A sample may be missed for a specific sensor, though usually at least one of the sensors will report - but sometimes all of them can be missed.

  • The magnetic field sensor is the most reliable, with only a few dropped samples. The other sensors drop samples considerably more often.


A caveat in all this is that the way I set up the sensor handling may make a difference. I have a single onSensorChanged() routine which handles all three sensors. It is possible that three separate listeners would produce different results; a sketch of that alternative follows.
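
For comparison, here is a minimal sketch of what the separate-listener setup could look like, assuming it runs in an Activity's onResume(); handleMags() is a hypothetical stand-in for the per-sensor work.

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
// ...
SensorManager sm = (SensorManager) getSystemService(Context.SENSOR_SERVICE);

// One dedicated listener per sensor instead of a shared onSensorChanged().
SensorEventListener magListener = new SensorEventListener() {
    public void onSensorChanged(SensorEvent s_ev) {
        handleMags(s_ev.values.clone());   // hypothetical per-sensor handler
    }
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
};
sm.registerListener(magListener,
        sm.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_GAME);

// Repeat with similar listeners for Sensor.TYPE_ACCELEROMETER and
// Sensor.TYPE_ORIENTATION, each calling its own handler.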

One of the messages in the threads mentioned writing data to a file. I was concerned that writing to a file might cause delays in responding to the sensors, so I collected my data by writing to LogCat instead. I then cut and pasted it into an editor, formatted the columns as CSV, and loaded the results into a spreadsheet for analysis.
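
As an illustration of that workflow, here is a minimal sketch of a CSV-friendly log line; the tag and the column layout are my own choices for this example.

import android.hardware.SensorEvent;
import android.util.Log;
// ...
// Hypothetical helper: one comma-separated line per sensor event. The
// columns (event time in msec, sensor type, three values) paste cleanly
// from LogCat into a spreadsheet for timing analysis.
private void logCsv(SensorEvent s_ev) {
    Log.d("SensorCSV", (s_ev.timestamp / 1000000L)   // nanoseconds to msec
            + "," + s_ev.sensor.getType()
            + "," + s_ev.values[0]
            + "," + s_ev.values[1]
            + "," + s_ev.values[2]);
}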

Here is the code for capturing sensor information and performing the rotations.




// ================================================================================================================
private class OrientationListener implements SensorEventListener {
    final int matrix_size = 16;
    float[] R = new float[matrix_size];
    float[] outR = new float[matrix_size];
    float[] I = new float[matrix_size];
    float[] values = new float[3];
    boolean isReady = false;

    DigitalFilter[] filter =
            { new DigitalFilter(), new DigitalFilter(), new DigitalFilter(), new DigitalFilter(),
              new DigitalFilter(), new DigitalFilter() };
    private long lastMagsTime;
    private long lastAccelsTime;
    private long lastOrientsTime;

    // ------------------------------------------------------------------------------------------------------------
    public void onSensorChanged(SensorEvent s_ev) {
        Sensor sensor = s_ev.sensor;

        int type = sensor.getType();

        // mags, accels, and orients are float[] fields of the enclosing class.
        switch (type) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mags = s_ev.values;
            isReady = true;
            break;
        case Sensor.TYPE_ACCELEROMETER:
            accels = s_ev.values;
            break;
        case Sensor.TYPE_ORIENTATION:
            orients = s_ev.values;
            Exp.mText04.setText("" + (int) orients[0]);
            Exp.mText05.setText("" + (int) orients[1]);
            Exp.mText06.setText("" + (int) orients[2]);
            break;
        }

        // Compute the rotation only when both inputs exist and a fresh
        // magnetic sample has arrived.
        if (mags != null && accels != null && isReady) {
            isReady = false;

            SensorManager.getRotationMatrix(R, I, accels, mags);

            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, values);
            int[] v = new int[3];

            // Scale the radians by 100 and smooth with a short running average.
            v[0] = filter[0].average(values[0] * 100);
            v[1] = filter[1].average(values[1] * 100);
            v[2] = filter[2].average(values[2] * 100);

            Exp.mText01.setText("" + v[0]);
            Exp.mText02.setText("" + v[1]);
            Exp.mText03.setText("" + v[2]);
        }
    }

    // ----------------------------------------------------------------------------------------------------------------
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}




Update

I had a couple of requests for the DigitalFilter class. It is below, although it is called DigitalAverage. I took it from a later version of the code where I changed the name to better indicate its actual operation. I originally called it 'Filter' because I thought I might get more complex than a simple average.

No, I'm not going to explain how to integrate the two pieces of code. That is left as an exercise for the reader.


// ================================================================================================================
private class DigitalAverage {

    final int history_len = 4;
    double[] mLocHistory = new double[history_len];
    int mLocPos = 0;

    // ------------------------------------------------------------------------------------------------------------
    int average(double d) {
        float avg = 0;

        // Store the new sample in a circular buffer of the last few values.
        mLocHistory[mLocPos] = d;

        mLocPos++;
        if (mLocPos > mLocHistory.length - 1) {
            mLocPos = 0;
        }

        // Return the mean of the buffer, truncated to an int.
        for (double h : mLocHistory) {
            avg += h;
        }
        avg /= mLocHistory.length;

        return (int) avg;
    }
}

08 July 2009

Differentiate Emulator from Device & Unique IDs

I am working on a game which uses several of the sensors. There is a bug in the Cupcake 1.5 emulator (at least through r2) which causes the Sensor Manager to hang. The advice from the Google Developer Groups moderators is to debug on devices, since you can't get sensor information from the emulator. (But look into OI Intents, which provides a sensor simulator for the emulator. I have not yet looked into it.)

As a long-time embedded systems developer, I find this advice absurd. There are all kinds of development tasks that can be accomplished without actual inputs, especially on a device with a GUI like Android.

Additionally, it just isn't convenient to reach over to my G1 to see what it is doing and to manipulate it to check orientation changes, etc. Hitting Ctrl-F11 to change the orientation in the emulator is much easier.

I started wondering how an application could detect whether it was running on the emulator or a device. This would allow skipping over the buggy Sensor Manager on the emulator to avoid the hang. Putting some conditional code around the sensor calls might then allow development on the emulator.

My investigation found two sets of information that differentiate the device and the emulator. This information can also differentiate among devices, so it has some use in all applications.

The first set of information comes from the OS build. There are a number of fields available; all you need is the import and a read of the fields. The comments after the code show the values first from my G1 and then from my emulator.

import android.os.Build;
// ...
Log.d(TAG, "config" + "\n " + Build.BOARD + "\n " + Build.BRAND + "\n" + Build.DEVICE + "\n" + Build.DISPLAY + "\n" + Build.FINGERPRINT + "\n" + Build.HOST + "\n" + Build.ID + "\n" + Build.MODEL + "\n" + Build.PRODUCT + "\n" + Build.TAGS + "\n" + Build.TIME + "\n" + Build.TYPE + "\n" + Build.USER);

// trout
// tmobile
// dream
// CRB43
// tmobile/kila/dream/trout:1.5/CRB43/148830:user/ota-rel-keys,release-keys
// undroid11.mtv.corp.google.com
// CRB43
// T-Mobile G1
// kila
// ota-rel-keys,test-keys
// 1242268990000
// user
// android-build


// unknown
// generic
// generic
// sdk-eng 1.5 CUPCAKE 148875 test-keys
// generic/sdk/generic/:1.5/CUPCAKE/148875:eng/test-keys
// e-honda.mtv.corp.google.com
// CUPCAKE
// sdk
// sdk
// test-keys
// 1242347389000
// eng
// android-build



The second differentiator is the Android ID, which may be unique to each device. The emulator returns null while the device returns a string. Again, my results follow the code, although I obscured part of my G1's ID since, with the proper permissions, the value can be changed.

Log.d(TAG, "and id " + Settings.Secure.getString(this.getContentResolver(), Settings.Secure.ANDROID_ID));

// G1: 200xxxxxd4cca5x
// Emulator: null
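
Putting the two checks together, here is a minimal sketch of the kind of guard I have in mind; the isProbablyEmulator() name and the specific fields tested are my own choices based on the values above, not a documented recipe.

import android.content.Context;
import android.os.Build;
import android.provider.Settings;
// ...
// A sketch of an emulator guard, assuming Cupcake-era values: the SDK build
// reports "sdk"/"generic" identifiers and a null ANDROID_ID. Any of these
// could change in later releases, so treat this as a heuristic.
public static boolean isProbablyEmulator(Context context) {
    String androidId = Settings.Secure.getString(
            context.getContentResolver(), Settings.Secure.ANDROID_ID);
    return androidId == null
            || "sdk".equals(Build.PRODUCT)
            || "generic".equals(Build.DEVICE);
}

// Usage: skip the buggy sensor setup when on the emulator, e.g.
// if (!isProbablyEmulator(this)) { /* register sensor listeners */ }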




03 July 2009

Android Activity Analysis

I have been looking at the details of the life cycle of an Android Activity. There are many web sites that discuss this and many examples. But none of the examples addressed all of the onXXX() methods of Activity. I also found problems with some of the examples. In one case, the GUI update thread could be left running after the Activity had supposedly been stopped.

What initiated this effort was trying to determine when the GUI update thread should be created, stopped, paused, etc. I have a game underway (doesn't everyone?) and figuring out all the Activity infrastructure was giving me fits. Each example did it in a different manner, and while there may be no one correct manner, most of them seemed a little off. Usually they omitted some of the steps needed to pause, stop, or destroy everything properly.

A number of discussions presented the life cycle as a state machine, as an alternative to the Google flow chart. [I don't mean to single out Eric Burke of StuffThatHappens by using his chart to illustrate my point. He has good Android material on his site.] Like the examples, they didn't seem as well thought out as they could be. For example, notice how many places onResume() appears in Eric's state chart. That indicates to me that the representation of the state machine needs more work.

I offer the chart below as a simpler representation, one which eliminates the redundant calls to onResume() and other routines. The diagram is simplified by the introduction of a Pause state reached via onStart() and onPause(). There may be another state after onCreate() and onRestart() are called, but my work so far does not show it to be important.


[Activity life cycle state chart - from Mystic Lake Software]

The file accompanying this page, TemplateSurfaceView.zip, contains a skeleton application for an Android Activity using a SurfaceView as the drawing surface. The application doesn't do anything except report its current state to LogCat.

In the TemplateSurfaceView (TSV), the Activity onXXX() routines are mirrored by corresponding doXXX() methods in the TSV class:

doStart() - create the thread
doPause() - pause the thread
doResume() - resume the thread
doStop() - kill the thread

It becomes clear that the onXXX() routines are nicely symmetrical, with each pair able to set up and tear down the parts needed by the application. The one misleading oddball is onRestart(). At first thought it might somehow pair with doStart(), but this is not the case; doStart() is paired with doStop(). The doRestart() appears to be available to redo some of the work done by onCreate() that may have been undone during onStop().

Once I had the diagram figured out, the requirements became apparent and the TSV code was generally straightforward. Not so obvious was how to handle the Thread, since many of the obvious Thread methods are deprecated due to deadlock problems. I won't go into the details here because this is a Java issue, not specific to Android, but read the article Why Are Thread.stop, Thread.suspend, Thread.resume and Runtime.runFinalizersOnExit Deprecated? for more information. A sketch of the flag-based alternative is below.
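
Here is a minimal sketch of the doStart()/doStop() and doPause()/doResume() pairing using flags instead of the deprecated Thread methods; the class and method names are mine, not the ones in TemplateSurfaceView.zip.

// A sketch of a pausable, stoppable update thread. The run() loop checks
// flags rather than relying on Thread.stop()/suspend()/resume().
public class UpdateThread extends Thread {
    private volatile boolean mRunning = true;
    private boolean mPaused = false;
    private final Object mPauseLock = new Object();

    @Override
    public void run() {
        while (mRunning) {
            synchronized (mPauseLock) {
                while (mPaused && mRunning) {
                    try {
                        mPauseLock.wait();   // doPause() parks the loop here
                    } catch (InterruptedException e) {
                        // fall through and re-check the flags
                    }
                }
            }
            // ... draw one frame to the SurfaceView here ...
        }
    }

    public void doPause() {
        synchronized (mPauseLock) { mPaused = true; }
    }

    public void doResume() {
        synchronized (mPauseLock) {
            mPaused = false;
            mPauseLock.notifyAll();
        }
    }

    public void doStop() {
        mRunning = false;
        doResume();                  // wake the thread so it can exit
        try {
            join();                  // wait for run() to finish
        } catch (InterruptedException e) {
            // ignore; the thread is shutting down anyway
        }
    }
}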

Introduction

Actually a placeholder for an introduction. I have an Android article I want to post but don't want it to be the first blog entry.
