Using the accelerometer to control planar transforms on Windows Phone 7

Lately I’ve been looking into some Augmented Reality uses on Windows Phone 7, and one of the first things you need to do for this is to use the sensors to control what’s displayed on the phone screen.

So the first step for me was to better understand how the accelerometer interacted with the phone’s orientation. I wanted to create a simple “3D plane” that always looked like it was “level”. This can be accomplished with a Border and a PlaneProjection. The only question is: how do I transform the accelerometer values into the RotationX/Y/Z rotation angles on the PlaneProjection?

To make matters worse, depending on whether your phone is currently in Portrait, LandscapeLeft or LandscapeRight mode, the screen coordinate system, and thus the transform, changes, even though the accelerometer reports its values independent of screen orientation. So to get a proper transform, we also need to know which way the screen coordinate system is oriented relative to the accelerometer coordinate system.

With a little trigonometry it wasn’t too hard to figure out, so here’s the code for this little app.

First, the Border we use as the “level” plane:

<Border x:Name="plane" BorderBrush="Green" BorderThickness="2"
        Width="350" Height="350" >
    <Border.Projection>
        <PlaneProjection x:Name="proj"  />
    </Border.Projection>
</Border>

Next, create an accelerometer instance, and call “UpdateRotation” every time we get a reading:

var meter = new Accelerometer();
meter.ReadingChanged += meter_ReadingChanged;
...
private void meter_ReadingChanged(object sender, AccelerometerReadingEventArgs e)
{
    //Sensor events arrive on a background thread, so marshal to the UI thread
    Dispatcher.BeginInvoke(() =>
    {
        UpdateRotation(e.X, e.Y, e.Z);
    });
}
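The snippet above elides the rest of the setup, but note that the sensor doesn’t raise any events until it has been started. Here’s a minimal sketch of how the constructor wiring might look, assuming the Accelerometer and AccelerometerFailedException types from Microsoft.Devices.Sensors (the constructor placement is just one reasonable choice):

public MainPage()
{
    InitializeComponent();
    var meter = new Accelerometer();
    meter.ReadingChanged += meter_ReadingChanged;
    try
    {
        //Readings are only delivered after the sensor has been started
        meter.Start();
    }
    catch (AccelerometerFailedException)
    {
        //No accelerometer available (for instance in the emulator); nothing to update
    }
}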

Lastly, update the PlaneProjection parameters, based on the rotation (this is where the “real meat” is):

private void UpdateRotation(double x, double y, double z)
{
    //Offset is in half-turns: .5 becomes 90° and 1.5 becomes 270°
    //once multiplied by 180 below
    var offset = 0d;
    //Adjust for screen orientation when in landscape mode
    //PortraitUp mode doesn't need an offset
    if (Orientation == PageOrientation.LandscapeLeft) offset = .5;
    else if (Orientation == PageOrientation.LandscapeRight) offset = 1.5;

    //Tilt around X comes from the gravity component along Z,
    //and the rotation around Z from the X/Y components
    var rx = (Math.Acos(z) / Math.PI + 1) * 180;
    var rz = (offset - Math.Atan2(x, y) / Math.PI + 1) * 180;
    if (!double.IsNaN(rz)) proj.RotationZ = rz;
    if (!double.IsNaN(rx)) proj.RotationX = rx;
}
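One optional addition (not part of the original sample): since UpdateRotation reads the page’s Orientation property, you can re-apply the transform as soon as the orientation flips instead of waiting for the next sensor reading. A sketch, assuming the last reading is cached in hypothetical lastX/lastY/lastZ fields:

//Re-apply the transform immediately when the page orientation changes
protected override void OnOrientationChanged(OrientationChangedEventArgs e)
{
    base.OnOrientationChanged(e);
    //lastX/lastY/lastZ are hypothetical fields holding the most recent reading
    UpdateRotation(lastX, lastY, lastZ);
}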

And lastly, here’s what this looks like (note how the screen orientation flips when the phone is rotated, but the plane stays in the same place):

Note that this sample doesn’t deal with calibration. I suggest you look at this blog post on how to do that; it also includes a lot of information on the accelerometer in general, as well as a discussion of filtering the noisy data, which is important for a smooth-looking result. The sample code you can download below includes a simple low-pass filter to make it feel a lot smoother.
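The filter doesn’t have to be fancy. Here’s a minimal sketch of an exponential low-pass filter of the kind you could drop in between the sensor reading and UpdateRotation (the field names and the smoothing constant are just placeholders):

private double filteredX, filteredY, filteredZ;
private const double Alpha = 0.1; //Smaller values = smoother but more lag

private void ApplyReading(double x, double y, double z)
{
    //Blend each new reading with the previous filtered value
    filteredX += Alpha * (x - filteredX);
    filteredY += Alpha * (y - filteredY);
    filteredZ += Alpha * (z - filteredZ);
    UpdateRotation(filteredX, filteredY, filteredZ);
}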

Download source code

Finally, here’s a preview of a little Augmented Reality app I’ve been working on that uses this approach to overlay the camera feed with azimuth values:

ESRI DevSummit 2011 – over'n'out

The ESRI Developer Summit is over, and I had quite a blast. We got tons of great feedback and tough questions from all the developers attending. Thank you for that! We also had some well-attended sessions on Windows Phone, Silverlight and WPF. When the recordings go live, I’ll update this blog post.

For the first time, I even got to go on stage during the plenary and make a fool of myself. You might already have read about my ventures into programming with a Kinect, but I got to show a far more polished version during the plenary session, and luckily everything worked out great. This version (although not demoed due to time constraints) actually supports multiple users interacting with the map at the same time (at one point we had four different people drawing on the map simultaneously). For those interested, this was based on the OpenNI drivers and the ArcGIS API for WPF, which is a different approach than I originally blogged about. Here’s a recording of my 2 minutes of “fame”:

Me on stage making a fool of myself

I have to admit I was a little nervous this would break down during the demo, since it was all based on software that wasn’t even in beta yet. We later had it running at our showcase area 24/7, and I didn’t have to restart the app for almost three days with lots of users playing with it, so perhaps it was more stable than I had feared. So go play with those OpenNI drivers! They might be called “unstable”, but I found them to be very stable.

My little A-to-B app also got a nice mention during the plenary. I wrote it to learn some more phone application practices, exercise the phone API we’ve been working on for a while, and simply to have some coding fun. If you have a Windows Phone 7, go download it now; if not, here’s Dave giving a quick tour of the app.

A-to-B WinPhone7 Application demoed

Thank you to everyone who attended, and especially to those who gave us feedback.