This page is currently out of date. Please refer to this talk (PPT) for more recent information.

3X Inertial Measurement Unit Project


The 3X embedded in a bread roll for (void*).


Overview

This project examines the uses of inertial measurement units for gesture sensing and user feedback applications.  The overall goal is to create a class of devices that have some sense of their own movement and can give feedback to a user given certain patterns.  Imagine shoes that beep when you are overrotating, golf clubs that yell "Slice!" when you are about to slice, or even juggling balls that can teach you how to juggle.

To reach this goal, we require not only sensors but a framework for interpreting their data and simple effective feedback channels. The work to date has been to design and build a compact inertial measurement unit (see above) as a development platform.  My thesis work will concentrate on creating a general framework for manipulating data, recognizing gestures, and providing feedback, with the goal of implementing Seymour Papert's vision of a set of juggling balls that can teach you how to juggle. Doing this processing on board the device as opposed to on a desktop computer is a goal for future work.
 

Motivation

This project builds on the Expressive Footwear project, in which we instrumented a shoe with a number of inertial and pressure sensors (among others).  We have reduced the sensor set to accelerometers and gyroscopes, and have expanded it to three dimensions, allowing us to measure orientation with very good accuracy and position with reasonable accuracy in three-space. The system is wireless, allowing it to be used in a greater variety of situations.

In hopes of moving beyond ad hoc gesture recognition methods, we are building a generalized framework for filtering IMU data and using it for gesture recognition.  The initial design will be based on a Kalman filter for data filtering and a modified hidden Markov model scheme for interpretation.  The goal is that, regardless of the implementation, a user should be able to build an appropriate analysis system with reasonable ease (hopefully in a scripting language) and in a reasonable time.
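To give the flavor of the filtering stage (this is a sketch, not the actual design, and the noise variances are illustrative assumptions), a scalar Kalman filter smoothing a single sensor channel looks like the following:

```python
# Minimal scalar Kalman filter: estimate a slowly varying quantity
# (e.g. one gyroscope axis) from noisy readings. The process and
# measurement noise variances below are illustrative assumptions.

def kalman_step(x, p, z, q=1e-3, r=1e-1):
    """One predict/update cycle for a random-walk state model.

    x, p : prior state estimate and its variance
    z    : new (noisy) measurement
    q, r : process and measurement noise variances (assumed)
    """
    # Predict: a random-walk model, so only the variance grows.
    p = p + q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

# Filter a short synthetic trace (true value 1.0, noisy samples).
readings = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95]
x, p = 0.0, 1.0
for z in readings:
    x, p = kalman_step(x, p, z)
```

The real system state would be multidimensional (orientation, rates, biases), but the predict/update structure is the same.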

We will also examine feedback schemes for this device.  Given its size and potential applications, the initial assumption is that we will be limited to simple sounds and/or flashing lights, making it non-trivial to convey a reasonable range of information to the user.  We require that the feedback mechanisms be on the interface itself, in hopes of narrowing the gap between an event and the corresponding feedback, which will contribute greatly to the success of reinforcement learning schemes.
 

The 3X Hardware



Hardware block diagram

The hardware goals for this project are small size, low cost, wireless operation, and full access to the underlying microcontroller and software.

While there are many three-axis inertial measurement units on the market, most of them fail to meet these requirements.  Many, such as the Crossbow Technologies DMU-6X, are both far too large (10 in3) and far too expensive (US$2500) for our purposes.  Systems such as the Ascension Bird (which gives position and orientation instead of acceleration and rotation) do meet both our size and cost requirements.  However, the closed-loop nature of its sensing scheme requires a wired link, which is unacceptable.  Also, most commercial systems do not allow user access to the underlying microcontrollers and software, which is of benefit to our research.

Therefore, we decided to construct our own system.  The final design, shown at the top, is a cube 1.25” on a side (approximately the size of a half-dollar).  Note that the hardware is small enough that it may be easily adapted to a multitude of interfaces. The block diagram for this hardware can be seen above. Two sides of the cube contain the inertial sensors. Rotation is measured with single-axis Murata ENC03J piezoelectric gyroscopes, and acceleration with two-axis Analog Devices ADXL202 MEMS accelerometers.  The sensor data are fed to an Analog Devices ADuC812 microcontroller (on the third side), which combines a 12-bit analog-to-digital converter (ADC) with an 8051 microprocessor core.  Gyroscope data are collected using the ADC, while the accelerometer data are collected via timing measurements. The raw sensor values are transmitted using an RF Monolithics module to a separate base station, which is then connected to the gesture recognition hardware via a serial link.  A separate frequency channel is used for each of the left and right buns (the bread-roll interfaces described below), at 916.5 and 315.0 MHz respectively.  Note that the microprocessor can be programmed in place, allowing for easy development.
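The timing measurement works because the ADXL202 encodes acceleration in the duty cycle of a square-wave output: per its datasheet, 0 g corresponds to a 50% duty cycle, with a sensitivity of 12.5% duty cycle per g. A minimal decoding sketch (the timer counts here are made up for illustration):

```python
# Decode an ADXL202 duty-cycle output into acceleration.
# Datasheet scaling: 50% duty cycle at 0 g, 12.5% duty cycle per g.

def adxl202_accel_g(t1, t2):
    """t1: output high time, t2: full period (same units, e.g. timer counts)."""
    duty = t1 / t2
    return (duty - 0.5) / 0.125

# Example: a 55% duty cycle corresponds to about +0.4 g.
a = adxl202_accel_g(550, 1000)
```

On the actual hardware this division would be done from raw 8051 timer captures, but the scaling is the same.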

In addition, there is a single wide (1” square) electrode and driving circuitry provided for external, non-contact proximity signaling, in the same vein as the Personal Area Network.  A kHz-range square wave driven onto this electrode will capacitively couple into a nearby receiving electrode, with the magnitude of the received signal depending on the inverse of the separation.  Since the frequency of the driving signal is known, the received signal can be bandpass filtered to achieve very high signal-to-noise ratios.  A receiving board, with eight electrode inputs and two different bandpass filters, was constructed and connects to the gesture recognition computer via a serial link. Since the buns operate as a pair, only one of the two requires a transmitting electrode.
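This narrowband recovery can be sketched as a simple lock-in (synchronous) detector: correlate the received samples against a reference at the known drive frequency and average, which rejects broadband noise. The rates and amplitudes below are illustrative, and a sine stands in for the square-wave drive (a square wave would simply contribute its fundamental plus odd harmonics):

```python
# Sketch of lock-in style detection of a known-frequency signal
# buried in noise, as in the capacitive signaling channel.
import math
import random

def lockin_amplitude(samples, fs, f0):
    """Estimate the amplitude of the f0 component of `samples` (rate fs)."""
    n = len(samples)
    i = sum(s * math.cos(2 * math.pi * f0 * k / fs) for k, s in enumerate(samples))
    q = sum(s * math.sin(2 * math.pi * f0 * k / fs) for k, s in enumerate(samples))
    return 2 * math.hypot(i, q) / n

fs, f0, amp = 10000.0, 1000.0, 0.5   # sample rate, drive frequency, true amplitude
random.seed(0)
sig = [amp * math.sin(2 * math.pi * f0 * k / fs) + random.gauss(0, 0.2)
       for k in range(2000)]
est = lockin_amplitude(sig, fs, f0)  # recovers roughly 0.5 despite the noise
```

The hardware does this with analog bandpass filters rather than in software, but the principle is identical: energy away from the drive frequency is rejected.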

The complete system draws 23 mA while operational, and runs for about 50 hours on two Lithium Manganese batteries (from Tadiran Inc.) placed in parallel.  These batteries are also small enough to fit inside the cube formed by the hardware. Additionally, a power monitor on the microcontroller watches for battery failure.
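As a sanity check on these figures, the implied battery capacity is straightforward to work out:

```python
# The 23 mA load over a 50 hour life implies ~1150 mAh of combined
# capacity; two cells in parallel share the load, so each supplies
# about half of that.
current_ma = 23.0
runtime_h = 50.0
total_mah = current_ma * runtime_h   # combined capacity drawn
per_cell_mah = total_mah / 2         # two cells in parallel
print(total_mah, per_cell_mah)       # prints 1150.0 575.0
```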

The net cost of the system, in prototype quantities, is approximately US$300.
 

Current Uses

The hardware is currently in use in the Synthetic Characters' (void*) project.  The 3X was embedded in a set of bread rolls with forks stuck in the ends to mimic legs (as in Charlie Chaplin's The Gold Rush).  This interface is used to control the legs of one of three archetypal characters in a virtual world.  The user can perform a number of gestures (kicks, splits, twists, etc.) that are detected by the hardware and then filtered through a gesture recognition system.  Further, using the PAN system, the user can select which of the three characters to control simply by placing the buns and forks down on an appropriate plate (with a receive electrode underneath) that is part of the physical set of the demonstration.
 

Current Direction

The next goal we are working toward is a flexible, general Kalman filter for this hardware.  Issues include data representation (e.g. quaternions vs. Euler angles) and the inclusion or exclusion of a variety of scaling factors.  A demo of the filtering in action should be posted soon.
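One argument in the representation trade-off is that integrating body rates with a unit quaternion avoids the gimbal-lock singularities of Euler angles. A minimal first-order integration sketch (not the actual filter implementation; step sizes and rates are illustrative):

```python
# Integrate gyroscope body rates with a unit quaternion state.
# q = (w, x, y, z); wx, wy, wz are body rates in rad/s; dt in seconds.
import math

def quat_integrate(q, wx, wy, wz, dt):
    w, x, y, z = q
    # Quaternion kinematics: dq/dt = 0.5 * q ⊗ (0, wx, wy, wz)
    dw = 0.5 * (-x * wx - y * wy - z * wz)
    dx = 0.5 * ( w * wx + y * wz - z * wy)
    dy = 0.5 * ( w * wy - x * wz + z * wx)
    dz = 0.5 * ( w * wz + x * wy - y * wx)
    w, x, y, z = w + dw * dt, x + dx * dt, y + dy * dt, z + dz * dt
    n = math.sqrt(w * w + x * x + y * y + z * z)   # renormalize to unit length
    return (w / n, x / n, y / n, z / n)

# Rotating at pi/2 rad/s about z for 1 s (in small steps) should yield
# a 90 degree yaw: q close to (cos 45deg, 0, 0, sin 45deg).
q = (1.0, 0.0, 0.0, 0.0)
for _ in range(1000):
    q = quat_integrate(q, 0.0, 0.0, math.pi / 2, 0.001)
```

No sequence of rates can drive this representation into a singularity, at the cost of carrying a four-parameter state with a unit-norm constraint.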