
Inexpensive "Giveaway" Sensors for Large-Crowd Interaction


300 sensor packages used in our tests


Block diagram of interactive music system

One of the major challenges in interactive media is to build efficient and causal interactive environments through which large numbers of people can participate simultaneously. Accordingly, we have developed a system of very low-cost, wireless, wearable sensors that enables a large group of people (tens to hundreds to thousands) to participate in an interactive musical performance. Unlike previous vision-based approaches (e.g., Loren Carpenter's Cinematrix), this system does not require a clear line of sight.

The sensors themselves are straightforward circuits built around simple piezoelectric accelerometers (a PVDF cantilever weighted with a proof mass) that detect sharp changes in acceleration, upon which they transmit a narrow 50 ms RF pulse (a 100 ms deadtimer prevents multipulsing). The only continually powered active component is a dual CMOS monostable multivibrator, so a simple coin-cell battery lasts for many years of normal use (e.g., several hours each week). Because the circuitry is so minimal, the devices can be fabricated for much less than $1 in large quantities. The sensor circuit is encapsulated in a Lucite tube so that it can be easily and robustly held in the hand.

Although one can distinguish between different families of sensors (e.g., fastened to the feet or hands) by using different carrier frequencies, we do not independently identify each performer (the resulting pulse trains would increase the likelihood of collision or require a potentially expensive synchronization protocol). Instead, we extract statistics from the sensor signals that measure and react to the characteristics of ensemble behavior; because the pulses are so narrow, the probability of overlap within the roughly 10-meter reception range is slight, even when people are trying to synchronize. This limited RF range means that several receivers are required to instrument a large area; a side benefit, however, is that sensor locations can accordingly be zoned, creating areas of local interaction.
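
To make the firing rule concrete, here is a minimal Python sketch of the threshold-and-one-shot behavior described above. The 50 ms pulse width and 100 ms deadtime come from the circuit; the 2 g threshold and the sampled-data framing are illustrative assumptions (the real sensor does this entirely in analog hardware, with no microcontroller).

    # Parameters taken from the circuit description above; the trigger
    # threshold is an illustrative guess (the actual trip point is set
    # by the analog front end, not by software).
    PULSE_S = 0.050     # RF pulse width
    DEADTIME_S = 0.100  # deadtimer that prevents multipulsing

    def pulse_times(accel_samples, sample_rate_hz, threshold_g=2.0):
        # Return the times at which the sensor would transmit a pulse:
        # a sharp acceleration fires the one-shot, and retriggering is
        # blanked until the pulse plus deadtime have elapsed.
        times, next_allowed = [], 0.0
        for i, a in enumerate(accel_samples):
            t = i / sample_rate_hz
            if abs(a) > threshold_g and t >= next_allowed:
                times.append(t)
                next_allowed = t + PULSE_S + DEADTIME_S
        return times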

We have built a few hundred of these sensors (see photo above) and developed algorithms that use this system to explore techniques for mapping large-group, real-time musical interaction (see diagram above). Our current software extracts several dynamic parameters from the incoming stream of sensor data, including mean activity (counting pulses arriving across several time scales), average tempo (from a Fourier transform or cross-correlation on the lowpass-filtered pulse density), and the timing of significant events (indicated by many hit signals arriving within a short period). The musical mappings that we have written synchronize to the tempo data (perhaps nudging it upward by producing music with a slightly faster tempo than what is detected) and exhibit a musical complexity that depends on the density of pulses, hence the amount of perceived activity (i.e., as people dance more, the music gets wilder).
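
As a sketch of how these three features might be computed, the following Python fragment operates on a list of received pulse timestamps. The bin width, smoothing length, and the 1-4 Hz tempo search band are assumed values for illustration; they are not the parameters of the deployed software.

    import numpy as np

    def pulse_density(pulse_times, duration_s, bin_s=0.05):
        # Histogram the raw pulse timestamps into a uniformly sampled
        # density signal (counts per bin) and return its sample rate.
        n_bins = int(duration_s / bin_s)
        density, _ = np.histogram(pulse_times, bins=n_bins, range=(0.0, duration_s))
        return density.astype(float), 1.0 / bin_s

    def mean_activity(density, fs, window_s=5.0):
        # Pulses per second, averaged over a sliding window; running this
        # at several window lengths gives the multiple time scales
        # mentioned above.
        w = int(window_s * fs)
        return np.convolve(density, np.ones(w) / window_s, mode="same")

    def tempo_bpm(density, fs, band_hz=(1.0, 4.0)):
        # Dominant tempo from an FFT of the lowpass-filtered pulse density.
        smoothed = np.convolve(density, np.ones(4) / 4.0, mode="same")  # crude lowpass
        spectrum = np.abs(np.fft.rfft(smoothed - smoothed.mean()))
        freqs = np.fft.rfftfreq(len(smoothed), d=1.0 / fs)
        band = (freqs >= band_hz[0]) & (freqs <= band_hz[1])
        return 60.0 * freqs[band][np.argmax(spectrum[band])]

    def significant_events(density, fs, window_s=0.2, count_threshold=10):
        # Flag moments when many hits arrive within a short period.
        w = max(1, int(window_s * fs))
        counts = np.convolve(density, np.ones(w), mode="same")
        return counts >= count_threshold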

We have run this system at several MIT dance events. The received data is seen to strongly reflect the activity and state of the participants. For example, this figure shows FFT results for people dancing to nonrhythmic and rhythmic music; although there is no peak in the former, one shows up clearly in the latter, indicating that the dancers are synchronizing to a definite tempo corresponding to the BPM (beats per minute) of the music being played. If one observes the activity and tempo plots across an hour of dance (with about 20 participants), structure is clearly seen: the dancers go through cycles of increased and decreased activity, and they push the tempo higher when the generated tempo is set above the measured tempo.
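
The tempo-nudging rule implied by this feedback can be sketched in a few lines; the 3 BPM nudge and the clamping range below are illustrative assumptions, not the system's actual settings.

    def nudged_tempo(measured_bpm, nudge_bpm=3.0, lo_bpm=80.0, hi_bpm=160.0):
        # Drive the sequencer slightly faster than the crowd's measured
        # tempo so the dancers entrain upward, clamped to a sane range.
        return min(hi_bpm, max(lo_bpm, measured_bpm + nudge_bpm))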

Although this system was designed for interactive entertainment applications, it has relevance to other fields (e.g., intelligent homes, monitoring the integrity of shipping packages, or scattering sensors to define security perimeters) where ubiquitous, simple, "featherweight" wireless sensors can be leveraged.

Quicktime video clips showing the system in action:

Video 1 (2.8 meg): Very simple demonstration: a single sensor creates a direct beat sound

Video 2 (4.2 meg): Conductor demonstration: a single sensor controls a musical stream. As the tempo and density of hits pick up, the generated music increases in complexity.

Video 3 (5.9 meg): Interactive dance event at MIT during September 2002. The first seconds show people simply keeping a beat while holding the sensors; a simple percussive sound is fired with each hit, hence one essentially hears the direct sensor data. Next, dancers are shown responding to the automatically generated music. In this segment, they are riding up the complexity curve; as they dance more, additional voices are added to the music, causing them to dance more, and so on.

Video 4 (3.8 meg): The interactive dance program responding at its highest state of activity.

Video 5 (4.3 meg): The interactive dance program responding at its lowest state of activity.

Video 6 (4.8 meg): Simple demos of the sensor packages being used at CAMP, a project for schoolchildren in Kyoto, Japan, during the summer of 2002. George Lewis and a colleague perform with the sensors at first; the sensors are then given to children to explore. Here, the sensors were tuned to one of four different transmit frequencies. Placing four correspondingly tuned base stations in the room enabled each family of sensors to be separately identified, allowing functional differentiation (each family makes a different sound), as in the sketch below.
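
A minimal sketch of this per-family dispatch, assuming the four base stations report on channels 0-3 and using hypothetical sound names:

    # Each base station is tuned to one family's carrier frequency, so
    # the receiver channel number identifies the family of the sensor
    # that fired; every pulse then triggers that family's sound.
    FAMILY_SOUNDS = {0: "drum", 1: "shaker", 2: "bell", 3: "chord"}

    def on_pulse(receiver_id):
        # Called whenever the base station for family `receiver_id`
        # (0-3) hears a pulse.
        trigger_sound(FAMILY_SOUNDS[receiver_id])

    def trigger_sound(name):
        print(f"play {name}")  # stand-in for a real synthesizer call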

Relevant Papers and Reports:

Comprehensive paper giving details about this system and the musical mappings:

An Interactive Music Environment for Large Groups with Giveaway Wireless Motion Sensors
Feldmeier, M. and Paradiso, J.A.
Computer Music Journal, Vol. 31, No. 1 (Spring 2007), pp. 50-67.

Paper presenting an early system at the Ubicomp 2001 Workshop on Games:

Ultra-Low-Cost Wireless Motion Sensors for Musical Interaction with Very Large Groups
Paradiso, J.A. and Feldmeier, M.
Presented at the UBICOMP 2001 Workshop on Designing Ubiquitous Computing Games, ACM UBICOMP Conference Proceedings, Atlanta, GA, September 2001.

Paper describing the system with very early results at the ICMC 2002:

Large Group Musical Interaction using Disposable Wireless Motion Sensors
Feldmeier, M., Malinowski, M., and Paradiso, J.A.
In the Proceedings of the ICMC 2002 Conference, International Computer Music Association, San Francisco, CA, pp. 83-87, September 2002.

A short paper on this system presented at CHI 2004:

Giveaway Wireless Sensors for Large-Group Interaction
Feldmeier, M. and Paradiso, J.A.
In the Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 2004), Extended Abstracts, Vienna, Austria, April 27-29, 2004, pp. 1291-1292.

Mark Feldmeier's Media Lab MS Thesis:

Mark C. Feldmeier -- Large Group Musical Interaction using Disposable Wireless Motion Sensors (pdf) October 2002.

Josh Randall's AUP project on a lighting controller to use with this system:

Joshua Randall -- Real-time Lighting System for Large Group Interaction (pdf) May 2002.

Acknowledgements:

Thanks to Measurement Specialties (MSI) for donating the piezo sensor elements used in this project.


 


Mark Feldmeier


Joe Paradiso

Mat Malinowski

Michael Broxton