FlexiGesture

This project aims to create an electronic controller with many different sensor affordances that adapts to the gestural preferences of the user over the course of interaction. Our research includes an electronic music interface; unlike existing music controllers and most current multimodal interfaces, it allows users to train the system to recognize their own personalized gestures and to establish the mappings from those gestures to sound. This approach turns the paradigm of musical-instrument design on its head, giving the device the ability to adapt to the player. Viewed as a data-collection platform used by many subjects, the controller also offers a powerful vantage point from which to study universal patterns in how people associate gesture and sound. Beyond breaking ground in electronic musical instrument design, this work investigates important and topical issues in learning systems for multimodal and adaptive user interfaces.
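
To make the train-and-map idea concrete, here is a minimal sketch of a user-trainable gesture-to-sound mapping. It assumes nothing about the actual FlexiGesture implementation: the GestureMapper class, the feature vectors, and the sound actions are all hypothetical, and a simple nearest-neighbor rule stands in for whatever learning method the real system uses.

# Hypothetical sketch: user-trainable gesture-to-sound mapping.
# The class name, features, and sound actions are invented for
# illustration and do not describe the actual FlexiGesture pipeline.
import numpy as np

class GestureMapper:
    """Nearest-neighbor gesture recognizer with user-defined sound mappings."""

    def __init__(self):
        self.examples = []   # list of (feature_vector, gesture_name)
        self.sound_map = {}  # gesture_name -> sound action label

    def train(self, features, gesture_name):
        """Store one user-demonstrated example of a gesture."""
        self.examples.append((np.asarray(features, dtype=float), gesture_name))

    def bind_sound(self, gesture_name, sound_action):
        """Let the user choose which sound a gesture triggers."""
        self.sound_map[gesture_name] = sound_action

    def recognize(self, features):
        """Classify incoming sensor features by the nearest stored example."""
        x = np.asarray(features, dtype=float)
        dists = [(np.linalg.norm(x - ex), name) for ex, name in self.examples]
        return min(dists)[1]

    def perform(self, features):
        """Map a live gesture to its user-assigned sound action."""
        return self.sound_map[self.recognize(features)]

# Example: two personalized gestures bound to two sounds.
mapper = GestureMapper()
mapper.train([0.9, 0.1, 0.0], "shake")
mapper.bind_sound("shake", "play drum hit")
mapper.train([0.0, 0.2, 0.8], "twist")
mapper.bind_sound("twist", "bend pitch")
print(mapper.perform([0.85, 0.15, 0.05]))  # -> "play drum hit"

The key design point the sketch illustrates is the inversion described above: the gesture vocabulary and the gesture-to-sound bindings are supplied by the player at run time rather than fixed by the instrument's designer.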