Thesis

OnObject: Programming Physical Objects for Gestural Interaction

Chung, K. "OnObject: Programming Physical Objects for Gestural Interaction"

Abstract

Tangible User Interfaces (TUIs) have fueled our imagination about the future of computational user experience by coupling physical objects and activities with digital information. Despite their conceptual popularity, TUIs are still difficult and time-consuming to construct, requiring custom hardware assembly and software programming by skilled individuals. This limitation makes it impossible for end users and designers to interactively build TUIs that suit their context or embody their creative expression. 

OnObject enables novice end users to turn everyday objects into gestural interfaces through the simple act of tagging. Wearing a sensing device, a user adds a behavior to a tagged object by grabbing the object, demonstrating a trigger gesture, and specifying a desired response. Following this simple Tag-Gesture-Response programming grammar, novice end users are able to transform mundane objects into gestural interfaces in 30 seconds or less. Instead of being exposed to low-level development tasks, users can focus on creating an enjoyable mapping between gestures and media responses. The design of OnObject introduces a novel class of Human-Computer Interaction (HCI): gestural programming of situated physical objects.
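To make the Tag-Gesture-Response grammar concrete, the following Python sketch models how a binding between a tagged object, a trigger gesture, and a media response might be stored and dispatched. All names here (GestureEventServer, program, on_gesture, the tag and gesture strings) are illustrative assumptions for exposition, not the thesis's actual implementation or API.

    # Hypothetical sketch of the Tag-Gesture-Response grammar.
    # Names are illustrative, not the actual OnObject API.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, Tuple

    @dataclass
    class GestureEventServer:
        """Maps (tag id, gesture) pairs to media responses."""
        bindings: Dict[Tuple[str, str], Callable[[], None]] = field(default_factory=dict)

        def program(self, tag_id: str, gesture: str, response: Callable[[], None]) -> None:
            # Tag-Gesture-Response: bind a demonstrated trigger gesture
            # on a tagged object to the user's chosen response.
            self.bindings[(tag_id, gesture)] = response

        def on_gesture(self, tag_id: str, gesture: str) -> None:
            # Invoked when the wearable sensor recognizes a gesture
            # while the user is holding the tagged object.
            handler = self.bindings.get((tag_id, gesture))
            if handler:
                handler()

    server = GestureEventServer()
    # e.g. shaking a tagged plush toy plays a laugh sound
    server.program("toy-01", "shake", lambda: print("play laugh.wav"))
    server.on_gesture("toy-01", "shake")  # -> play laugh.wav

In this reading, "programming" an object is nothing more than adding one entry to the binding table, which is consistent with the abstract's claim that end users can complete the task in under 30 seconds.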

This thesis first outlines the research challenge and the proposed solution. It then surveys related work to identify inspirations and points of differentiation from existing HCI and design research. Next, it describes the sensing and programming hardware and the gesture event server architecture. Finally, it introduces a set of applications created with OnObject and reports observations from user participation sessions.