Context-Aware Computing

(or, why context needs wearables and wearables need context)

Bradley Rhodes

Most desktop computer applications have an explicit user interface that expects you to specify exactly what you want the computer to do. Wearable computers will certainly be able to run standard desktop applications like word processing, spreadsheets, and databases, but to expect these to be the primary applications for wearables is to ignore the vast potential for wearables to be more than simply highly portable computers. In short, it is making the same mistake people made when they looked at the first PCs from the perspective of mainframe computers and assumed they would be used to keep recipe databases in the kitchen.

Unlike desktop computers, wearable computers have the potential to "see" as the user sees, "hear" as the user hears, and experience the life of the user in a "first-person" sense. They can sense the user's physical environment far more completely than previously possible, and in many more situations. This makes them excellent platforms for applications where the computer works even when you aren't giving explicit commands. Health monitors, communications systems, just-in-time information systems, and applications that control real-world devices for you are all examples of these contextually aware / agent applications.

Wearables also need these new kinds of applications more than desktop computers do. When sitting at a desktop computer, you can expect your user to be interacting with the screen directly; working with the computer is the user's primary task. With wearables, most of the time the user is doing something besides interacting with the computer. They might be crossing the street, engaged in conversation, or fixing a Boeing 777 jet engine. In most cases the wearable is there in a support role at best, and may even be an active distraction from the user's primary task. In these situations the computer can't rely on the user to tell it everything to do, so it needs information from the wearer's environment. For example, imagine an interface that is aware of the user's location: while the user is riding the subway, the system might alert him with a spoken summary of an e-mail. During a conversation, however, the wearable might present the name of a potential caller unobtrusively in the user's head-up display, or simply forward the call to voicemail.
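The subway and conversation examples amount to a policy that maps sensed context to a delivery channel. As a minimal sketch only (the `Context` fields, channel names, and routing rules here are illustrative assumptions, not any particular system's design), such a policy might look like:

```python
# Hypothetical sketch: routing notifications by the wearer's sensed context.
from dataclasses import dataclass

@dataclass
class Context:
    location: str          # e.g. "subway", "office" (assumed sensor output)
    in_conversation: bool  # e.g. inferred from a microphone

def route_notification(ctx: Context, kind: str) -> str:
    """Pick a delivery channel for a notification of the given kind."""
    if ctx.in_conversation:
        # Don't interrupt audibly: show a caller's name on the head-up
        # display, and defer anything else (e.g. the call) to voicemail.
        return "hud" if kind == "caller-id" else "voicemail"
    if ctx.location == "subway":
        # Eyes and hands may be busy, so speak a summary aloud.
        return "spoken-summary"
    # Default: ordinary on-screen delivery.
    return "screen"
```

For instance, `route_notification(Context("subway", False), "email")` would choose the spoken summary, while the same e-mail arriving mid-conversation would be deferred. A real system would of course need far richer context sensing than these two fields.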

MIT Papers on Context Awareness in Wearable Computing

Below is a list of papers and projects here at the MIT Media Lab on contextually aware applications for wearable computers. Readers are invited to contact the authors of individual papers for more information.