Project

Goal-Oriented Interfaces for Mobile Phones

Currently each app lives in its own little world, with its own interface. Apps are usually unable to communicate with each other and unable to cooperate to meet users' needs. This project intends to enable end-users to "program" their phones using natural language and speech recognition to perform complex tasks. A user, for example, could say: "Send the song I play most often to Bill." The phone should work out that the MP3 app holds songs and can order them by play frequency, how to look up Bill's contact information, and how to send him a file. We use state-of-the-art natural language understanding, common-sense reasoning, and a partial-order planner.
