Physical Situation-Aware Language Generation

This project demonstrates a language generation system that is aware of the physical location of the human listener. As the listener moves around, a vision system tracks the listener's location and uses this information to contextualize descriptions of objects in the environment. This work may be applied to situated communication problems such as in-car navigation systems and wearable computers.
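The core idea, generating object descriptions that depend on where the listener is, can be sketched minimally. The code below is an illustrative assumption, not the project's actual implementation: it assumes the vision system supplies a 2D listener position and heading, computes an object's bearing relative to that heading, and maps it to a spatial phrase.

```python
import math

def relative_direction(listener_pos, listener_heading_deg, obj_pos):
    """Map an object's bearing, relative to the listener's heading, to a phrase.

    Assumes standard math coordinates (x right, y up, angles counterclockwise),
    with heading 0 meaning the listener faces the +x direction.
    """
    dx = obj_pos[0] - listener_pos[0]
    dy = obj_pos[1] - listener_pos[1]
    bearing = math.degrees(math.atan2(dy, dx)) - listener_heading_deg
    bearing = (bearing + 180) % 360 - 180  # normalize to (-180, 180]
    if -45 <= bearing <= 45:
        return "ahead of you"
    if 45 < bearing <= 135:
        return "to your left"
    if -135 <= bearing < -45:
        return "to your right"
    return "behind you"

def describe(obj_name, listener_pos, listener_heading_deg, obj_pos):
    """Generate a listener-relative description of an object (hypothetical helper)."""
    phrase = relative_direction(listener_pos, listener_heading_deg, obj_pos)
    return f"The {obj_name} is {phrase}."

# As the listener moves or turns, the same object is described differently:
print(describe("printer", (0, 0), 0, (2, 0)))   # facing the printer
print(describe("printer", (0, 0), 90, (2, 0)))  # turned 90 degrees left
```

In a full system, the listener's position and heading would come from the vision tracker rather than being passed in by hand, and the phrase inventory would be richer (distances, landmarks, occlusion), but the contextualization step reduces to this kind of listener-relative geometry.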