Present-day multi-user graphical role-playing games provide a rich interaction environment that includes rooms and exterior areas, everyday objects such as chairs, doors and chests, possessions, character traits and other players' avatars. All of these can be acted upon by a player, whether by acting directly on the world or by speaking with other players. We are using a commercial game, Neverwinter Nights (http://nwn.bioware.com), that ships with an editor for creating custom game worlds and has a large and active online player base. We have instrumented the game so that we can collect not only the text users type, but also their movements and actions, such as item pick-ups and drop-offs, doors opened and levers pulled. Furthermore, the game world can be scanned for object and room locations. The data therefore consist of complete records of the game situation, physical changes to that situation, player actions and player text messages. In addition to the online collection, which only includes typed text, we also perform in-lab data collection to record players' time-synchronized speech instead of text messages. We are now designing maps for this game that elicit the types of physical, social and planning interactions that let us ground natural language in a rich situational model.
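To make the instrumentation concrete, the following is a minimal sketch of the kind of time-stamped event record such a collector might emit for actions and chat messages; the field names, event kinds and `log_event` helper are illustrative assumptions, not the actual schema used in the project.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional, Tuple

# Hypothetical record for one instrumented game event. Timestamps are
# seconds since session start, so events can later be aligned with the
# time-synchronized speech recorded in the lab.
@dataclass
class GameEvent:
    timestamp: float            # seconds since session start
    player: str                 # player identifier
    kind: str                   # e.g. "move", "pickup", "drop", "door", "lever", "chat"
    target: Optional[str]       # object acted upon, if any
    location: Tuple[str, int, int]  # (area, x, y) position in the game world
    text: Optional[str] = None  # chat message, for "chat" events only

def log_event(log: list, event: GameEvent) -> None:
    """Append one event as a plain dict; a real collector would stream to disk."""
    log.append(asdict(event))

# Example: a player opens a door, then types a message.
log = []
log_event(log, GameEvent(12.4, "p1", "door", "door_17", ("tavern", 3, 5)))
log_event(log, GameEvent(13.1, "p1", "chat", None, ("tavern", 3, 5), "come in!"))
print(json.dumps(log))
```

Keeping actions and chat in one ordered stream, rather than separate logs, makes it straightforward to relate an utterance to the physical situation in which it was produced.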