The aim of this project is to build a database of natural speech exhibiting a range of affective variability. It extends our ongoing research on building models for the automatic detection of affect in speech. At a basic level, training such systems requires a large corpus of speech containing a range of emotional vocal variation. A traditional approach has been to assemble databases in which actors provide the affective variation on demand. However, this method often results in unnatural-sounding speech and/or exaggerated expressions. We have developed a prototype of an interactive system that guides a user through a question-and-answer session. Without any rehearsals or scripts, the user navigates, through touch and spoken language, an interface guided by embodied conversational agents that prompt the user to speak about an emotional experience. Among the issues we are addressing is the design of the text and of the characters' behavior (including speech and gesture) so as to obtain an interaction that is convincing and encourages the user to disclose.