Exploration on affect sensing from improvisational interaction

Li Zhang

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

1 Citation (Scopus)

Abstract

We report work on adding an improvisational AI actor to an existing virtual improvisational environment, a text-based software system for dramatic improvisation in simple virtual scenarios, used primarily in learning contexts. The improvisational AI actor has an affect-detection component aimed at detecting affective aspects (emotions, moods, value judgments, etc.) of the human-controlled characters' textual "speeches". The AI actor also makes an appropriate response based on this affective understanding, which is intended to stimulate the improvisation. The work is accompanied by basic research into how affect is conveyed linguistically. A distinctive feature of the project is its focus on the metaphorical ways in which affect is conveyed. We have also introduced affect detection using context profiles. Finally, we report user testing of the improvisational AI actor and evaluation results for the affect-detection component. Our work contributes to the conference themes of affective user interfaces, affect-inspired agents, and improvisational or dramatic interaction.
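To make the described pipeline concrete, the sketch below shows one minimal way an affect-sensing step could feed a response step: a coarse affect label is extracted from a character's textual "speech" and used to pick a reply intended to stimulate the improvisation. This is an illustrative assumption only, not the paper's implementation; the lexicon, labels, and function names (detect_affect, respond) are hypothetical placeholders, and the paper's actual approach additionally handles metaphor and context profiles, which are not modelled here.

```python
# Hypothetical sketch of an affect-sensing loop; not the authors' system.

# Toy affect lexicon mapping cue words to coarse affect labels.
AFFECT_LEXICON = {
    "hate": "anger",
    "angry": "anger",
    "sad": "sadness",
    "happy": "joy",
    "scared": "fear",
}


def detect_affect(speech: str) -> str:
    """Return a coarse affect label for a textual speech, or 'neutral'."""
    for word in speech.lower().split():
        label = AFFECT_LEXICON.get(word.strip(".,!?"))
        if label:
            return label
    return "neutral"


def respond(affect: str) -> str:
    """Pick a canned response based on the detected affect."""
    responses = {
        "anger": "Whoa, calm down - what has made you so cross?",
        "sadness": "You sound down. Want to talk about it?",
        "joy": "That's brilliant news!",
        "fear": "Don't worry, I'm right here with you.",
        "neutral": "Go on, tell me more.",
    }
    return responses[affect]


if __name__ == "__main__":
    speech = "I hate this place, it makes me so angry!"
    affect = detect_affect(speech)
    print(affect, "->", respond(affect))
```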
Original language: English
Title of host publication: Proceedings of the 10th International Conference on Intelligent Virtual Agents
Editors: Jan Allbeck, Norman Badler, Timothy Bickmore, Catherine Pelachaud, Alla Safonova
Place of publication: London
Publisher: Springer
Pages: 385-391
ISBN (Print): 978-3642158919
Publication status: Published - 2010
Event: IVA'10 Proceedings of the 10th International Conference on Intelligent Virtual Agents
Duration: 1 Jan 2010 → …

Conference

Conference: IVA'10 Proceedings of the 10th International Conference on Intelligent Virtual Agents
Period: 1/01/10 → …
