Multimodal intelligent affect detection with Kinect

Yang Zhang, Li Zhang, Alamgir Hossain

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

Communication between human beings involves complex and rich means. In recent decades, computers have successfully supported humans in a variety of tasks such as calculation and memorization. However, when confronted with the demand for multimodal interaction with users, can these indispensable partners satisfy us? This research aims to answer that question.

Original language: English
Pages: 1461-1462
Number of pages: 2
Publication status: Published - 6 May 2013
Event: 12th International Conference on Autonomous Agents and Multiagent Systems 2013, AAMAS 2013 - Saint Paul, MN, United States
Duration: 6 May 2013 - 10 May 2013

Conference

Conference: 12th International Conference on Autonomous Agents and Multiagent Systems 2013, AAMAS 2013
Country/Territory: United States
City: Saint Paul, MN
Period: 6/05/13 - 10/05/13

Keywords

  • Affective computing
  • Emotion theory
  • Multimodal affect sensing and analysis
