Emotion Transfer for 3D Hand Motion using StarGAN

Jacky C. P. Chan, Ana-Sabina Irimia, Edmond S. L. Ho

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Citations (Scopus)
118 Downloads (Pure)


In this paper, we propose a new data-driven framework for emotion transfer in 3D hand motion. Specifically, we first capture high-quality hand motion using VR gloves. The motion data is then annotated with emotion type and converted to images to facilitate the motion synthesis process, and the new dataset will be made publicly available. To the best of our knowledge, this is the first public dataset of emotion-annotated hand motions. We further formulate emotion transfer for 3D hand motion as an image-to-image translation problem, which we solve by adapting the StarGAN framework. Given a target emotion type and an unseen input motion, our framework synthesizes new motions. Experimental results show that our framework produces high-quality, consistent hand motions.
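The abstract does not specify the exact motion-to-image encoding, but a minimal sketch of one plausible conversion — time steps as image rows, joint-angle channels as columns, values normalised to pixel range so clips of different lengths become fixed-size images suitable for an image-based GAN such as StarGAN — might look like (the function name, resampling scheme, and dimensions are illustrative assumptions, not the authors' method):

```python
import numpy as np

def motion_to_image(motion, height=64):
    """Encode a hand motion clip as a 2D image-like array.

    motion: (T, D) array -- T frames of D joint-angle channels.
    The time axis is linearly resampled to a fixed `height` so clips
    of different lengths map to images of the same size; each channel
    is then normalised to [0, 1] as pixel intensities.
    (Illustrative sketch only; not the encoding used in the paper.)
    """
    motion = np.asarray(motion, dtype=np.float32)
    T, _ = motion.shape
    # Linearly resample the time axis to `height` rows.
    src = np.linspace(0, T - 1, height)
    idx = np.floor(src).astype(int)
    frac = (src - idx)[:, None]
    nxt = np.minimum(idx + 1, T - 1)
    resampled = (1 - frac) * motion[idx] + frac * motion[nxt]
    # Normalise each joint channel independently to [0, 1].
    lo = resampled.min(axis=0, keepdims=True)
    rng = np.ptp(resampled, axis=0, keepdims=True)
    rng[rng == 0] = 1.0  # guard constant channels
    return (resampled - lo) / rng

# Example: a 100-frame clip with 45 joint-angle channels
# becomes a fixed 64 x 45 "image".
img = motion_to_image(np.random.randn(100, 45))
```

The resulting array can be stacked into batches and fed to a StarGAN-style generator conditioned on the target emotion label, exactly as natural images would be.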
Original language: English
Title of host publication: Computer Graphics & Visual Computing (CGVC) 2020
Editors: Panagiotis D. Ritsos, Kai Xu
Place of publication: Geneve
Publisher: The Eurographics Association
Number of pages: 9
ISBN (Print): 9783038681229
Publication status: Published - 2020
Event: CGVC 2020 - 38th Computer Graphics & Visual Computing Conference, King's College London, London, United Kingdom
Duration: 10 Sept 2020 - 11 Sept 2020


Conference: CGVC 2020
Country/Territory: United Kingdom


