Abstract
We propose a new data-driven framework for synthesizing hand motion at different emotion levels. Specifically, we first capture high-quality hand motion using VR gloves. The captured motion data are then annotated with emotion types, and a latent space is constructed from the motions to facilitate motion synthesis. By interpolating latent representations of the hand motions, new hand animations with different levels of emotion strength can be generated. Experimental results show that our framework produces smooth and consistent hand motions at interactive rates.
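The following is a minimal sketch of the latent-space interpolation step described in the abstract, assuming the latent space maps fixed-length pose clips to low-dimensional codes. The linear encoder/decoder, the clip dimensions, and the `synthesize`/`strength` names are illustrative assumptions, not the paper's implementation; in the actual framework the latent space is learned from the annotated motion-capture data.

```python
# Hypothetical sketch: blending a neutral and an emotional hand-motion clip in
# latent space. The encoder/decoder here are placeholder linear maps standing
# in for the learned model (not the paper's method).
import numpy as np

rng = np.random.default_rng(0)

N_FRAMES, N_DOFS, LATENT_DIM = 60, 45, 16  # e.g. 60 frames, 15 joints x 3 angles (assumed sizes)

# Placeholder linear encoder/decoder (stand-ins for the learned latent space).
W_enc = rng.standard_normal((N_FRAMES * N_DOFS, LATENT_DIM))
W_dec = np.linalg.pinv(W_enc)

def encode(motion: np.ndarray) -> np.ndarray:
    """Map a (frames x dofs) motion clip to a latent code."""
    return motion.reshape(-1) @ W_enc

def decode(z: np.ndarray) -> np.ndarray:
    """Map a latent code back to a (frames x dofs) motion clip."""
    return (z @ W_dec).reshape(N_FRAMES, N_DOFS)

def synthesize(neutral: np.ndarray, emotional: np.ndarray, strength: float) -> np.ndarray:
    """Interpolate two clips in latent space; strength in [0, 1] scales the emotion level."""
    z = (1.0 - strength) * encode(neutral) + strength * encode(emotional)
    return decode(z)

# Toy clips standing in for captured glove data.
neutral_clip = rng.standard_normal((N_FRAMES, N_DOFS))
angry_clip = rng.standard_normal((N_FRAMES, N_DOFS))

half_strength = synthesize(neutral_clip, angry_clip, strength=0.5)
print(half_strength.shape)  # (60, 45)
```

Intermediate `strength` values yield motions between the neutral and fully emotional clips, which is the effect the abstract attributes to interpolating the latent representation.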
Original language | English |
---|---|
Title of host publication | Proceedings - MIG 2019: ACM Conference on Motion, Interaction, and Games |
Subtitle of host publication | Newcastle upon Tyne, England, October 28-30, 2019 |
Editors | Hubert P. H. Shum, Edmond S. L. Ho, Marie-Paule Cani, Tiberiu Popa, Daniel Holden, He Wang |
Place of Publication | New York |
Publisher | ACM |
ISBN (Electronic) | 9781450369947 |
Publication status | Published - 28 Oct 2019 |
Event | MIG 2019: 12th Annual ACM/SIGGRAPH Conference on Motion, Interaction and Games - Northumbria University, Newcastle upon Tyne, United Kingdom |
Duration | 28 Oct 2019 → 30 Oct 2019 |
Conference
Conference | MIG 2019 |
---|---|
Country/Territory | United Kingdom |
City | Newcastle upon Tyne |
Period | 28/10/19 → 30/10/19 |
Internet address | http://www.mig2019.website/index.html |
Keywords
- hand animation
- emotion
- motion capture
- style transfer