Abstract
In recent years, thanks to the development of 3DCG animation editing tools (e.g. MikuMikuDance), many 3D character dance animations have been created by amateur users. However, it is very difficult to create choreography from scratch without technical knowledge. Shiratori et al. [2006] proposed an automatic dance generation system that considers the rhythm and intensity of dance motions; however, each segment is selected randomly from a database, so the generated dance motion carries no linguistic or emotional meaning. Takano et al. [2010] proposed a human motion generation system that considers motion labels; however, they use simple labels such as “running” or “jump”, so the system cannot generate motions that express emotions. In reality, professional dancers create choreography based on musical features or lyrics, and express emotion or how they feel about the music. In our work, we aim to generate more emotional dance motion easily. We therefore use the linguistic information in lyrics to generate dance motion.
In this paper, we propose a system that generates sign dance motion from continuous sign language motion based on the lyrics of music. This system could also help deaf people experience music as a visualized-music application.
| Original language | English |
---|---|
| Publication status | Published - 24 Jul 2016 |
| Event | SIGGRAPH 2016 - 43rd International Conference and Exhibition on Computer Graphics and Interactive Techniques - Anaheim, California. Duration: 24 Jul 2016 → … |
Conference

| Conference | SIGGRAPH 2016 - 43rd International Conference and Exhibition on Computer Graphics and Interactive Techniques |
---|---|
| Period | 24/07/16 → … |
Fingerprint

Dive into the research topics of 'Automatic Dance Generation System Considering Sign Language Information'. Together they form a unique fingerprint.

Projects
- 1 Finished
- Interaction-based Human Motion Analysis
Shum, H. (PI)
Engineering and Physical Sciences Research Council
1/11/14 → 30/04/16
Project: Research