TY - GEN
T1 - Towards Deep Learning Based Robot Automatic Choreography System
AU - Wu, Ruiqi
AU - Peng, Wenyao
AU - Zhou, Changle
AU - Chao, Fei
AU - Yang, Longzhi
AU - Lin, Chih Min
AU - Shang, Changjing
PY - 2019/1/1
Y1 - 2019/1/1
N2 - It is a challenging task to enable a robot to dance according to different types of music. In particular, two problems have not yet been well resolved: (1) how to assign a dance to a certain type of music, and (2) how to ensure that a dancing robot keeps its balance. To tackle these challenges, a robot automatic choreography system based on deep learning technology is introduced in this paper. First, two deep learning neural network models are built to convert local and global features of music into corresponding features of dance, respectively. Then, an action graph is built from the collected dance segments; the main function of the action graph is to generate a complete dance sequence based on the dance features produced by the two deep learning models. Finally, the generated dance sequence is performed by a humanoid robot. The experimental results show that the proposed model can successfully generate dance sequences that match the input music, and that the robot can maintain its balance while dancing. In addition, compared with the dance sequences in the training dataset, the dance sequences generated by the model reach the level of manual choreography in both diversity and innovation. Therefore, this method provides a promising solution for robotic choreography automation and design assistance.
AB - It is a challenging task to enable a robot to dance according to different types of music. In particular, two problems have not yet been well resolved: (1) how to assign a dance to a certain type of music, and (2) how to ensure that a dancing robot keeps its balance. To tackle these challenges, a robot automatic choreography system based on deep learning technology is introduced in this paper. First, two deep learning neural network models are built to convert local and global features of music into corresponding features of dance, respectively. Then, an action graph is built from the collected dance segments; the main function of the action graph is to generate a complete dance sequence based on the dance features produced by the two deep learning models. Finally, the generated dance sequence is performed by a humanoid robot. The experimental results show that the proposed model can successfully generate dance sequences that match the input music, and that the robot can maintain its balance while dancing. In addition, compared with the dance sequences in the training dataset, the dance sequences generated by the model reach the level of manual choreography in both diversity and innovation. Therefore, this method provides a promising solution for robotic choreography automation and design assistance.
KW - Action graph
KW - Deep learning
KW - Gesture relation
KW - Motion planning
KW - Robot dance
UR - http://www.scopus.com/inward/record.url?scp=85070526727&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-27538-9_54
DO - 10.1007/978-3-030-27538-9_54
M3 - Conference contribution
AN - SCOPUS:85070526727
SN - 9783030275372
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 629
EP - 640
BT - Intelligent Robotics and Applications - 12th International Conference, ICIRA 2019, Proceedings
A2 - Yu, Haibin
A2 - Liu, Jinguo
A2 - Liu, Lianqing
A2 - Liu, Yuwang
A2 - Ju, Zhaojie
A2 - Zhou, Dalin
PB - Springer
T2 - 12th International Conference on Intelligent Robotics and Applications, ICIRA 2019
Y2 - 8 August 2019 through 11 August 2019
ER -