TY - JOUR
T1 - Example-based automatic music-driven conventional dance motion synthesis
AU - Fan, Rukun
AU - Xu, Songhua
AU - Geng, Weidong
N1 - Funding Information:
This work was partly supported by NSFC 60633070, 60773183 and 60903132, National 863 High-Tech Program (Grant no: 2006AA01Z313 and 2006AA01Z335), and National Key Technology R&D Program of China (Grant no: 2007BAH11B02 and 2007BAH11B03). It is also supported by NCET-07-0743, and PCSIRT 0652. S. Xu performed this research as a Eugene P. Wigner Fellow and staff member at the Oak Ridge National Laboratory, managed by UT-Battelle, LLC, for the US Department of Energy under Contract DE-AC05-00OR22725.
PY - 2012
Y1 - 2012
N2 - We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music-to-motion mapping relationship embodied in example dance motions and their accompanying background music. A key step in our method is to train a music-to-motion matching quality rating function by learning the music-to-motion mapping relationship exhibited in synchronized music and dance motion data, captured from professional human dance performances. To generate an optimal sequence of dance motion segments to match a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both the music-to-motion matching quality and the visual smoothness of the resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with the results of several peer methods, using motions captured from professional human dancers' performances as the gold standard. We also conducted several medium-scale user studies to explore how well, perceptually, our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match a piece of music. These user studies produced very positive results in our music-driven dance motion synthesis experiments on several Asian dance genres, confirming the advantages of our method.
AB - We introduce a novel method for synthesizing dance motions that follow the emotions and contents of a piece of music. Our method employs a learning-based approach to model the music-to-motion mapping relationship embodied in example dance motions and their accompanying background music. A key step in our method is to train a music-to-motion matching quality rating function by learning the music-to-motion mapping relationship exhibited in synchronized music and dance motion data, captured from professional human dance performances. To generate an optimal sequence of dance motion segments to match a piece of music, we introduce a constraint-based dynamic programming procedure. This procedure considers both the music-to-motion matching quality and the visual smoothness of the resultant dance motion sequence. We also introduce a two-way evaluation strategy, coupled with a GPU-based implementation, through which we can execute the dynamic programming process in parallel, resulting in significant speedup. To evaluate the effectiveness of our method, we quantitatively compare the dance motions synthesized by our method with the results of several peer methods, using motions captured from professional human dancers' performances as the gold standard. We also conducted several medium-scale user studies to explore how well, perceptually, our dance motion synthesis method can outperform existing methods in synthesizing dance motions to match a piece of music. These user studies produced very positive results in our music-driven dance motion synthesis experiments on several Asian dance genres, confirming the advantages of our method.
KW - Dance motion and music mapping relationship
KW - Learning-based dance motion synthesis
KW - Music-driven dance motion synthesis
UR - http://www.scopus.com/inward/record.url?scp=84855950191&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84855950191&partnerID=8YFLogxK
U2 - 10.1109/TVCG.2011.73
DO - 10.1109/TVCG.2011.73
M3 - Article
C2 - 21519104
AN - SCOPUS:84855950191
SN - 1077-2626
VL - 18
SP - 501
EP - 515
JO - IEEE Transactions on Visualization and Computer Graphics
JF - IEEE Transactions on Visualization and Computer Graphics
IS - 3
M1 - 5753889
ER -