Motion synthesis for synchronizing with streaming music by segment-based search on metadata motion graphs

Multimedia and Expo (2011)

Cited 11
Abstract
Music and dance are two major forms of entertainment in daily life, and the fact that people dance to music suggests the possibility of synchronizing human motion with music. In this paper, we present a novel system that automatically synthesizes human motion synchronized with streaming music using both rhythm and intensity features. In our system, a motion capture database is re-organized beforehand into a novel graph-based representation with metadata (called a metadata motion graph), which is specially designed for the streaming setting. When a certain amount of music data has been received as a segment, the system searches for the best path for that segment on the metadata motion graph. This approach, whose effectiveness is demonstrated in a user study, composes motion segment by segment such that each segment (1) is synchronized with the music at the beat level within a short enough period, (2) connects seamlessly with the previous segment, and (3) retains sufficient synchronization capacity for the remaining music, no matter how long it is.
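The abstract does not spell out the search procedure, so the following minimal Python sketch only illustrates the general idea of segment-wise path search on a graph of annotated motion clips: a music segment, summarized here by a beat count and an intensity value, is matched against clip metadata while respecting graph transitions. All names (MotionNode, find_segment_path, segment_cost), the greedy strategy, and the cost function are assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch only; names and the greedy cost are assumptions, not the paper's algorithm.
from dataclasses import dataclass, field

@dataclass
class MotionNode:
    """A motion clip annotated with the metadata used for matching."""
    name: str
    beats: int          # number of motion beats in the clip
    intensity: float    # average motion intensity of the clip
    edges: list = field(default_factory=list)  # names of clips it can transition to

def segment_cost(node, music_beats, music_intensity):
    """Penalty for pairing one clip with the remaining part of a music segment."""
    return abs(node.beats - music_beats) + abs(node.intensity - music_intensity)

def find_segment_path(graph, start, music_beats, music_intensity, max_len=4):
    """Greedily extend a path from `start` until the segment's beats are covered."""
    path, covered = [start], graph[start].beats
    while covered < music_beats and len(path) < max_len:
        candidates = graph[path[-1]].edges
        if not candidates:
            break
        # pick the successor that best matches the remaining beats and the intensity
        best = min(candidates,
                   key=lambda n: segment_cost(graph[n], music_beats - covered, music_intensity))
        path.append(best)
        covered += graph[best].beats
    return path

# Toy example: an 8-beat music segment of medium intensity, starting from "walk".
graph = {
    "walk": MotionNode("walk", beats=4, intensity=0.3, edges=["spin", "step"]),
    "spin": MotionNode("spin", beats=2, intensity=0.7, edges=["walk"]),
    "step": MotionNode("step", beats=2, intensity=0.4, edges=["walk", "spin"]),
}
print(find_segment_path(graph, "walk", music_beats=8, music_intensity=0.5))
# -> ['walk', 'step', 'spin']
```

Because each segment's search starts from the node where the previous segment ended, consecutive segments connect along existing graph edges, which mirrors the seamless-connection requirement described in the abstract; the paper's actual path search and cost terms may differ.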
Keywords
music data, motion synthesis, metadata motion graph, novel system, numerical optimization, people dance, segment-based search, music synchronization, remaining music, previous segment, motion capture database, motions segment, motion capture, human motion, novel graph-based representation, rhythm, synchronization, databases, animation