DYCI2 agents: merging the "free", "reactive", and "scenario-based" music generation paradigms

International Computer Music Conference (2017)

Abstract
The collaborative research and development project DYCI2, Creative Dynamics of Improvised Interaction, focuses on conceiving, adapting, and bringing into play efficient models of artificial listening, learning, interaction, and generation of musical content. It aims at developing creative and autonomous digital musical agents able to take part in various human projects in an interactive and artistically credible way and, ultimately, at contributing to the perceptual and communicational skills of embedded artificial intelligence. The areas concerned are live performance, production, pedagogy, and active listening. This paper gives an overview of one of the three main research issues of this project: conceiving multi-agent architectures and models of knowledge and decision in order to explore scenarios of music co-improvisation involving human and digital agents. The objective is to merge the usually exclusive "free", "reactive", and "scenario-based" paradigms in interactive music generation so as to adapt to a wide range of musical contexts involving hybrid temporality and multimodal interactions.
Key words
music, reactive, DYCI2 agents, generation, paradigms, scenario-based