Interactive Sonification for Health and Energy using ChucK and Unity
CoRR (2024)
Abstract
Sonification can provide valuable insights about data, but most existing
approaches are not designed to be controlled interactively by the user.
Interaction enables the sonification designer to experiment with sound design
more rapidly, and allows the sonification to be modified in real time by
manipulating various control parameters. In this paper, we
describe two case studies of interactive sonification that use publicly
available datasets recently presented at the International
Conference on Auditory Display (ICAD). They come from the health and energy
domains: electroencephalogram (EEG) alpha wave data and air pollutant data
consisting of nitrogen dioxide, sulfur dioxide, carbon monoxide, and ozone. We
show how these sonifications can be recreated to support interaction using a
general interactive sonification framework built using ChucK, Unity, and
Chunity. In addition to supporting typical sonification methods that are common
in existing sonification toolkits, our framework introduces novel methods such
as supporting discrete events, interleaved playback of multiple data streams
for comparison, and using frequency modulation (FM) synthesis in terms of one
data attribute modulating another. We also describe how these new
functionalities can be used to improve the sonification experience of the two
datasets we have investigated.
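The FM-based mapping described above, in which one data attribute modulates another, can be sketched outside the paper's ChucK/Unity framework. The following Python fragment is a minimal, hypothetical illustration (all function and parameter names are ours, not the framework's): one attribute is mapped to the carrier frequency and a second attribute to the FM modulation index, yielding one short audio segment per data point.

```python
import math

def fm_sonify(carrier_data, modulator_data, sr=8000, seg_dur=0.05,
              f_min=220.0, f_max=880.0, index_max=5.0, mod_ratio=2.0):
    """Sketch of data-driven FM synthesis: the first attribute controls the
    carrier frequency, the second controls the modulation index, so one data
    stream audibly 'modulates' the other. Returns raw samples in [-1, 1]."""
    lo_c, hi_c = min(carrier_data), max(carrier_data)
    lo_m, hi_m = min(modulator_data), max(modulator_data)
    n_seg = int(sr * seg_dur)          # samples per data point
    samples = []
    for c, m in zip(carrier_data, modulator_data):
        # Normalize each attribute to [0, 1]; guard against constant streams.
        cn = (c - lo_c) / (hi_c - lo_c) if hi_c > lo_c else 0.5
        mn = (m - lo_m) / (hi_m - lo_m) if hi_m > lo_m else 0.5
        fc = f_min + cn * (f_max - f_min)   # carrier frequency from attribute 1
        index = mn * index_max              # modulation index from attribute 2
        fm = fc * mod_ratio                 # modulator frequency (fixed ratio)
        for n in range(n_seg):
            t = n / sr
            # Classic FM: sin(2*pi*fc*t + index * sin(2*pi*fm*t))
            samples.append(math.sin(2 * math.pi * fc * t +
                                    index * math.sin(2 * math.pi * fm * t)))
    return samples
```

In an interactive setting, parameters such as `index_max` or `mod_ratio` would be exposed as real-time controls rather than fixed arguments; the real-time audio engine and UI binding are exactly what ChucK and Unity (via Chunity) provide in the paper's framework.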