This study proposes to use the analysis of physiological signals (electroencephalogram (EEG), electromyogram (EMG), heartbeat, etc.) to control sound synthesis algorithms in order to build biologically driven musical instruments. A real-time music synthesis environment and algorithms are developed to map these signals into sound. Finally, a "bio-orchestra…
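The abstract above describes mapping physiological signals into sound parameters. As a minimal sketch of one such mapping (the signal values, window size, and pitch range here are illustrative assumptions, not details from the paper), an EMG-like signal can be rectified and averaged into an envelope whose level drives the frequency of a simple oscillator:

```python
import math

def emg_envelope(samples, window=8):
    """Rectify-and-average envelope follower for a raw EMG-like signal."""
    env = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        env.append(sum(abs(x) for x in chunk) / len(chunk))
    return env

def map_to_pitch(value, lo=110.0, hi=880.0):
    """Linearly map a normalized envelope value in [0, 1] to a frequency in Hz."""
    v = min(max(value, 0.0), 1.0)
    return lo + v * (hi - lo)

def synth_frame(freq, n=64, sr=8000):
    """Render one short sine frame at the mapped frequency."""
    return [math.sin(2 * math.pi * freq * t / sr) for t in range(n)]

# Simulated EMG burst: quiet, then an active alternating segment
signal = [0.0] * 16 + [0.6 * (-1) ** k for k in range(16)]
env = emg_envelope(signal)
freq = map_to_pitch(env[-1])   # envelope level chooses the pitch
frame = synth_frame(freq)
```

In a real-time environment the envelope would be computed per audio block and smoothed, but the signal-to-parameter mapping stage would look much like this.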
This paper describes <i>HUM</i>, an interactive art installation which interprets the behavior of the visitors on different time scales to render visual and sonic artwork in real time. <i>HUM</i> was presented at BRASS cultural center (Brussels, Belgium) in May 2009.
In this paper, we introduce some exploratory ideas and applications involving the gestural control of sonic textures. Three examples of how the gestural control of synthesized textures can be implemented are presented: scratching textures, based on the gesturalized exploration of a visual space; dynamic noise filtering, where gestures influence a virtual…
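The "dynamic noise filtering" example, where a gesture influences a virtual filter, can be illustrated with a one-pole lowpass whose coefficient follows a gesture coordinate; the mapping range and the use of vertical position are assumptions for the sketch, not the paper's actual design:

```python
import random

def gesture_to_coeff(y, min_a=0.02, max_a=0.9):
    """Map a vertical gesture position y in [0, 1] to a one-pole
    lowpass coefficient: low y = heavy smoothing (dark), high y = bright."""
    y = min(max(y, 0.0), 1.0)
    return min_a + y * (max_a - min_a)

def filtered_noise(coeffs, seed=0):
    """One-pole lowpass on white noise, with the coefficient driven
    per-sample by the gesture: out[n] = a*in[n] + (1-a)*out[n-1]."""
    rng = random.Random(seed)
    out, prev = [], 0.0
    for a in coeffs:
        x = rng.uniform(-1.0, 1.0)
        prev = a * x + (1 - a) * prev
        out.append(prev)
    return out

# A gesture sweeping upward opens the filter over 100 samples
sweep = [gesture_to_coeff(i / 99) for i in range(100)]
audio = filtered_noise(sweep)
```

The texture's brightness thus tracks the gesture continuously, which is the basic idea behind this kind of gesture-to-filter mapping.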