This paper reviews models of the ways in which a performer's instrumental actions can be linked to sound synthesis parameters. We analyse the available literature on both acoustical instrument simulation and the mapping of input devices to sound synthesis in general human-computer interaction. We further demonstrate why a more complex mapping strategy is required.
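As a minimal illustrative sketch only (these are not the models the paper reviews, and every name here is hypothetical), the contrast between a simple one-to-one mapping and a more complex many-to-many mapping of performer inputs to synthesis parameters might look like this in Python:

    # Hypothetical sketch: two mapping strategies from performer inputs
    # (e.g. breath pressure, lip pressure) to synthesis parameters.

    def one_to_one(breath: float, lip: float) -> dict:
        # Each input drives exactly one synthesis parameter.
        return {"amplitude": breath, "pitch_bend": lip}

    def many_to_many(breath: float, lip: float) -> dict:
        # Each synthesis parameter is derived from several inputs,
        # as on an acoustic instrument (e.g. a brass embouchure).
        return {
            "amplitude": breath * (1.0 - 0.3 * lip),  # lip pressure damps loudness
            "pitch_bend": 0.8 * lip + 0.2 * breath,   # overblowing sharpens pitch
            "brightness": breath * lip,               # both inputs open the timbre
        }

    print(one_to_one(0.9, 0.5))
    print(many_to_many(0.9, 0.5))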
This paper describes recent work which challenges the predominance of the WIMP (Windows-Icons-Menus-Pointers) computer interface for use in real-time situations. The results of this work have implications for the design of user interfaces for real-time control tasks (of which musical performance and experimentation are clear examples).
This paper describes work-in-progress on an Interactive Sonification Toolkit developed to aid the analysis of general data sets. The toolkit allows the designer to process and scale data sets, then rapidly change the sonification method. The human user can then interact with the data in a fluid manner.
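The Toolkit's actual interface is not published in this snippet, so the following is a sketch under assumptions (all function names are hypothetical): normalise a data set into an audible range, then render it with an interchangeable sonification method, here a simple pitch mapping.

    import math

    def scale(data, lo=220.0, hi=880.0):
        # Linearly rescale the data into a target range (here, Hz).
        d_min, d_max = min(data), max(data)
        span = (d_max - d_min) or 1.0
        return [lo + (x - d_min) / span * (hi - lo) for x in data]

    def pitch_method(freqs, sr=8000, dur=0.05):
        # Render each value as a short sine tone; returns raw samples
        # (sending them to a sound device is omitted here).
        samples = []
        for f in freqs:
            samples += [math.sin(2 * math.pi * f * n / sr)
                        for n in range(int(sr * dur))]
        return samples

    data = [3.1, 4.7, 2.2, 8.9, 6.5]
    audio = pitch_method(scale(data))  # swap pitch_method to change the method

Swapping pitch_method for another renderer is what would let a designer change the sonification rapidly while the scaling stage stays fixed.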
This paper argues for a special focus on the use of dynamic human interaction to explore datasets while they are being transformed into sound. We describe why this is a special case of both human-computer interaction (HCI) techniques and sonification methods. Humans are adapted for interacting with their physical environment and making continuous use of all their senses.
This paper describes the sonification of electromyographic (EMG) data and an experiment conducted to verify its efficacy as an auditory display of the data. A real-time auditory display for EMG has two main advantages over a graphical representation: it frees the eyes of the analyst or the physiotherapist, and it can be heard by the patient too.
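The paper's actual mapping is not given in this snippet; as one hedged illustration, a common way to turn raw EMG into a real-time auditory display is to follow the rectified, smoothed signal envelope and use it as the gain of a tone:

    def envelope(emg, alpha=0.1):
        # Rectify the raw EMG and smooth it with a one-pole low-pass filter.
        level, out = 0.0, []
        for x in emg:
            level += alpha * (abs(x) - level)
            out.append(level)
        return out

    def to_gain(env):
        # Map the envelope to a 0..1 gain that would drive the audio output.
        peak = max(env) or 1.0
        return [e / peak for e in env]

    emg = [0.02, -0.4, 0.9, -0.8, 0.3, -0.05]  # toy samples, not real data
    print(to_gain(envelope(emg)))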