We present a general methodology for producing synchronized soundtracks for animations. A sound world is modeled by associating a characteristic sound with each object in a scene. These sounds can be generated from a behavioral or physically-based simulation. Collision sounds can be computed from the vibrational response of elastic bodies to the collision impulse.
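The idea of computing a collision sound from the vibrational response of an elastic body is commonly realized as modal synthesis: the impulse excites a set of damped sinusoidal modes whose sum is the output signal. The sketch below illustrates that scheme only; the mode frequencies, damping rates, gains, and the `collision_sound` helper are illustrative assumptions, not values or code from the paper.

```python
import math

def collision_sound(modes, impulse, duration, sample_rate=44100):
    """Synthesize a collision sound as a sum of exponentially damped
    sinusoids, one per vibrational mode of the struck body.

    modes: list of (frequency_hz, damping_per_sec, gain) tuples
    impulse: strength of the collision impulse (scales all modes)
    """
    n = int(duration * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        s = sum(gain * math.exp(-damping * t) * math.sin(2 * math.pi * freq * t)
                for freq, damping, gain in modes)
        samples.append(impulse * s)
    return samples

# Hypothetical modes for a small struck bar (illustrative values only).
bar_modes = [(440.0, 6.0, 1.0), (1170.0, 9.0, 0.5), (2280.0, 14.0, 0.25)]
sound = collision_sound(bar_modes, impulse=0.8, duration=0.5)
```

A stronger impulse simply scales all modes, so harder collisions sound louder but keep the same timbre; varying the mode table per object gives each object its characteristic sound.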
In this paper, we introduce and analyze four gesture-controlled musical instruments. We briefly discuss the test platform designed to allow rapid experimentation with new interfaces and control mappings. We describe our design experiences and discuss the effects of system features such as latency, resolution, and lack of tactile feedback. The instruments …
We represent sound signals as general functional compositions, called "Timbre Trees". Externally these are LISP-like expressions; internally they are implemented as C++ data structures. Nodes of the tree can be arithmetic operations, analytic functions, or noise generators. Vectorized operations are provided for compact expression of additive spectral …
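The tree of arithmetic, analytic-function, and noise-generator nodes described above can be sketched as follows. The paper's implementation is C++ driven by LISP-like expressions; this Python sketch, with its `TimbreNode` class and the particular operator set, is an illustrative assumption about the structure, not the paper's code.

```python
import math
import random

class TimbreNode:
    """One node of a timbre tree: an operation applied to child subtrees,
    evaluated at a time t to produce a sample value."""
    def __init__(self, op, *children):
        self.op = op              # name of the operation at this node
        self.children = children  # subtrees (or a constant for "const")

    def eval(self, t):
        if self.op == "const":    # constant leaf
            return self.children[0]
        if self.op == "time":     # current time leaf
            return t
        if self.op == "add":      # arithmetic-operation node
            return sum(c.eval(t) for c in self.children)
        if self.op == "mul":
            acc = 1.0
            for c in self.children:
                acc *= c.eval(t)
            return acc
        if self.op == "sin":      # analytic-function node
            return math.sin(self.children[0].eval(t))
        if self.op == "noise":    # noise-generator leaf
            return random.uniform(-1.0, 1.0)
        raise ValueError("unknown op: " + self.op)

# The expression (sin (* 2764.6 time)) -- a 440 Hz sine -- as a tree:
tone = TimbreNode("sin",
                  TimbreNode("mul",
                             TimbreNode("const", 2 * math.pi * 440.0),
                             TimbreNode("time")))
sample = tone.eval(1.0 / 1760.0)  # a quarter period of 440 Hz
```

Because every node is just an evaluable expression over time, trees compose freely: wrapping `tone` in an `add` with a `noise` leaf, for instance, yields a noisy tone without any special cases.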
Actions performed by a virtual character can be controlled with verbal commands such as ‘walk five steps forward’. Similar control of the motion style, meaning how the actions are performed, is complicated by the ambiguity of describing individual motions with phrases such as ‘aggressive walking’. In this paper, we present a method for controlling motion …
We report results of an auditory navigation experiment. In auditory navigation, sound is employed as a navigational aid in a virtual environment. In our experiment, the test task was to find a sound source in a dynamic virtual acoustic environment. In dynamic auralization, the movements of the subject are taken into account in the acoustic modeling of the room.