Kristian Nymoen

This paper investigates differences in the gestures people relate to pitched and non-pitched sounds, respectively. An experiment was carried out in which participants were asked to move a rod in the air, pretending that moving it would create the sound they heard. By applying and interpreting the results of Canonical Correlation Analysis we are able to …
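As a minimal sketch (not the authors' code) of the kind of analysis mentioned above, Canonical Correlation Analysis finds paired projections of a sound-feature matrix and a motion-feature matrix that are maximally correlated. The implementation below, using only numpy, computes CCA via an SVD of the whitened cross-covariance; the feature matrices and the regularization term are illustrative assumptions.

```python
import numpy as np

def cca(X, Y, reg=1e-6):
    """Canonical Correlation Analysis via SVD of the whitened
    cross-covariance matrix.

    X : (n_observations, n_sound_features)
    Y : (n_observations, n_motion_features)
    Returns the canonical correlations (descending) and the
    projection weights for X and Y.  `reg` is a small ridge term
    added for numerical stability (an assumption, not from the paper).
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / (n - 1)

    def inv_sqrt(C):
        # inverse matrix square root of a symmetric positive-definite matrix
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
    U, s, Vt = np.linalg.svd(K)
    return s, inv_sqrt(Cxx) @ U, inv_sqrt(Cyy) @ Vt.T
```

For example, if the motion features are (noisily) linear functions of the sound features, the leading canonical correlation approaches 1, indicating a strong sound–motion relationship along that projection.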
The MYO armband from Thalmic Labs is a complete and wireless motion- and muscle-sensing platform. This paper evaluates the armband's sensors and its potential for NIME applications. This is followed by a presentation of the prototype instrument MuMYO. We conclude that, despite some shortcomings, the armband has the potential to become a new "standard" …
In this paper we present a method for studying relationships between features of sound and features of movement. The method has been tested by carrying out an experiment with people moving an object in space along with short sounds. 3D position data of the object were recorded and several features were calculated from each recording. These features …
The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from the internal sensors of an iPod are compared to data from a high-end optical infrared marker-based motion capture system (Qualisys) in terms of latency, jitter, accuracy and precision. We identify some rotational drift in the …
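The two timing measures named above can be sketched in a few lines. This is a hedged illustration, not the paper's analysis code: jitter is taken here as the standard deviation of inter-sample intervals, and latency is estimated as the lag maximizing the cross-correlation between a reference signal (e.g. from the mocap system) and the measured one (e.g. iPod acceleration); the function names are invented for illustration.

```python
import numpy as np

def jitter_ms(timestamps):
    """Jitter: standard deviation of inter-sample intervals, in ms."""
    return float(np.std(np.diff(timestamps)) * 1000.0)

def latency_samples(reference, measured):
    """Latency estimate: the lag (in samples) that maximizes the
    cross-correlation between the reference and measured signals."""
    ref = reference - np.mean(reference)
    mea = measured - np.mean(measured)
    corr = np.correlate(mea, ref, mode="full")
    return int(np.argmax(corr) - (len(ref) - 1))
```

With both streams resampled to a common rate, the lag in samples divides by that rate to give latency in seconds.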
Links between music and body motion can be studied through experiments called <i>sound-tracing</i>. One of the main challenges in such research is to develop robust analysis techniques that are able to deal with the multidimensional data that musical sound and body motion present. The article evaluates four different analysis methods applied to an …
In this paper we present results from an experiment in which infrared motion capture technology was used to record participants' movement in synchrony with different rhythms and different sounds. The purpose was to determine the effects of the sounds' spectral and temporal features on synchronization and gesture characteristics. In particular, we focused on …
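One common way to quantify synchronization in such tasks is to match each movement onset to its nearest stimulus beat and examine the signed asynchronies; their mean indicates anticipation or lag, their spread indicates tightness. The sketch below assumes this measure and invents its names; it is not taken from the paper.

```python
import numpy as np

def asynchronies(beat_times, onset_times):
    """Match each movement onset to the nearest beat and return the
    signed asynchronies (onset minus beat, in seconds).  A negative
    value means the onset anticipated the beat."""
    beats = np.asarray(beat_times, dtype=float)
    onsets = np.asarray(onset_times, dtype=float)
    nearest = np.abs(onsets[:, None] - beats[None, :]).argmin(axis=1)
    return onsets - beats[nearest]
```

For beats at 0.0, 0.5 and 1.0 s and onsets at 0.02, 0.48 and 1.03 s, this yields asynchronies of +0.02, -0.02 and +0.03 s.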
The paper presents the interactive music system SoloJam, which allows a group of participants with little or no musical training to play together effectively in a "band-like" setting. It allows the participants to take turns playing solos made up of rhythmic pattern sequences. We specify the central issue in enabling such participation as being the …
The paper presents the SoundSaber, a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design development process. The SoundSaber may be used as an example of how high-fidelity motion capture equipment can be used for prototyping musical instruments, and we illustrate this with an example of …
The paper presents a conceptual overview of how optical infrared marker-based motion capture systems (IrMoCap) can be used in musical interaction. First we present a review of related work on using IrMoCap for musical control. This is followed by a discussion of possible features which can be exploited. Finally, the question of mapping movement features to …