Christian Frisson

The Video Browser Showdown evaluates the performance of exploratory video search tools on a common data set, in a common environment, and in the presence of an audience. The main goal of this competition is to enable researchers in the field of interactive video search to directly compare their tools at work. In this paper, we present results from the second…
This project aims at studying how recent interaction technologies can help extend how we play the guitar, thus defining the "multimodal guitar". Our contributions target three main axes: audio analysis, gestural control, and audio synthesis. For this purpose, we designed and developed a freely available toolbox for augmented guitar…
This paper presents the development of rapid and reusable gestural interface prototypes for navigation by similarity in an audio database and for sound manipulation, using the AudioCycle application. For this purpose, we propose and follow guidelines for rapid prototyping that we apply using the PureData visual programming environment. We have mainly…
The papers at this Convention have been selected on the basis of a submitted abstract and extended precis that have been peer reviewed by at least two qualified anonymous reviewers. This convention paper has been reproduced from the author's advance manuscript, without editing, corrections, or consideration by the Review Board. The AES takes no responsibility for the contents.
This paper presents AudioCycle, a prototype application for browsing through music loop libraries. AudioCycle provides the user with a graphical view where the audio extracts are visualized and organized according to their similarity in terms of musical properties, such as timbre, harmony, and rhythm. The user is able to navigate in this visual…
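
The kind of similarity map AudioCycle describes can be approximated in a few lines: extract per-loop timbre, harmony, and rhythm descriptors and project them onto a 2D layout. The sketch below only illustrates that idea; the feature set, weighting, and the librosa/scikit-learn calls are assumptions, not the pipeline used in the paper.

```python
# Illustrative sketch only: organize audio loops on a 2D similarity map built from
# timbre (MFCC), harmony (chroma) and rhythm (tempo) descriptors.
# Feature choices and libraries are assumptions, not AudioCycle's actual pipeline.
import numpy as np
import librosa
from sklearn.decomposition import PCA

def describe_loop(path):
    """Summarize one audio loop as a single feature vector."""
    y, sr = librosa.load(path, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)   # timbre
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr).mean(axis=1)      # harmony
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)                    # rhythm
    return np.concatenate([mfcc, chroma, np.atleast_1d(tempo)])

def layout_loops(paths):
    """Project loop descriptors onto 2D coordinates for a similarity view."""
    feats = np.stack([describe_loop(p) for p in paths])
    feats = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)  # normalize per dimension
    return PCA(n_components=2).fit_transform(feats)
```

Projecting with PCA is only one option; the organization by similarity in the actual application may rely on different descriptors and a different embedding.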
This project presents a new approach to sound composition for soundtrack composers and sound designers. We propose a tool for usable sound manipulation and composition that targets sound variety and expressive rendering of the composition. We first automatically segment audio recordings into atomic grains, which are displayed on our navigation tool according…
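
As a rough illustration of the segmentation step mentioned above, the following sketch splits a recording into short grains at detected onsets so that each grain could be placed on a navigation map; the onset-based strategy and the librosa calls are assumptions for illustration, not the project's actual implementation.

```python
# Hedged sketch: split a recording into atomic grains at detected onsets.
import librosa

def segment_into_grains(path):
    """Return a list of (start_sample, end_sample) grain boundaries."""
    y, sr = librosa.load(path, mono=True)
    onsets = librosa.onset.onset_detect(y=y, sr=sr, units='samples', backtrack=True)
    bounds = list(onsets) + [len(y)]
    return [(bounds[i], bounds[i + 1]) for i in range(len(bounds) - 1)]
```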
Within the numediart/HyForge research program, this project aims at studying techniques and developing software prototypes for analyzing the structure of music content and extracting summary excerpts. Several acoustic features are proposed to describe music signals, namely timbral, harmonic, and rhythmic features. Based on these features, a method is…
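
One simple way to turn such features into a summary excerpt, sketched below under assumptions (chroma features only, fixed-length windows, cosine similarity), is to keep the window that best matches the track's average harmonic profile; the project's actual method is only hinted at by the truncated abstract above.

```python
# Hedged sketch: pick a "representative" excerpt as the window whose average chroma
# is closest (cosine similarity) to the whole track's average chroma.
import numpy as np
import librosa

def summary_excerpt(path, excerpt_s=10.0, hop=512):
    y, sr = librosa.load(path, mono=True)
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr, hop_length=hop)   # harmonic description
    win = int(excerpt_s * sr / hop)                                    # frames per excerpt
    global_mean = chroma.mean(axis=1)
    best_start, best_score = 0, -np.inf
    for start in range(0, max(1, chroma.shape[1] - win), max(1, win // 2)):
        local = chroma[:, start:start + win].mean(axis=1)
        denom = np.linalg.norm(local) * np.linalg.norm(global_mean) + 1e-9
        score = float(np.dot(local, global_mean) / denom)
        if score > best_score:
            best_start, best_score = start, score
    t0 = best_start * hop / sr
    return t0, t0 + excerpt_s                                          # excerpt start/end in seconds
```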
This paper presents the LoopJam installation, which allows participants to interact with a sound map using a 3D computer vision tracking system. The sound map results from similarity-based clustering of sounds. The playback of these sounds is controlled by the positions or gestures of participants, tracked with a Kinect depth-sensing camera. The beat-inclined…
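
A minimal sketch of the interaction loop described above, under assumptions: a tracked (x, y) position from the depth camera is mapped to the nearest cluster centroid on the sound map, and that cluster's sounds would then be triggered. The centroid values and coordinate convention are hypothetical.

```python
# Hedged sketch: map a tracked participant position to the nearest sound-map cluster.
import numpy as np

def nearest_cluster(position, centroids):
    """Index of the cluster centroid closest to a tracked (x, y) position."""
    distances = np.linalg.norm(np.asarray(centroids) - np.asarray(position), axis=1)
    return int(np.argmin(distances))

# Example with three hypothetical clusters; a participant stands near the middle of the map.
centroids = [[0.2, 0.8], [0.5, 0.5], [0.9, 0.1]]
print(nearest_cluster((0.48, 0.55), centroids))   # -> 1
```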
While an extensive palette of sound and visual generation techniques has been developed during the era of digital signal processing, the design of innovative virtual instruments has come to dramatic fruition over the last decade. The use of measured biological signals to drive these instruments offers new and powerful tools for clinical, scientific…
VideoCycle is a candidate application for this second Video Browser Showdown challenge. VideoCycle allows interactive intra-video and inter-shot navigation with dedicated gestural controllers. MediaCycle, the framework it is built upon, provides media organization by similarity, with a modular architecture enabling most of its workflow to be performed by…
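
As a hypothetical illustration of inter-shot navigation by similarity (not MediaCycle's actual feature set), each shot can be summarized by a keyframe color histogram and the "most similar other shot" looked up by dot product; the OpenCV calls and histogram parameters below are assumptions.

```python
# Hedged sketch: inter-shot navigation by keyframe similarity.
import numpy as np
import cv2

def shot_descriptor(keyframe_bgr):
    """Normalized HSV color histogram summarizing one shot's keyframe."""
    hsv = cv2.cvtColor(keyframe_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0, 1], None, [16, 8], [0, 180, 0, 256])
    return cv2.normalize(hist, None).flatten()

def most_similar_shot(current, descriptors):
    """Index of the shot most similar to `current`, excluding itself."""
    sims = [float(np.dot(descriptors[current], d)) for d in descriptors]
    sims[current] = -np.inf
    return int(np.argmax(sims))
```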